Test Subject
Original Poster
#1 Old 27th Jul 2015 at 4:27 AM

This user has the following games installed:

Sims 3, World Adventures, Ambitions, Late Night, Generations, Pets, Showtime, Supernatural, Seasons, University Life, Island Paradise, Into the Future
Default Texture Memory 32MB Override - Significant Camera Stuttering GTX 970
Hello All!

I've been lurking and reading through most of the threads about Radeon 7xxx owners who get the texture override line in their DeviceConfig, and how their problems were fixed. Those fixes are not working for me, so I'd like to start a post that (hopefully) addresses this issue once and for all for everyone with an Nvidia 9xx card or newer.

My problem is that the game frequently stutters uncontrollably (NOT a simple microstutter) when panning or zooming the camera, AND it crashes on any lot with too many furnishings - likely a result of the game limiting GPU memory usage to 32MB. My in-game settings are all on medium. I am aware that TS3 is buggy, but this goes beyond buggy, and I currently have no good way to show you what I see. My computer specs are below, and I am fairly certain they're not at fault because I can play ENB'd Skyrim at a constant 60 FPS with 2K textures on EVERYTHING (it's glorious btw, you should give it a try!). If someone will kindly explain how to put all this into a spoiler, I'd love to do so to visually shorten this post.

CPU: i7 4790K OC'd to 4.5GHz, cooled with an EVO 212 (onboard graphics disabled)
GPU: MSI Gaming GTX 970 OC'd to 1250MHz (drivers up to date)
MB: MSI Gaming 7 (firmware up to date)
RAM: 16GB (2x8 Corsair Vengeance 2133)
PSU: EVGA G2 1000W
HD: Crucial MX100 512GB SSD

The following is my DeviceConfig file. I have edited GraphicsCards.sgr to recognize my card, yet GPU memory remains at 32MB. I also attempted to edit GraphicsRules.sgr to force the game to use 2048MB, but performance remained unchanged, so I reverted the change. However, I am convinced the game is forcing the card to use ONLY 32MB (whether I wrote 2048MB in the files or not) because my laptop with the much older GT 650M (same mods/CC) runs the game ALMOST perfectly - aside from the buggy MICRO stutter.

=== Application info ===
Name: Sims3
Version:
Build: Release
=== Rating info ===
GPU: 5 GPU Memory: 1 CPU: 4 RAM: 4 CPU Speed: 4000 Threading: 3
Adjusted CPU: 4934 RAM: 16332 Adjusted RAM: 15820 Cores: 4
=== Machine info ===
OS version: Windows 7 6.1.7601 Service Pack 1
OS prod type: 0
OS major ver: 6
OS minor ver: 1
OS SP major ver: 1
OS SP minor ver: 0
OS is 64Bit: 1
CPU: GenuineIntel
Brand: Intel(R) Core(TM) i7-4790K CPU @ 4.00GHz
Family: 6
Model: 12
Cores: 4
HT: 1
x64: 0
Memory: 16332MB
Free memory: 10212MB
User: Dani
Computer: RED
=== Graphics device info ===
Number: 0
Name (driver): NVIDIA GeForce GTX 970
Name (database): NVIDIA GeForce GTX 970 [Found: 1, Matched: 1]
Vendor: NVIDIA
Chipset: Vendor: 10de, Device: 13c2, Board: 31601462, Chipset: 00a1
Driver: nvd3dum.dll, Version: 10.18.13.5330, GUID: D7B71E3E-5082-11CF-0063-6D111CC2C735
Driver version: 5330
Monitor: \\.\DISPLAY1
Texture memory: 32MB <<OVERRIDE>>
Vertex program: 3.0
Pixel program: 3.0
Hardware TnL: 1
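For anyone wanting to reproduce the GraphicsRules.sgr experiment I mentioned above, my edit looked roughly like this (a sketch from memory, so verify the exact lines against your own file):

```
# GraphicsRules.sgr - sketch of my reverted experiment (verify against your file)
# The stock fallback for unrecognized cards:
#   seti textureMemory 32
#   setb textureMemorySizeOK false
# My attempted override (performance was unchanged, so I reverted it):
seti textureMemory 2048
# setb textureMemorySizeOK false
```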

My theory is that because the game is not TRULY recognizing the card beyond the GraphicsCards.sgr workaround, it is using ONLY 32MB. That would explain why the game runs fine if I don't move the camera at all, and why lots with many textures crash or lag. It would also explain why the GT 650M (a recognized card with proper memory allocation, almost 4 years old AND a low-tier card to begin with) can run the game well on higher settings across the board.

Since fiddling with the numbers in the GraphicsRules file doesn't seem to work, I think the fix is to somehow force the game to use more texture memory. One idea was to make the game think I'm on the GT 650M again, which I have NO clue how to do. Perhaps then the game would properly utilize a larger amount of the GPU's texture memory?
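In case anyone wants to try the spoofing idea: GraphicsCards.sgr appears to be a list of card entries with a device ID and a name (I'm inferring the format from my own file, and the 650M device ID below is a placeholder, not a real value):

```
# GraphicsCards.sgr - hypothetical sketch; format inferred, placeholder ID
# A stock entry for a card TS3 already recognizes (0xXXXX = unknown real ID):
#   card 0xXXXX "GeForce GT 650M"
# Swapping in my 970's device ID (13c2, per my DeviceConfig above) so TS3
# would match the 970 against the known 650M entry:
card 0x13c2 "GeForce GT 650M"
```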

I look forward to hearing from your expertise! Please let me know if I need to add any information.



P.S. I tried to "Add Bacon." Did it work?
Test Subject
Original Poster
#3 Old 27th Jul 2015 at 5:17 AM Last edited by kofh : 27th Jul 2015 at 6:20 AM.
Quote: Originally posted by nitromon
I've replied to similar threads at least 3 times with instructions on how to fix it.

I guess I should get one of those posts and write it as a tutorial

In the meantime, why don't you use MTS search function and type in Nvidia 970 and see the results.

*sigh* I'm such a nice guy:

Nvidia 970 thread 1

Nvidia 970 thread 2



I am ashamed to say that when I put "970", "32MB", and "override" into the all-knowing Google, these results either didn't show up OR I just missed them. I accept your snark and sarcasm. Thank you for bringing those to my attention! I'll try them now and report back ASAP.

Edit
I tried the suggestions in both threads - the second one first, because that laptop GPU was the same as mine (what a small world!). Unfortunately, I have to drop everything to the lowest settings on my GTX 970 rig to get ANYTHING close to the smoothness I get on my laptop (which runs the game at much higher settings).

Furthermore, I compared my gameplay with the Intel HD 4000 video you linked, and mine is about the same, except my stutter is erratic, whereas in the video it seems to happen every other second.

Anything else I can try?
Test Subject
Original Poster
#5 Old 27th Jul 2015 at 12:34 PM Last edited by kofh : 27th Jul 2015 at 1:19 PM.
Quote: Originally posted by nitromon
Did you make these changes to graphicsrules.sgr:

change

seti textureMemory 32
setb textureMemorySizeOK false


to

seti textureMemory 1024
# setb textureMemorySizeOK false


---

Basically, TS3 doesn't recognize the dedicated memory. This workaround enables it to use shared memory, which is your RAM.

Also, one of the threads mentioned overheating. The 970 requires an FPS limiter; for an Nvidia card you can use Nvidia Inspector. Set the FPS limit for TS3 to 60.


Nvidia Inspector was great! GPU temps have dropped from 65 to 60. I am even watching the extra sensor monitor as I test the game. I don't know if I can do anything with this information, but it makes me feel cool, so I'm going to tell you about it. :D

GPU usage stays at around 28%, but the GPU0 - GPU Clk (clock?) reading frequently maxes out at 1113MHz, though it goes to the full 1250MHz when I run the game in windowed mode.

And yes, I added the pound sign and changed the 32 to 1024. Then I changed the 1024 to 2048, then 3000, then 4040 (which, interestingly enough, is what my laptop has it set at), and the desktop still performs worse than the laptop at every setting.

It's interesting that you say the game doesn't use dedicated memory (by which I assume you mean VRAM) and instead uses RAM. Maybe my issue lies with my RAM? I have not overclocked it (though desktop base clock speeds should be faster than the laptop's regardless), so I will give that a try now. On a side note, I also watched RAM usage on the desktop for my Sims' house, and it never broke about 1.38GB despite my allowing it to go over 2GB. The laptop hovers around the same usage, at about 1.41GB. Here's hoping!

EDIT

So I OC'd the RAM (via the easy XMP button on my MSI motherboard), and nothing in-game seems to behave ANY differently. I'll try doing it manually as well, but I'm running out of hope.
Test Subject
Original Poster
#7 Old 27th Jul 2015 at 3:18 PM Last edited by kofh : 27th Jul 2015 at 4:11 PM.
This is starting to get really technical, so I'm going to slow down for this one.

Quote: Originally posted by nitromon
Basically, the issue is that the card is too new, or is some model whose texture memory TS3 can't recognize, so the game does 2 things: it sets the VRAM flag to "not ok" and then uses 32MB of RAM for software texture rendering. This is why the game is laggy and crappy. The 2 lines make these changes: the first restores the VRAM flag to "ok", and the second tells TS3 to use your GPU's shared memory capability. So you cannot really set this number higher than your GPU's shared memory capability. I don't know what it is for the 970, but for the 650M it is 2048, and some cards only support 1024.

I made those two changes, with barely any change in actual performance. I set the 32MB to 1024MB, then 2048MB, then 4040MB (4040 because my laptop's TS3 DeviceConfig shows that number) to see if anything differed between 1024/2048/4040. Nothing changed. I don't THINK there's a difference between 1024MB and 32MB either, but I'll try those two changes again.

I googled "GPU Shared Memory 970" and came up with something I think is irrelevant to what you're talking about, because it says shared memory is a "dedicated" 64KB. It's here. That page also talks about the 980 and its "larger, dedicated shared memory" of 96KB. However, based on your statement that the 650M's value is 2048 (the card is indeed 2GB), I assume you're talking about GPU VRAM, of which the GTX 970 has 4GB (3.5GB of which runs at full speed and 0.5GB is apparently shitty).

Quote: Originally posted by nitromon
However, setting this number doesn't mean the game will use that much. As far as I can tell after testing TS3 several times, it will never use more than 800MB of VRAM, so setting it higher than 1024 won't really change anything or improve the game.


You stated this in the other thread about the 970 and 650M. I had to try higher values though, just in case - especially since DeviceConfig on the pretty-much-fluid laptop TS3 shows texture memory at 4040 on the 650M for some reason, despite the card only being rated at 2GB.

My laptop is this exact one.
i7-3615/GT650m 2GB/1TB HD/8GB RAM

Quote: Originally posted by nitromon
So maybe the memory speed does make a big difference, iono? But to fix this, you'll have to contact Nvidia, since EA no longer supports TS3.

(By the way, this is all theoretical. I've been asking someone to confirm this for me. If you are using GPU-Z, can you confirm this? It will tell you in the sensor section whether it is using shared or dedicated memory)


When you say memory speed, do you mean "CPU Core Clock", "GPU Memory Clock", or perhaps something else?

This link is my GPU-Z screenshot, taken with TS3 ALT-TABBED. I've no idea how to check whether shared memory is being used. Attached to this post is also a screenshot with TS3 ACTIVELY PLAYING, plus more monitors. I don't know why TS3 renders completely black in it, but imagine a really cool lot with cool Sims. Furthermore, I ran DxDiag because the internet suggested it may show a GPU's shared memory; below is what I thought relevant.
Display Memory: 3726 MB
Dedicated Memory: 4008 MB
Shared Memory: 3814 MB

Quote: Originally posted by nitromon
Remember that in windowed mode, your GPU typically only works about half as hard as in fullscreen. To get full performance out of your GPU, play in fullscreen. To reduce load and stress on your GPU (and CPU), play in windowed mode.

If you have the FPS limited to 60, the memory fix applied, and Vsync + triple buffering on, the game should be pretty smooth. I'm not sure which laptop you're comparing it to; can you post that laptop's specs?

EDIT: I just did a quick Google on the 970; apparently this card has a very different memory architecture than other Nvidia cards, which is perhaps why it is not recognized. I don't recall other Nvidia cards having this issue.


I do run the game in fullscreen with the FPS limited to 60, the memory fix (at 4040), and Vsync/triple buffering. I want to assume this is a memory architecture issue, but nobody else with a 970 seems to be complaining of stutters as bad as mine. As soon as I set up my Dropbox, I'll use a cellphone recording to show you the difference in stuttering between the laptop (specs posted above) and the desktop. I hope this post was clearer than the last!
Screenshots
Test Subject
Original Poster
#9 Old 28th Jul 2015 at 2:01 AM
Quote: Originally posted by nitromon
Your laptop is very similar to mine; I'm running an i7-3720q with the 650M, and it is pretty nice. So your desktop with a 4GHz CPU and a 970 cannot match this laptop? Hmm, that is a problem. People in the past with 970 issues said the fix worked, so your problem might be something else.


I had a suspicion that those people just didn't have a fluid version of the game to compare against.

Quote: Originally posted by nitromon
Have you tried the game on your Intel HD GPU to compare the performance? It will be slower of course, but it shouldn't have that stutteriness. This will help you identify whether TS3 itself is the problem.


The HD GPU is disabled, and I actually have no idea how to enable it. I plugged in the GPU before ever turning the computer on. I'll look into it.

Quote: Originally posted by nitromon
From your GPU-Z screenshot, the 970 seems really different from other Nvidia cards. This is actually the first time I've seen one that only shows "Memory Used" instead of 2 separate sensors for Memory Usage (Dedicated) and Memory Usage (Dynamic). I think this is what is confusing TS3, and unfortunately it doesn't answer the question of which memory it is really using.


Any suggestion for what else I can do to find out?

Quote: Originally posted by nitromon
btw i just noticed this:
GPU: MSI Gaming GTX 970 OC'd at 1250 (drivers up to date)

I assumed you've also tried TS3 without OC the GPU?


I used the MSI Gaming App to take it off "OC Mode" and put it on "Silent Mode", but I have to say it did not occur to me to manually shut off the OC. I'll try that.
Instructor
#11 Old 28th Jul 2015 at 7:33 AM Last edited by PapaEmy : 28th Jul 2015 at 2:40 PM.
Quote: Originally posted by nitromon
btw i just noticed this:
GPU: MSI Gaming GTX 970 OC'd at 1250 (drivers up to date)

I assumed you've also tried TS3 without OC the GPU?


Actually, this doesn't matter as long as TS3 recognizes the 970.

Quote: Originally posted by nitromon
I'm not sure how dual graphics work on a desktop, though; I'm guessing you might have to plug your monitor into another socket?


You can't have a dual-graphics setup if the onboard GPU is on the motherboard (except for Hybrid SLI or Crossfire in the Vista era); when a discrete video card is installed, it automatically disables the onboard GPU. But if the onboard GPU is in the CPU, as with the latest Intel chips or AMD APUs, then some systems will run dual switchable GPUs the way laptops do.

Quote: Originally posted by kofh
CPU: i7 4790k OC'd at 4.5 Cooled with an EVO 212 (disabled onboard graphics)


I assume you disabled this in the BIOS. Don't enable it for TS3: the FPS limiters for TS3 (the ones from MATY with 3booter, or FPS Limiter 0.2) crash when dual switchable GPUs are enabled. You're better off running the FPS limiter from Nvidia Inspector whether it's enabled or disabled, because TS3 itself actually runs fine with dual switchable GPUs; it's just those FPS limiters that won't. Just make sure you have an FPS limiter running for TS3, otherwise you'll cook your 970.

Quote: Originally posted by kofh
=== Graphics device info ===
Number: 0
Name (driver): NVIDIA GeForce GTX 970
Name (database): NVIDIA GeForce GTX 970 [Found: 1, Matched: 1]
Vendor: NVIDIA
Chipset: Vendor: 10de, Device: 13c2, Board: 31601462, Chipset: 00a1
Driver: nvd3dum.dll, Version: 10.18.13.5330, GUID: D7B71E3E-5082-11CF-0063-6D111CC2C735
Driver version: 5330
Monitor: \\.\DISPLAY1
Texture memory: 32MB <<OVERRIDE>>


Leave the 32MB override alone for now; we'll fix it later. I see your driver is newer than mine, so that should be fine.

What I don't understand is how TS3 recognized your GTX 970 as 13c2. Is 13c2 really the GTX 970? Because there's no GTX 970 in the GraphicsCards.sgr file; the highest entry in there is 1080, for the GTX 580. I would suggest you change 0x06cd to 0x13c2 (if 13c2 really is the GTX 970), because 06cd is the GTX 470, which is the same class as the GTX 970, only older. TS3 has no graphics card database entries newer than the GTX 660M or higher than the GTX 580; some of these issues can be fixed with a newer Nvidia driver, as in my GTX 680's case, and some need edits in GraphicsCards.sgr to get the card recognized. By putting your device ID on the GTX 470 entry, your GTX 970 will be treated as a GTX 470. After you've done this, start the game, exit, and then open your DeviceConfig to see what changed.

It should be like this:

Name (driver): NVIDIA GeForce GTX 970
Name (database): NVIDIA GeForce GTX 470 [Found: 1, Matched: 1]

Now, with that modification, TS3 will recognize your video card, but as a GTX 470 rather than a GTX 970, and it should set the texture memory automatically according to its database entry for the GTX 470. If it doesn't, there's no way around changing it manually, but at least TS3 will recognize the card and run it properly in a safe way.
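For anyone following along, the suggested edit would look roughly like this in GraphicsCards.sgr (a sketch - check the exact entry text against your own file):

```
# GraphicsCards.sgr - sketch of the suggested edit (verify against your file)
# Stock entry (the GTX 470 that TS3 ships with):
#   card 0x06cd "GeForce GTX 470"
# Edited entry: same line, but with the GTX 970's device ID (13c2) so the
# 970 matches against the GTX 470's database specification:
card 0x13c2 "GeForce GTX 470"
```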

Edited: PS: As for the game crashes, lags, or freezes - those have nothing to do with your video card. You might want to check your CC and mods instead, IMO.
Test Subject
Original Poster
#12 Old 29th Jul 2015 at 3:03 PM
Quote: Originally posted by PapaEmy
What I don't understand is how TS3 recognized your GTX 970 as 13c2. Is 13c2 really the GTX 970? Because there's no GTX 970 in the GraphicsCards.sgr file; the highest entry in there is 1080, for the GTX 580. I would suggest you change 0x06cd to 0x13c2 (if 13c2 really is the GTX 970), because 06cd is the GTX 470, which is the same class as the GTX 970, only older. TS3 has no graphics card database entries newer than the GTX 660M or higher than the GTX 580; some of these issues can be fixed with a newer Nvidia driver, as in my GTX 680's case, and some need edits in GraphicsCards.sgr to get the card recognized. By putting your device ID on the GTX 470 entry, your GTX 970 will be treated as a GTX 470. After you've done this, start the game, exit, and then open your DeviceConfig to see what changed.

It should be like this:

Name (driver): NVIDIA GeForce GTX 970
Name (database): NVIDIA GeForce GTX 470 [Found: 1, Matched: 1]

Now, with that modification, TS3 will recognize your video card, but as a GTX 470 rather than a GTX 970, and it should set the texture memory automatically according to its database entry for the GTX 470. If it doesn't, there's no way around changing it manually, but at least TS3 will recognize the card and run it properly in a safe way.


This link says the device ID is 13c2 (the value I used when editing GraphicsCards.sgr), and my Device Manager page seems to agree. I have now edited GraphicsCards.sgr to recognize my 970 as the 470 - after undoing the manual texture memory edit in GraphicsRules.sgr - and texture memory is at the 32MB override again.

Quote: Originally posted by PapaEmy
Editted: PS: As for the game crashes, lags or freezes, that has nothing to do with your video card, you might want to check your CC and mods related subject IMO


I assumed this didn't really affect me because I basically only have really, really popular mods like NRaas, AwesomeMod, sliders, and one skin - no CC at all. My game crashed before due to bad CC, but I've since removed it all. I now run only mods; ironically, my laptop TS3 has a TON of CC installed. The game freezes in the sense that the water and music keep running but the Sims stop moving. I'm going to try a fixed world soon for the freezing, but I can't imagine the stuttering going away with a fixed world.

For the sake of being exhaustive, I'll also try running the game with nothing in the Mods folder at all.