August 2022
Hi.
I am trying to alter the sgr file for the game so it can recognize the graphic information. However, I am not sure how (or where) to change it, since the numbers don't go as low as '1636'. I also need the texture memory to be expanded beyond 32MB. Below is the info for the device. Any help will be greatly appreciated.
GPU: 5 GPU Memory: 1 CPU: 4 RAM: 4 CPU Speed: 3294 Threading: 3
Adjusted CPU: 3992 RAM: 15789 Adjusted RAM: 15277 Cores: 16
=== Machine info ===
OS version: Windows Vista 6.2.9200
OS major ver: 6
OS minor ver: 2
CPU: AuthenticAMD
Brand: AMD Ryzen 9 4900H with Radeon Graphics
Family: 15
Model: 0
Cores: 16
HT: 0
Memory: 15789MB
Free memory: 11579MB
=== Graphics device info ===
Number: 0
Name (driver): AMD Radeon(TM) Graphics
Name (database): AMD Radeon(TM) Graphics [Found: 0, Matched: 0]
Vendor: ATI
Chipset: Vendor: 1002, Device: 1636, Board: 01231002, Chipset: 00f0
Driver: aticfx32.dll, Version: 30.0.13023.4001, GUID: D7B71EE2-5576-11CF-137C-FC135FC2D335
Driver version: 4001
Monitor: \\.\DISPLAY1
Texture memory: 32MB <<OVERRIDE>>
Vertex program: 3.0
Pixel program: 3.0
Hardware TnL: 1
August 2022
Hey @mw1525,
I've moved your post to a more relevant board than EA General Questions.
August 2022
@mw1525 So this computer doesn't have a dedicated graphics card? That would surprise me given how powerful the processor is, but I guess it's possible. The easiest way to check is through the Device Manager: press Windows key + X, open Device Manager, expand the Display adapters section, and see whether a second graphics card is listed. If one is, the first step would be to force your computer to use that card when running Sims 3.
If the computer does in fact only have the integrated chip, then here's how to get it recognized:
Open graphicscards.sgr, and right at the top, under the vendor "ATI" line, copy and paste this:
card 0x1636 "AMD Radeon RX Vega 8 Graphics"
Make sure the entry lines up with the ones below it; I believe it's one Tab in from the left. Save, quit, open graphicsrules.sgr, and Ctrl-F to search for x18, which will take you to this line:
elseif (match("${cardName}", "*x18??*") or match("${cardName}", "*x19??*") etc.
Change the "x18??" to "Vega 8", but leave everything else intact, including the asterisks. This will rate your graphics chip as high. I don't think it should be rated as uber, but if you prefer, just edit the last entry immediately above this one instead; the Uber cards are in the first section, and the ratings decrease from there.
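With that edit in place, the line should look something like this (the rest of the line is abbreviated here, just as in the quote above):

elseif (match("${cardName}", "*Vega 8*") or match("${cardName}", "*x19??*") etc.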
For the texture memory, find these lines right at the top of graphicsrules:
if ($textureMemory == 0)
seti textureMemory 32
setb textureMemorySizeOK false
Change the 32 to 1024 and put a # and a space in front of setb. Save, quit, launch the game, quit, and check deviceconfig again.
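For reference, this is what that block should look like once both edits are made (assuming your file matches the default layout):

if ($textureMemory == 0)
seti textureMemory 1024
# setb textureMemorySizeOK false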
August 2022
@mw1525 To add a video card to this computer, it would need a PCIe x16 slot, and I don't see one in the specs. You have to get your computer to tell the game EXACTLY what the graphics chip is, even though it's part of the CPU package. The game asks the computer what the video chip is, and what the computer returns is what you need to use in the first .sgr file, in a format that exactly matches what the computer reports. Once the game reaches "Found: 1", you then tell it what the chip's capabilities are in the second .sgr file, in a format the game can understand. I use AMD CPUs, but none of them have an integrated video chip. If you look at the other entries, you can see that ? works like a wildcard. You have all the data you need - you just have to format it correctly. hth
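As an example of the wildcard syntax (the card name in this pattern is made up for illustration):

match("${cardName}", "*Radeon ?? Graphics*")

Each ? stands in for exactly one character and * for any run of characters, so this pattern would match names like "Radeon RX Graphics" or "Radeon HD Graphics".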
August 2022
@mw1525 I would love to hear how Sims 3 runs on this system. As you said, its graphics capabilities are very similar to what your laptop has, and the info you provided a year ago was very helpful. Among other things, I linked your thread to someone who was looking at similar systems on a budget, and they've since bought a laptop with the same graphics chip. They're quite happy with the outcome. So it would be great to have another reference from someone who knows Sims 3 as well as you do.
And you're right, your only option for a dedicated card here would be an external one, plugged into the fastest USB port. But that's not a great option for a couple of reasons. The first is price: the external enclosure the card would need can cost around $250. The second is that eGPUs perform significantly worse than their internal counterparts, so you'd need a faster card than Sims 3 would otherwise require.
Anyway, let me know if you have more questions, and I'm looking forward to whatever info you have to share.
August 2022
@mw1525 The # and space in front of setb is code-speak for "ignore this line." In this case, that section of graphicsrules.sgr says, by default:
If you [game] don't detect any video memory
set the texture memory to 32 MB
set texture memory detected to false [the result is the OVERRIDE]
The edits make the second line say 1024 MB and void the third line entirely. So I can't think of any way that the edits you made to this section would have the effect you're describing.
What might help is instead running the GPU Add-on Support tool from NexusMods. You'd need a (free) account to download it, and it would make several changes to graphicsrules, not just the ones you did. I've heard from other players with AMD cards, mostly dedicated cards and definitely not your model of iGPU, that this works better than the usual edits, although most of them have been complaining about missing shadows and other details rather than an inability to zoom in.
The tool gets updated all the time, apparently, and I haven't kept up, so I don't know what current changes there are aside from adding more cards to the files. But one or another of those edits may fix your issue as well.
The other thing you could try is copying the options.ini from your laptop to this machine. You'd need to replace the lastdevice line from the old file with the line from the new one—this lists the graphics driver, and when it doesn't match what the game detects, the entire options.ini gets reset. But everything else should work, and maybe that would restore the missing in-game options.
August 2022 - last edited August 2022
@mw1525 My misprint - a standard video card, whether half-height or full-height, requires a PCIe slot, preferably a PCIe x16 slot to take full advantage of the card. Your computer does not have a PCIe slot. Any other way of connecting an off-the-shelf video card to that machine is uncharted territory: unsupported hardware modifications NOT backed by the manufacturer. In the good old days this was not uncommon, as schematics were usually provided by (or obtainable from) the manufacturer. I liked the mod I saw that used the M.2 slot, but that would require converting the M.2 slot to PCIe, adding an external power supply for the video card, and running the computer without the cover. IMO the manufacturer would not support this mod, for various reasons.
I have a Dell 660s (SFF), which DOES have a PCIe slot and works just fine with a half-height video card that doesn't exceed the 220W power supply. I use it as a media client, not for gaming. Your situation is quite different. My old gaming machine has an AMD FX-6100 CPU with no integrated video chip, and my new one in progress has an AMD 5600X, which also lacks integrated graphics. Your computer cannot take an off-the-shelf video card because the manufacturer did not engineer a proper interface, which means you are expected to use the integrated video chip in the AMD processor. AMD had already acquired ATI by the time Sims 3 was released, but I believe they didn't ship a CPU with integrated graphics until some time later. I would be very interested in the AMD Ryzen 9 4900H's temperature after playing some 3D graphical games for a while.