
DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

Devs should be optimizing for all vendors, especially the one with 85% market share, many of whose customers will have bought your game. They've optimized for AMD only; hell, Arc can't even boot it. Disgraceful.
This is a sponsored game, why is that so hard to understand? We now see where the sponsorship went. Not gimping, just telling the devs: we have this architecture, can you make sure to fully use it, and here's financial compensation for doing so.

Bethesda has their money already.

By the way, AMD has the larger gaming share; this game isn't only on PC.
 
Devs should be optimizing for all vendors, especially the one with 85% market share, many of whose customers will have bought your game. They've optimized for AMD only; hell, Arc can't even boot it. Disgraceful.
Maybe Nvidia execs are too busy diving into piles of cash from AI sales, Scrooge McDuck style, to tell department heads to work with Bethesda on optimising the game?
 
No ticket required. It's all around the web how **** Starfield runs on GeForce, with even Digital Foundry hinting at something dodgy going on with the AMD sponsorship. I expect we'll see a big FPS boost for GeForce soon enough. I bet that when it isn't gimped, Nvidia takes the lead in performance.
 
Because when Nvidia isn't in the lead, it has to be foul play, obviously... :rolleyes: Maybe their drivers for it just aren't up to snuff.

It's fine, fine wine is coming

It's obviously an Nvidia issue, not the game. Some Nvidia GPUs aren't running at full power in this game, as in power draw, and that's not the game's fault; that's a driver issue or a CPU bottleneck. The 4090, for example, only uses 250W of its 450W budget when playing Starfield. It's running under its full potential and Nvidia needs to fix that.


But that doesn't mean the game doesn't have its own issues. For example, the game's file I/O system is pretty bad and the game needs DirectStorage.

 
I guarantee you it won't be seen as fine wine by a few people ;) :p

It definitely isn't a CPU bottleneck, but more something where power usage isn't correct: my GPU is constantly pegged at 99% utilisation but power draw is low.
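For anyone who wants to sanity-check that on their own card, here's a minimal sketch using NVIDIA's public NVML C API that reads GPU utilisation, current power draw and the enforced power limit in one shot. It's a generic illustration (nothing Starfield-specific), error handling is skipped for brevity, and device index 0 just assumes the first GPU is the one of interest.

```cpp
// Minimal NVML probe: compare reported GPU utilisation against power draw.
// Build against the NVML headers/library shipped with the NVIDIA driver or CUDA toolkit.
#include <cstdio>
#include <nvml.h>

int main()
{
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);            // first GPU in the system

    nvmlUtilization_t util{};
    nvmlDeviceGetUtilizationRates(dev, &util);      // util.gpu is a percentage

    unsigned int powerMw = 0, limitMw = 0;
    nvmlDeviceGetPowerUsage(dev, &powerMw);         // current draw, in milliwatts
    nvmlDeviceGetEnforcedPowerLimit(dev, &limitMw); // active power limit, in milliwatts

    std::printf("GPU busy: %u%%  power: %.0f W of %.0f W limit\n",
                util.gpu, powerMw / 1000.0, limitMw / 1000.0);

    nvmlShutdown();
    return 0;
}
```

Run that in a loop while the game is in the foreground; if Starfield really is the odd one out, you'd expect utilisation near 100% with draw sitting well below the limit only in this game.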




One problem I have with a lot of these benchmarks and comparisons is ReBAR. A lot of sites say they turn it on, but in reality it isn't actually on, because Nvidia uses a whitelist for it that covers only a handful of games. Some of the differences you see in AMD-favourable games shrink a good chunk when ReBAR is forced on via Nvidia Profile Inspector, e.g. Dead Space Remake.


And now Starfield:


I'm getting a nice little boost overall in Starfield. An Nvidia rep said they will be enabling ReBAR in their next update, but I think there is more to be done from both Nvidia's AND Bethesda's side regarding optimisation for Intel and Nvidia GPUs.
 
It could be the I/O system

There appears to be a correlation between storage drive load and frame drops, so perhaps it's affecting overall system performance as well. The game urgently needs DirectStorage: while playing Starfield, the CPU and storage are almost constantly decompressing files and can't keep up.

This is with a 7700X and a Gen 5 SSD and they still can't keep up; both are more or less pegged at 100% load, causing loss of framerate and stutters.
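For what it's worth, this is roughly what the DirectStorage path looks like when the decompression is handed to the GPU instead of the CPU. It's only a sketch based on the public DirectStorage 1.1 API, not anything from Starfield: the file name, buffer and sizes are invented, and the device/fence setup plus all error handling are omitted.

```cpp
// Hypothetical illustration: stream a GDeflate-compressed blob from disk
// straight into a GPU buffer, letting DirectStorage handle decompression.
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void LoadAssetBlob(ID3D12Device* device, ID3D12Resource* destBuffer,
                   ID3D12Fence* fence, UINT64 fenceValue)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // Open the (made-up) archive through DirectStorage rather than Win32 file I/O.
    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets.pak", IID_PPV_ARGS(&file));

    // One queue per source type/priority; requests are enqueued and submitted in batches.
    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // The request describes compressed bytes on disk and where the decompressed
    // result should land; the CPU never touches the payload.
    DSTORAGE_REQUEST request{};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = 4 * 1024 * 1024;   // compressed size (example)
    request.UncompressedSize            = 16 * 1024 * 1024;  // decompressed size (example)
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = request.UncompressedSize;

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue);  // fence fires once the data is resident
    queue->Submit();
}
```

The point of the pattern is that the NVMe reads and the GDeflate decompression stay off the game thread, which is exactly the load that seems to be pegging the CPU and SSD here.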

 
It's fine, fine wine is coming

It's obviously an Nvidia issue, not the game. Some Nvidia GPUs aren't running at full power in this game, as in power draw, and that's not the game's fault; that's a driver issue or a CPU bottleneck. The 4090, for example, only uses 250W of its 450W budget when playing Starfield. It's running under its full potential and Nvidia needs to fix that.


But that doesn't mean the game doesn't have its own issues. For example, the game's file I/O system is pretty bad and the game needs DirectStorage.



But according to the purveyor of doom 'evilcrow1001', Digital Foundry are hinting there are shenanigans afoot! Guess we'll have to wait and see...
 
That's not what I mean.

It's the only game I've seen so far where Nvidia GPUs draw lower power while usage is at max, e.g. GPU usage is at 99% but power draw is 70% of the limit. So it's crippled either intentionally or by a huge bug.

Everyone has noticed this; it's not like it's an isolated case either. Hence the unusual disparity between same-tier cards from Nvidia vs AMD.

I'm not bothered since I'm on a 4090 with frame gen on, but I find it rather crappy nonetheless.
There will be lots of optimisation in the coming months from each vendor, I have no doubt, but the Creation Engine is showing its age and should have been reconsidered, especially as the parent company has id Software as a subsidiary; even without Carmack, the expertise there is huge.
 
But according to the purveyor of doom 'evilcrow1001', Digital Foundry are hinting there are shenanigans afoot! Guess we'll have to wait and see...
If shenanigans are not afoot, who wants to be known as a lazy dev who CBA? No one, I would guess. This may be Todd's 'don't you have mobiles' moment, and someone who does GFA may take a more leading role at Bethesda after this. This is after his hubris/side project.
 
Another potential reason why amd cards are faster

When using an AMD card, fewer details in the Skybox are rendered

#LetTheControversyCook


Big oooff. It's like that controversy a few years back where Nvidia was apparently using drivers to lower image quality to get better benchmark scores in reviews.
 
I saw that posted on Reddit a week back, but it was only one or two guys so I didn't think much of it. But indeed, big oooooooooof, AMD cheating? They would never! :p ;)

More than likely just a bug though. For example, if you force 16x AF in NVCP, the shadows are bugged unless you clear the game shader cache.

Big oooff. It's like that controversy a few years back where Nvidia was apparently using drivers to lower image quality to get better benchmark scores in reviews.

Haha yup exactly, amd fans will probably say this is a "feature" :p

I do remember Nvidia GPUs having issues with the first Doom remake, where particles and some lighting effects were missing, but generally this whole nonsense that one vendor has better visuals or displays colour better is complete and utter BS and has been debunked many times now. OK, yeah, the monitor colours thing might be valid if you don't go in and change the default settings for Nvidia, but who doesn't do that? :o People also tried using HZD to show Nvidia rendering textures in as you move about, which was also debunked as it happened on AMD GPUs too :cry:

Thing is, though, DLSS 3.5/Ray Reconstruction is going to make benchmark comparisons extremely hard now, as Ray Reconstruction will offer superior IQ whilst maintaining the same or better performance. AMD needs to get something like this ASAP.
 
Big oooff. It's like that controversy a few years back where Nvidia was apparently using drivers to lower image quality to get better benchmark scores in reviews.
Or when Nvidia-sponsored games had loads of tessellation/geometry detail, knowing AMD cards weren't as good at it.
 
It could be the I/O system

There appears to be a correlation between storage drive load and frame drops, so perhaps it's affecting overall system performance as well. The game urgently needs DirectStorage: while playing Starfield, the CPU and storage are almost constantly decompressing files and can't keep up.

This is with a 7700X and a Gen 5 SSD and they still can't keep up; both are more or less pegged at 100% load, causing loss of framerate and stutters.
Odd behaviour for sure. I would have thought that, when it comes to decompression, that would give AMD CPUs a massive advantage over Intel, given Ryzen's performance edge in applications like 7-Zip etc. I guess there must also be a memory bottleneck, or the Infinity Fabric isn't able to keep up with the level of instructions demanded by the software.
 
Got this from Reddit about why Starfield's performance is a little poo.

The vkd3d (the DX12->Vulkan translation layer) developer has put up a changelog for a new version that is about to be released (here) and also a pull request with more information about what he discovered about all the awful things that Starfield is doing to GPU drivers (here).
Basically:
  1. Starfield allocates its memory incorrectly, without aligning it to the CPU page size. If your GPU drivers are not robust against this, the game is going to crash at random times.
  2. Starfield abuses a DX12 feature called ExecuteIndirect. One of the things this wants is some hints from the game so that the graphics driver knows what to expect. Since Starfield sends in bogus hints, the graphics drivers get caught off guard trying to process the data and end up creating bubbles in the command queue. These bubbles mean the GPU has to stop what it's doing, double-check the assumptions it made about the indirect execute and start over again.
  3. Starfield issues multiple `ExecuteIndirect` calls back to back instead of batching them, meaning the problem above is compounded multiple times (see the sketch below).
What really grinds my gears is the fact that the open-source community has figured this out and come up with workarounds to try to make this game run better. These workarounds are there for the public to view, but Bethesda will most likely not care about fixing their broken engine. Instead they double down and claim their game is "optimized" if your hardware is new enough.
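To make points 1 and 3 a bit more concrete, here's a rough D3D12 sketch of the two patterns: rounding an allocation up to the CPU page size, and the difference between issuing one ExecuteIndirect per draw versus batching them into a single call. It's purely illustrative of what the changelog describes; the command signature, buffers and counts are invented and this is not Starfield's actual code.

```cpp
#include <d3d12.h>
#include <windows.h>

// Point 1: round a suballocation up to the CPU page size so mapped memory
// never straddles a boundary the driver didn't expect.
SIZE_T AlignToPageSize(SIZE_T requestedSize)
{
    SYSTEM_INFO si{};
    GetSystemInfo(&si);
    const SIZE_T page = si.dwPageSize;              // typically 4096 bytes
    return (requestedSize + page - 1) & ~(page - 1);
}

// Point 3, as described: one ExecuteIndirect per draw, each call telling the
// driver to expect at most a single command, repeated back to back.
void DrawUnbatched(ID3D12GraphicsCommandList* cmdList,
                   ID3D12CommandSignature* drawSignature,
                   ID3D12Resource* argBuffer, UINT drawCount)
{
    for (UINT i = 0; i < drawCount; ++i)
    {
        cmdList->ExecuteIndirect(
            drawSignature,
            /*MaxCommandCount*/ 1,
            argBuffer,
            /*ArgumentBufferOffset*/ i * sizeof(D3D12_DRAW_INDEXED_ARGUMENTS),
            /*pCountBuffer*/ nullptr,
            /*CountBufferOffset*/ 0);
    }
}

// The batched alternative: one call covering every draw, with the real count
// read from a GPU-visible count buffer, so the driver sees the work up front.
void DrawBatched(ID3D12GraphicsCommandList* cmdList,
                 ID3D12CommandSignature* drawSignature,
                 ID3D12Resource* argBuffer, ID3D12Resource* countBuffer,
                 UINT maxDraws)
{
    cmdList->ExecuteIndirect(
        drawSignature,
        /*MaxCommandCount*/ maxDraws,
        argBuffer,
        /*ArgumentBufferOffset*/ 0,
        countBuffer,
        /*CountBufferOffset*/ 0);
}
```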
 
Another potential reason why amd cards are faster

When using an AMD card, fewer details in the Skybox are rendered

#LetTheControversyCook


That is on planets where you can see stars. Interesting how it's Wccftech again - didn't they start the DLSS controversy too? This was a bug reported 4 days ago. Maybe MLID is correct, that Nvidia CBA now and would rather play the sympathy card.

Yet in New Atlantis during the daytime, with no stars, etc., my mate's RX 6700 XT is still faster than my RTX 3060 Ti. You can look at the ground and it's still faster. You can go into The Well, which is an interior space, and it's still the same. Unless you think stars are rendering inside buildings. So it can't just be a lack of stars then.

We compared performance using the exact same settings, and I saw him play it too; we have the same CPU, same RAM settings, etc. We have a similar-speed SSD too.

Nvidia quite clearly CBA with releasing drivers for this game - AMD has had several releases; Nvidia hasn't. The last release for my card was three weeks ago.


Because when Nvidia isn't in the lead, it has to be foul play, obviously... :rolleyes: Maybe their drivers for it just aren't up to snuff.
BTW, @keef247 actually played the game on Nvidia's July launch drivers for their RTX 4070. They updated to the latest "Game Ready" drivers and there was no change in performance for them:

I did make it clear from the day-1 pre-order that it ran fine, and that I'd personally forgotten to even update my drivers since installing the card on the 26th of July :cry: I did update them but it made no difference, which means those early 4070 drivers have been pretty decent, as I've yet to have any issues in ANY game out of the 30-40 I've thrown at it since I finished building the rig on the 26th of July :D

So maybe Nvidia needs to fix its drivers too. But then OTOH they seem fine with their card, and even if my RTX 3060 Ti could do better, I can just drop a few settings and it's fine enough to complete the game with.

I expect when Cyberpunk 2.0 runs like crap on AMD/Intel hardware it will be put down to rubbish AMD hardware, drivers, poor dev relations, etc. When you switch on RT in Cyberpunk 2077 1.0, an RX 7900 XT goes from RTX 3090 level to the level of an RTX 3070 Ti.

I wonder if my RTX 3060 Ti will be quicker in Cyberpunk 2.0?! :cry:

#Totally normal behaviour.
[Chart: relative RT performance, 1920x1080]

[Chart: Cyberpunk 2077 RT, 1920x1080]
 