Seems people see what they want to see
Pretty much this.
Even consoles are having some frame latency issues, as "evidenced", and heck, the developers have even acknowledged the issues on PC and are actively working on fixing them....
Also love how certain people selectively nitpick certain figures to try to prove their point and conveniently ignore the other stats which throw their argument out the window.
Even though it's rather pointless, here are some more benchmarks from other sources for people who want to see the full picture....
https://www.pcgameshardware.de/Deat...rk-Test-Review-Systemanforderungen-1379400/2/
Sadly it seems they only enabled ray-traced shadows and not ray-traced ambient occlusion for that though....
https://www.dsogaming.com/pc-perfor...p-ray-tracing-amd-fsr-benchmarks-comparisons/
They only have a 3080 and 6900xt.
But yeah, looks like having "only" 10GB is causing all kinds of issues, eh?
Too much entitlement from PC gamers.
They buy a game designed for the 16GB of memory on next-gen consoles, then complain when their PC runs out of VRAM on a 10GB card. Oh boo hoo.
If you can't even match the console's specs, then you can't complain about console games not running as they do on console.
I'll bite....
Firstly, consoles don't have 16GB "dedicated" to VRAM; it is shared memory.
We still haven't seen an example where 3080 etc. owners have had to turn down settings because of "vram", other than a very select few trying to run a game at some ridiculous VR resolution and/or using 50+ mods (something which consoles can't handle/cope with in the same way). Claims surrounding certain games, i.e. Resident Evil Village and Godfall, have been proven wrong, and in fact, once ray tracing is turned on, AMD cards actually suffer with performance because of their lack of RT grunt. Of course, now that they have FSR, it's not as much of a problem.
As shown by Digital Foundry comparisons, consoles still have reduced settings and/or adaptive resolution scaling in order to hold 4k @ 60, whereas most high end GPUs don't need to compromise to the same extent.
Throw in ray tracing (and very limited ray tracing at that) and consoles have to pick between ray tracing and high resolution + 60fps. This is extremely evident from the Metro Enhanced comparison; even look at the best ray-traced game the PS5 has (Spider-Man) and all the compromises it still has to make to achieve ray-traced visual fidelity.
I had to turn down settings on day one of owning my 3080 whilst using less than 6GB of vram.
I didn't expect the GPU to be strong enough to max out all my games to begin with, so no big deal. I may someday need to lower settings due to running out of vram, but I have already lost my max-all-the-things virginity, so there won't be a lot of fanfare if/when that happens.
This guy gets it.
Doesn't matter what GPU it is, everyone is going to have to turn down settings at some stage, be that due to VRAM, ray tracing and/or the sheer grunt required.
The same could be argued for AMD GPUs and having to turn down/off graphical features. Since day one, they have basically had to turn ray tracing off due to lack of grunt, and not having any FSR for a good 8/9 months meant it was a complete no-go in any games which had ray tracing. Now that they have FSR, ray tracing is a bit more usable, but those RT settings still have to be carefully fine-tuned on AMD GPUs, but I guess that isn't such a big deal eh???
I personally would sooner turn down other settings than turn off/reduce ray tracing settings. Cyberpunk, Control, Metro, The Ascent etc. have completely spoiled me for visuals; everything else just feels "last gen" now.