Being a 'one trick pony' worked out quite well for me to be honest.

wow lol, bit of a biased list, though I must commend you for sticking to your guns even through the times when anything AMD made was pure rubbish next to Nvidia. I couldn't do that.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Then you can always throw in the fact that "ultra/epic/very high" settings generally bring little to nothing for a huge perf hit. So essentially, if you adjust settings properly in the first place and know what the limits of said card are, you can quite easily make a product last much longer.
What HU advised is good and sensible tbh. Games tend not to offer great scaling options, and as they have shown, "nightmare" or whatever the highest setting is named in the menus seems to barely differ from, say, ultra. Environment settings seem to be a good one for clouds, reflections and distance, so it's down to how real you want it to look; adjust for your own experience.
The thing is, with this flexibility you ruin the point of benchmarking - the control aspect of any test/experiment is keeping all conditions the same. If users drop settings to gain higher fps, then surely they won't run into the issues others experience, which is where the comparisons get woolly.
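To put that control point in code terms - an illustrative sketch, not taken from any actual benchmarking tool, with made-up setting names and FPS numbers: two runs only tell you anything if every quality setting matches.

```python
# Illustrative only: two benchmark runs are comparable only if every
# quality setting matches; otherwise the FPS gap proves nothing.
def comparable(run_a: dict, run_b: dict) -> bool:
    settings = ("resolution", "texture_quality", "ray_tracing", "dlss")
    return all(run_a[s] == run_b[s] for s in settings)

# Hypothetical numbers for the sake of the example.
reviewer_run = {"resolution": "1440p", "texture_quality": "ultra",
                "ray_tracing": "high", "dlss": "off", "avg_fps": 51}
user_run     = {"resolution": "1440p", "texture_quality": "high",
                "ray_tracing": "high", "dlss": "off", "avg_fps": 68}

# False: texture quality differs, so the 17 FPS gap says nothing about
# the cards - the user simply isn't running the reviewer's test.
print(comparable(reviewer_run, user_run))
```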
I look forward to getting a 16GB card. As soon as I do I plan on coming in here to say 10GB is not enough for the lols!

I looked forward to getting a 10GB 3080 to find out for myself, and got one just before they upgraded it to 12GB.

To be fair it ain't a free upgrade, that extra 2GB costs more. As far as I know there is no FE for that version. If money ain't an issue then I suggest people go for the 3090 and its 24GB.


I read an article earlier where Nvidia have found a way to get 2x extra ray tracing performance, so looking forward to the 4070/4080.

I was expecting that from the 3000 series but was disappointed. Would not be surprised if we get 2x RT performance on the 4000 series. Plus you know what Nvidia are like: as soon as it is out, the 3000 series won't get any further love, and all the work squeezing extra performance out of game-ready drivers will go to the 4000 series. Plus it's much more fun getting the latest gear each gen. It does not have to cost the world either when you sell your old card.

https://www.computerbase.de/2022-02/cyberpunk-2077-patch-1.5-benchmark/

Ray tracing costs even more performance than before
Ray tracing has always cost a lot of performance in Cyberpunk 2077, regardless of the hardware, and that hasn't changed. [...]
So it's not surprising that a rendering resolution of 2,560 × 1,440 is too much even for the fast Ampere model. There is no more than 39 FPS even with medium ray tracing details, and with high ray tracing details it is only 51 percent of that - because the GeForce RTX 3080 also runs out of its 10 GB of memory. As a result, Ultra HD without DLSS isn't even worth looking at, since it is already completely unplayable with RT on "Medium".
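For a sense of scale, a rough back-of-envelope in Python - the bytes per pixel and target count below are illustrative guesses, not CD Projekt's actual renderer layout: the per-frame render targets alone grow linearly with pixel count, and textures, geometry and the RT acceleration structures all come on top of that while the card's 10 GB stays fixed.

```python
# Rough scaling of render-target memory with resolution. The format and
# target count below are assumptions for illustration only.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
BYTES_PER_PIXEL = 8   # e.g. an RGBA16F target
NUM_TARGETS = 12      # hypothetical G-buffer + post-processing chain

for name, (w, h) in RESOLUTIONS.items():
    mib = w * h * BYTES_PER_PIXEL * NUM_TARGETS / 1024**2
    print(f"{name}: ~{mib:,.0f} MiB in render targets alone")
```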



Damn, even Nvidia sponsorships aren't enough to keep the vram thirst at bay (at least not past the launch window winkwink nudgenudge):
https://www.computerbase.de/2022-02/cyberpunk-2077-patch-1.5-benchmark/
inb4 "bUt yOu dOn'T uNdersTanD, thE gAmE is unOptimisEd!1!!1"

Speaking of unplayable: ray tracing on a Radeon was unplayable in Cyberpunk 2077 and remains so. The Radeon RX 6800 XT is already crawling along in 1,920 × 1,080 with RT on "Medium" at just under 38 FPS; the performance loss is a whopping 63 percent. And with RT on "High" the framerate drops to just under 26 FPS - that's 75 percent slower than without the rays. Ray tracing isn't even worth trying on the AMD flagship Radeon RX 6900 XT.
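Quick sanity check on those figures - if the article's percentages are consistent, both data points should imply roughly the same no-RT baseline:

```python
# 38 FPS at a 63% loss and 26 FPS at a 75% loss should both point to
# about the same framerate without ray tracing.
for fps, loss in [(38, 0.63), (26, 0.75)]:
    baseline = fps / (1 - loss)
    print(f"{fps} FPS with a {loss:.0%} loss -> ~{baseline:.0f} FPS without RT")
```

Both work out to roughly 103-104 FPS without RT, so the numbers hang together.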

Where is this? RT is improved after the patch. We now have RT local shadows where previously we only had sun shadows, so FPS may drop slightly with RT shadows turned on, since objects now cast them.
https://wccftech.com/cyberpunk-2077...-on-pc-thanks-to-partnership-with-nvidia/amp/
Still seems to run the same for me with all RT on and where available, set to Psycho.

Well let's be honest, this thread is now just a troll. Has been for pages.


FSR looks garbage v DLSS.
A good example on reddit showing not just why DLSS outperforms FSR, but that it can also behave better than native:
https://www.reddit.com/r/nvidia/comments/svpvg9/dlss_in_cyberpunk_15_fixes_flickering_metal_fence/
This also proves that those who buy Nvidia have a more defined member!
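For what it's worth, the fence result makes sense: a thin wire covers only a fraction of a pixel, so one point sample per frame pops between wire and background, while a temporal upscaler jitters the sample position and accumulates results over frames. A toy model of that principle - not any vendor's actual algorithm:

```python
import random

# A thin fence wire covers ~30% of a pixel. A single point sample per
# frame returns either full wire or full background (the flicker);
# jittered samples averaged over time converge on the true coverage.
random.seed(1)
WIRE_COVERAGE = 0.30
total = 0.0

for frame in range(1, 17):
    sample = 1.0 if random.random() < WIRE_COVERAGE else 0.0  # jittered hit test
    total += sample
    accumulated = total / frame                               # temporal average
    print(f"frame {frame:2d}: raw={sample:.0f}  accumulated={accumulated:.2f}")
```

The raw column flickers between 0 and 1 - that's the shimmering fence - while the accumulated value settles towards the wire's true 30% coverage.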
Also, there is something wrong with the RT in the game. I start out in the 50s, but after an hour of gameplay the fps just goes down into the forties and stays there permanently. The only way to fix it is to reload the save, after which my fps goes back into the fifties.
On the whole that's a decent analysis, but I'm not sure how they concluded the 3080 'ran out of VRAM'. Typically, when there's not enough buffer memory on the graphics card, the minimums fall off a cliff as the game has to send data to slower system memory, but looking at the 1% results that's not happening.
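For anyone wanting to check that on their own captures, here's one common way of deriving "1% lows" from a frametime log (outlets differ slightly in the exact definition, so treat this as an assumption about the method):

```python
# Average the slowest 1% of frames and convert back to FPS. A VRAM
# overflow into system RAM shows up as occasional huge frametimes,
# which crater this figure even when the average FPS looks fine.
def one_percent_low(frametimes_ms):
    slowest = sorted(frametimes_ms, reverse=True)        # worst frames first
    worst = slowest[: max(1, len(slowest) // 100)]       # the slowest 1%
    return 1000.0 / (sum(worst) / len(worst))            # mean frametime -> FPS

# Synthetic capture: 990 smooth 100 FPS frames plus 10 big 100 ms stutters.
capture = [10.0] * 990 + [100.0] * 10
print(one_percent_low(capture))  # ~10 FPS 1% low vs ~92 FPS average
```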

The only thing I can take from this is that ALL cards run out of grunt before running out of VRAM, unless of course you wish to play at sub-20 FPS or worse.
