Cyberpunk 2077 Ultra performance

Not quite sure of the point of your post TBH - I'm quite familiar with these techniques and have coded more than one of them myself.

I was demonstrating, in response to certain people, that the type of reflections on water in Bioshock Infinite is not particularly new* and that there are reasons why that kind of reflection hasn't been used widely in games - reasons which ray tracing does solve. The reason for using an environment map like that was that, at the time, things like pixel shaders etc. didn't exist: you were left working with fairly basic tools, and almost anything that wasn't a standard DX/OGL function had to be done on the CPU.

* For other examples see, for instance, Serious Sam or the X-Isle demo (precursor to Far Cry): https://youtu.be/NwHEdySGbck?t=41 etc.
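
For anyone curious what that era's environment-mapped water actually boils down to, here is a minimal NumPy sketch of the two bits of maths involved: the reflection vector and the classic GL_SPHERE_MAP texture-coordinate generation. It's my own illustration of the general technique, not code from any of the games mentioned above.

```python
import numpy as np

def reflect(incident, normal):
    """Reflect a unit incident direction about a unit surface normal: R = I - 2(N.I)N."""
    return incident - 2.0 * np.dot(normal, incident) * normal

def sphere_map_uv(r):
    """Classic OpenGL GL_SPHERE_MAP texgen: map an eye-space reflection vector
    to UV coordinates in a single pre-rendered environment texture."""
    rx, ry, rz = r
    m = 2.0 * np.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5

# Example: a view ray hitting a flat water surface (normal pointing straight up in eye space).
view_dir = np.array([0.0, -0.6, -0.8])     # looking down and forward
water_normal = np.array([0.0, 1.0, 0.0])
r = reflect(view_dir, water_normal)
print("reflection vector:", r)             # roughly [0, 0.6, -0.8]
print("env-map UV:", sphere_map_uv(r))
```

With fixed-function hardware the texgen stage did exactly this per vertex; anything fancier (ripples, per-pixel distortion) had to be faked on the CPU or skipped, which is why those reflections tended to be static and low detail.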

It comes with code you can use. Interesting for the pathtracing project.
 
yeah I need a new cpu

Also, DLSS off vs the quality slider: it smears the lighting as well, and it looks like it's coded to reduce the shadow distance. But that could just be the game being bugged. https://imgsli.com/MzYyMTQ

RT on, DLSS off: [screenshot]

RT on, DLSS Quality: [screenshot]

RT on, DLSS Ultra Performance (smearing): [screenshot]

I have done every quest and all the endings, and even bought all the cars. I get massive FPS drops with RT on for no reason. When I close the game it hangs in Steam, so I then have to close Steam as well. If I relaunch Steam and restart the game, the FPS returns to normal, though there are still some areas that give low FPS. Raster gives no issues like that at all.
 
The only time I've had any crashing was when messing about with the game in Windows 7; in 10 I've not had a single crash so far. Performance is pretty consistent up until around two hours of constant playing, when it suddenly takes a dive. From what I can see, at that point VRAM use jumps up by around 500MB (still under the VRAM amount of my GPU), RAM use quickly goes up by around 2GB, and it seems to be constantly streaming a lot of data between RAM and the GPU. I'm guessing it is hitting some kind of active resource management scenario to try and stay within a certain allocation/pool. It will happily use over 8GB of VRAM if I turn it up to 4K without DLSS, though.
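
If anyone wants to catch that jump in the act, logging VRAM over a session makes it obvious where it happens. The rough Python sketch below polls nvidia-smi with its standard memory query flags; the one-minute interval and single-GPU assumption are mine.

```python
import subprocess
import time

# Minimal VRAM logger: poll nvidia-smi once a minute so a sudden jump in
# memory use (e.g. after ~2 hours of play) shows up with a timestamp.
QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

while True:
    line = subprocess.run(QUERY, capture_output=True, text=True,
                          check=True).stdout.splitlines()[0]  # first (only) GPU
    used_mib, total_mib = (int(v) for v in line.split(","))
    print(f"{time.strftime('%H:%M:%S')}  VRAM {used_mib} / {total_mib} MiB")
    time.sleep(60)
```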
 
see edit before you posted this

I played around with DLSS and found the same. At 1440p, Ultra Performance and Quality didn't differ that much. Perhaps this is because Quality is already rendering from roughly 960p at 1440p, while the lower modes drop even further. Ampere is not as strong at the lower resolutions, as can be seen in comparisons with RDNA2.
 
I played around with DLSS and found the same. At 1440p, Ultra Performance and Quality didn't differ that much. Perhaps this is because Quality is already rendering from roughly 960p at 1440p, while the lower modes drop even further. Ampere is not as strong at the lower resolutions, as can be seen in comparisons with RDNA2.

This is just not true. Ampere is faster than RDNA2 at all resolutions. The only time the RDNA2 cards pull ahead is with SAM. https://www.techpowerup.com/review/amd-radeon-rx-6900-xt/35.html



As you can see in the 1080p chart from that review, the RTX 3080 beats both the 6800 XT and 6900 XT without SAM.



In the 1440p chart, the RTX 3080 beats the 6800 XT without SAM and is 1% behind the 6900 XT. SAM is a feature of one CPU series and should not be used in GPU benchmarks; the same goes for Resizable BAR. Results with SAM or Resizable BAR should be listed separately because they are CPU-platform features: the primary PCIe lanes are on the CPU, and it takes a BIOS update to enable SAM or Resizable BAR. At the moment Resizable BAR looks like it could be even better on Intel Z490 systems. https://youtu.be/2CLYpfCY9iw

Up to 20% extra performance from Resizable BAR on Intel Z490 systems: https://www.tomshardware.com/uk/news/resizable-bar-intel-z490-motherboard-benchmarks
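
For what it's worth, a quick way to sanity-check whether Resizable BAR / SAM is actually active (on Linux at least) is to look at the GPU's prefetchable BAR size: the legacy setup exposes a 256 MiB window, while Resizable BAR grows it to roughly the VRAM size. The sketch below is my own heuristic built on lspci output, not an official check.

```python
import re
import subprocess

# Heuristic Resizable BAR check: scan lspci for prefetchable BAR sizes.
# Legacy window = 256 MiB; with Resizable BAR / SAM the GPU's BAR is roughly
# VRAM-sized. Scanning every device is blunt, so cross-check against the
# GPU's own lspci entry if the result looks odd.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
sizes_mib = [
    int(num) * (1024 if unit == "G" else 1)
    for num, unit in re.findall(r"prefetchable\) \[size=(\d+)([MG])\]", out)
]
largest = max(sizes_mib, default=0)
verdict = ("Resizable BAR likely active" if largest > 256
           else "looks like the legacy 256 MiB window")
print(f"largest prefetchable BAR: {largest} MiB ({verdict})")
```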
 
This is just not true. Ampere is faster than RDNA2 at all resolutions. The only time the RDNA2 cards pull ahead is with SAM.

Please have the decency to base your assessment on more than just the TPU numbers. There are plenty of reviews showing the 6800 XT beating the 3080 at 1080p and 1440p. The truth is they are so close in most games, even at 4K, that the few percent swings mean nothing. There are some outliers here and there for both, but making the overly generalised claim that Ampere beats RDNA2 at all resolutions is simply not accurate.

Apart from a few RT and DLSS games Ampere and RDNA2 are very closely matched in the majority of modern games. On the other hand RDNA2 offers 16GB VRAM. My own tests from a 6800 (non XT) and 3080 FE in CP2077 and other recent games such as WDL had the 6800 no more than 15% slower at 4K. When I enabled SAM this gap dropped by maybe 3% at 4K (would be higher at 1080p and 1440p). So like it or not SAM is a thing and for those with a 5000 CPU and RDNA2 GPU it is free GPU performance. So again the performance difference apart from some outliers is very close at all resolutions.

If RT and DLSS are essential, then get Ampere. If not then the extra VRAM of RDNA2 is a big bonus, especially at 4K. Well that is assuming you can find one of course :)

See, I can cherry pick charts as well.
[benchmark charts]
 
Please have the decency to base your assessment on more than just the TPU numbers. There are plenty of reviews showing the 6800 XT beating the 3080 at 1080p and 1440p. The truth is they are so close in most games, even at 4K, that the few percent swings mean nothing. There are some outliers here and there for both, but making the overly generalised claim that Ampere beats RDNA2 at all resolutions is simply not accurate.

Apart from a few RT and DLSS games Ampere and RDNA2 are very closely matched in the majority of modern games. On the other hand RDNA2 offers 16GB VRAM. My own tests from a 6800 (non XT) and 3080 FE in CP2077 and other recent games such as WDL had the 6800 no more than 15% slower at 4K. When I enabled SAM this gap dropped by maybe 3% at 4K (would be higher at 1080p and 1440p). So like it or not SAM is a thing and for those with a 5000 CPU and RDNA2 GPU it is free GPU performance. So again the performance difference apart from some outliers is very close at all resolutions.

If RT and DLSS are essential, then get Ampere. If not then the extra VRAM of RDNA2 is a big bonus, especially at 4K. Well that is assuming you can find one of course :)

See, I can cherry pick charts as well.
[benchmark charts]

Just to add to this, someone over on reddit decided to plot the average 1440p results on a per-game basis. Ignore the blue line. Depending on which games you cherry-pick, you can form any conclusion you want.

https://www.reddit.com/r/Amd/comments/kog7z2/6800_xt_vs_3080_game_age_scatterplot_17_sources/

[scatter plot]
 
Please have the decency to base your assessment on more than just the TPU numbers. There are plenty of reviews showing the 6800 XT beating the 3080 at 1080p and 1440p. The truth is they are so close in most games, even at 4K, that the few percent swings mean nothing. There are some outliers here and there for both, but making the overly generalised claim that Ampere beats RDNA2 at all resolutions is simply not accurate.

Apart from a few RT and DLSS games Ampere and RDNA2 are very closely matched in the majority of modern games. On the other hand RDNA2 offers 16GB VRAM. My own tests from a 6800 (non XT) and 3080 FE in CP2077 and other recent games such as WDL had the 6800 no more than 15% slower at 4K. When I enabled SAM this gap dropped by maybe 3% at 4K (would be higher at 1080p and 1440p). So like it or not SAM is a thing and for those with a 5000 CPU and RDNA2 GPU it is free GPU performance. So again the performance difference apart from some outliers is very close at all resolutions.

If RT and DLSS are essential, then get Ampere. If not then the extra VRAM of RDNA2 is a big bonus, especially at 4K. Well that is assuming you can find one of course :)

See, I can cherry pick charts as well.
[benchmark charts]
And that's with SAM disabled. ;)
 
And that's with SAM disabled. ;)

I can't see SAM making up the difference...

DirectX Raytracing Feature Test

1 GPU
  1. Score 65.23, GPU 3090 @2250/5512, CPU 10900k @5.3, Post No.0491, Jay-G25 - Link Drivers 460.89
  2. Score 64.34, GPU 3090 @2235/5344, CPU 7820X @4.7, Post No.0489, anihcniedam - Link Drivers 460.89
  3. Score 63.87, GPU 3090 @2205/5328, CPU 6950X @4.401, Post No.0496, FlyingScotsman - Link Drivers 460.89
  4. Score 63.14, GPU 3090 @2265/4876, CPU 5950X @4.8, Post No.0462, OC2000 - Link Drivers 460.79
  5. Score 62.98, GPU 3090 @2205/5328, CPU 9900KF @5.0, Post No.0379, spartapee - Link Drivers 457.09
  6. Score 62.38, GPU 3090 @2160/4976, CPU 9900k @5.0, Post No.0480, Raiden85 - Link Drivers 460.89
  7. Score 61.61, GPU 3090 @2115/5128, CPU 9980XE @4.5, Post No.0487, Greebo - Link Drivers 460.89
  8. Score 60.23, GPU 3090 @2145/5176, CPU 3175X @4.8, Post No.0415, sedy25 - Link Drivers 457.30
  9. Score 59.34, GPU 3090 @2070/4976, CPU 5950X @4.965, Post No.0474, Grim5 - Link Drivers 460.89
  10. Score 58.58, GPU 3090 @2100/5276, CPU 3600X @4.4, Post No.0445, Bickaxe - Link Drivers 457.51
  11. Score 55.57, GPU 3090 @1980/4876, CPU 5950X @4.1, Post No.0429, Kivafck - Link Drivers 457.30
  12. Score 55.57, GPU 3090 @1995/4876, CPU 10900k @5.1, Post No.0357, Sedgey123 - Link Drivers 457.09
  13. Score 55.50, GPU 3090 @2085/5076, CPU 3800X @4.7, Post No.0450, ChrisUK1983 - Link Drivers 457.51
  14. Score 55.47, GPU 3090 @2040/4876, CPU 5900X @3.7, Post No.0423, atomic7431 - Link Drivers 457.30
  15. Score 54.39, GPU 3090 @1905/5176, CPU 10900k @5.2, Post No.0446, kipperthedog - Link Drivers 457.51
  16. Score 52.24, GPU 3080 @2235/5252, CPU 3900X @4.649, Post No.0413, haszek - Link Drivers 457.09
  17. Score 50.56, GPU 3080 @2145/5248, CPU 3600 @4.4, Post No.0411, TNA - Link Drivers 457.30
  18. Score 34.15, GPU 6900XT @2625/4280, CPU 5800X @5.049, Post No.0477, 6900 XT - Link Drivers 20.12.2
  19. Score 33.31, GPU 3070 @2085/4050, CPU 3175X @4.12, Post No.0392, sedy25 - Link Drivers 457.09
  20. Score 32.54, GPU 2080 Ti @2130/3500, CPU 3950X @4.301, Post No.0357, Grim5 - Link Drivers 452.06
  21. Score 29.91, GPU 2080 Ti @1980/3500, CPU 8700 @4.3, Post No.0391, Quartz - Link Drivers 456.55
  22. Score 23.96, GPU 6800 @2295/4220, CPU 3900X @4.541, Post No.0459, Chrisc - Link Drivers 20.12.1
  23. Score 21.36, GPU 2080 @2025/4050, CPU 9900k @5.0, Post No.0365, Cooper - Link Drivers 457.09
 
Great tests done by Sapphire!


I don't see AMD here with RT either.... Very strange :p


This last pic made me chuckle.

Thanks for the pics, that quest situation was the only time I actually wanted RT Reflections on (without the side-by-sides), so I was wondering how big the difference really was. Not as large as I expected, but it's gonna be greater in motion I guess.
 
Can anyone verify which version of DLSS is best to use at 1440p?

I play on balanced currently but have to turn ray tracing off. I have a 2080ti but can’t stand playing at around 50-60 frames even when reducing some settings from ultra.

Playing with ray tracing off gives me around 80-100 frames which is great with DLSS balanced. I do miss ray tracing.

Would I notice a significant difference if I went to performance?
 
Can anyone verify which version of DLSS is best to use at 1440p?

I play on balanced currently but have to turn ray tracing off. I have a 2080ti but can’t stand playing at around 50-60 frames even when reducing some settings from ultra.

Playing with ray tracing off gives me around 80-100 frames which is great with DLSS balanced. I do miss ray tracing.

Would I notice a significant difference if I went to performance?

If your native resolution is 1440p then 'performance' will be upscaling from a 720p image and you'll likely be CPU bottlenecked at that resolution.

I wouldn't advise it
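
For reference, the internal render resolutions work out roughly as below, assuming the commonly quoted DLSS 2.x per-axis scale factors (titles can tweak these, so treat the numbers as approximate):

```python
# Approximate DLSS 2.x internal render resolutions at a 2560x1440 output.
# The scale factors are the commonly quoted defaults, not per-game guarantees.
SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALES:
    w, h = internal_resolution(2560, 1440, mode)
    print(f"1440p {mode}: {w}x{h}")
# Quality ~1707x960, Balanced ~1485x835, Performance 1280x720, Ultra Performance ~853x480
```

So Performance really is working from a 720p source at 1440p, and whether the extra frames actually show up depends on how CPU-limited the scene already is.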
 
Can anyone verify which version of DLSS is best to use at 1440p?

I play on balanced currently but have to turn ray tracing off. I have a 2080ti but can’t stand playing at around 50-60 frames even when reducing some settings from ultra.

Playing with ray tracing off gives me around 80-100 frames which is great with DLSS balanced. I do miss ray tracing.

Would I notice a significant difference if I went to performance?

I personally find lower FPS okay as long as the game doesn't feel sluggish. Try RT on with Windows hardware-accelerated GPU scheduling and Nvidia Low Latency Mode set to Ultra.
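
If you want to confirm hardware-accelerated GPU scheduling is actually on before comparing, the Settings toggle is (as far as I know) backed by the HwSchMode registry value, where 2 means enabled and 1 means disabled. A read-only Python check:

```python
import winreg

# Read-only check of the hardware-accelerated GPU scheduling (HAGS) state.
# HwSchMode: 2 = enabled, 1 = disabled; absent usually means default/unsupported.
# Flip the setting via Settings > System > Display > Graphics settings, not here.
KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        value, _ = winreg.QueryValueEx(key, "HwSchMode")
    print("HAGS enabled" if value == 2 else f"HAGS disabled (HwSchMode={value})")
except FileNotFoundError:
    print("HwSchMode not set - HAGS off by default or unsupported")
```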
 