Cyberpunk 2077 Ultra performance

I think a modern game having to rely on DLSS, even with the latest hardware, to get 60+ FPS is a bit poo. Is it ray tracing's fault? Poor optimisation? Or are they simply pushing the boundaries of computer graphics?
DLSS was marketed so aggressively that Nvidia and most YT influencers would have you believe it's better than native, but it is what it is; imo it's a better upscaler, but it's never been better than native.
 
I think DLSS is amazing, don't get me wrong, but you shouldn't need it on a 3080/3090. I guess it could start a trend of developers relying on users having it, which would make the experience **** for everyone else. Bit worrying.
 
Tom's Hardware: "AMD's latest RDNA2 GPUs did quite well at medium quality, but the move to ultra quality puts the 3080 and 3090 at the top of the charts. It's probably just a matter of memory bandwidth, especially at 1440p and 4K. There's only so much a large L3 cache can do before the GPU simply needs a wider and faster memory interface, and GDDR6X provides that on Nvidia's 3080 and 3090."

Chart: 3-4 FPS difference between the 3080/3090 and the 6800 XT/6900 XT at 4K.
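That bandwidth argument can be sanity-checked with back-of-the-envelope arithmetic from the cards' published memory specs (data rate per pin times bus width); this little sketch is just illustrative:

Code:
# Back-of-the-envelope memory bandwidth from the published specs:
# bandwidth (GB/s) = data rate (Gbps per pin) * bus width (bits) / 8
cards = {
    "RTX 3080 (GDDR6X)":  (19.0, 320),
    "RTX 3090 (GDDR6X)":  (19.5, 384),
    "RX 6800 XT (GDDR6)": (16.0, 256),
    "RX 6900 XT (GDDR6)": (16.0, 256),
}
for name, (rate_gbps, bus_bits) in cards.items():
    print(f"{name}: {rate_gbps * bus_bits / 8:.0f} GB/s")
# 760 and 936 GB/s for the GDDR6X cards vs 512 GB/s for RDNA2, which the
# 128 MB Infinity Cache has to make up; at 4K the working set outgrows it.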
 

It's like the authors of these articles think the readers are all 10 years old and not very bright.
 
DLSS is great for what it really is: post-processing image enhancement. It's not a magic wand that will turn a 720p image into a 1440p image. This is a case of Nvidia never missing a marketing trick: "with this button you can get 60% more FPS". Not without a cost to image quality it isn't, and yes, tech influencers have fallen for it hook, line and sinker.
 

They know the Emperor is only wearing his underwear but they're happy to compliment his robes ;)
 
It basically is that magic wand. I don't know anyone who is playing Cyberpunk with it off; even the biggest IQ snob won't turn down a 50% FPS increase for an image that is functionally identical in a game as demanding as CP2077.

Assuming a target of 60 FPS, DLSS + full RT looks a generation ahead of standard raster + no DLSS.
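To put rough numbers on that: DLSS 2 renders internally at a fixed fraction of the output resolution and reconstructs the rest. A minimal sketch using the per-axis render scales commonly reported for its presets (treat the exact factors as approximate):

Code:
# Pixel-count arithmetic for DLSS 2 presets at a 4K output target.
out_w, out_h = 3840, 2160
modes = {  # approximate per-axis render scales
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}
for mode, s in modes.items():
    w, h = round(out_w * s), round(out_h * s)
    saved = 1 - (w * h) / (out_w * out_h)
    print(f"{mode}: renders {w}x{h}, shades ~{saved:.0%} fewer pixels")
# Performance mode shades a quarter of the pixels of native 4K, which is
# where the big FPS gains come from; it isn't a 4x speedup because
# per-frame costs (geometry, BVH updates, the upscale pass) don't shrink.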
 
Is it likely to be about a 30-40% performance boost with DLSS enabled at 1440p and 4K? If so, it probably won't be hitting 60 FPS at 4K on an RTX 3070.
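Quick arithmetic on that question (the 30-40% figures are the guess above, not measured numbers):

Code:
# Native FPS needed to land at 60 FPS after a 30-40% DLSS uplift.
target = 60
for boost in (0.30, 0.40):
    print(f"+{boost:.0%} uplift -> need {target / (1 + boost):.1f} FPS native")
# ~46.2 and ~42.9 FPS native. If a 3070 sits below ~43 FPS at native 4K
# ultra, 60 FPS stays out of reach even with the uplift.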
 

I like a cool and quiet system. If DLSS means a smaller workload then that's a good thing. DLSS v1 was bad; I don't think even the biggest fanboys would deny that. V2, however, seems to be doing a good job.

 
I think a modern game having to rely on DLSS, even with the latest hardware, to get 60+ FPS is a bit poo. Is it ray tracing's fault? Poor optimisation? Or are they simply pushing the boundaries of computer graphics?

It's DX12's fault. DX12 seems like an API meant for the uber-nerds at id Software; the rest are just going to choke on it. There's no point building proprietary game engines anymore given the increased development complexity; just license one from the market.
 
They know the Emperor is only wearing his underwear but they're happy to compliment his robes ;)

Yea. Pretty much this.

I like a cool and quiet system. If DLSS means a smaller workload then that's a good thing. DLSS v1 was bad; I don't think even the biggest fanboys would deny that. V2, however, seems to be doing a good job.


DLSS 2 is much better than DLSS 1, but that is not saying much given that DLSS 1 set a very low bar: it was bad, plain bad. And while DLSS 2 is not at all bad, it's not as good as native resolution. It's not a resolution replacement tool.
 
It's a very demanding game even at pure raster. The old consoles are basically choking to death on it, and the new consoles have to strip features back just to run the last-gen version at 60 FPS.

And that's all without any RT, which on ultra has about five different layers of implementation, ranging from shadows to reflections to dynamic GI. It's pushing the envelope more than Crysis did. That we have a technology in DLSS that lets us play this at playable framerates at all, let alone 60 FPS at 4K with no real-world drop in IQ, is a miracle.

I love this video; it reminds us of how Crysis ran on a top-of-the-line rig at the time:


What we'd have done for DLSS 2 back then.
 
It's not a resolution replacement tool.

It depends. If you want a higher level of detail than native rendering can provide at a usable frame rate, then it does indeed become a resolution replacement tool. Does the detail exactly match what the developers placed? No. DLSS has replaced that intended detail with what the ML model thinks fits best.
 
Can the consoles handle the game at 60 FPS at 4K resolution?

Not without a lot of work. If they want ray tracing on consoles it's going to need a whole lot more work, because being an Nvidia title it's no doubt designed for their architecture, like Control, and we all know how horrendously bad the RT performance is in that game on the RX 6000 cards: the FPS are barely into double digits; it's basically completely broken. To get that to work well, and work well it will need to, they will have to rewrite the code for AMD's architecture. Have fun doing that, CD Projekt.
 

Control doesn't run too badly at all on RDNA2 when you consider how little dedicated RT hardware there is.

https://www.eurogamer.net/articles/digitalfoundry-2020-amd-radeon-rx-6800-and-6800-xt-review?page=5
Code:
Performance Analysis | Control: DX12, High, High RT, TAA, 1440p

GPU         FPS
RTX 3080    70
RTX 3090    78
RX 6800     39
RX 6800 XT  43
 