
Cyberpunk 2077 Ultra performance

Have to say the 3090 purchase is feeling good right now; this is a 29-page thread full of issues/fixes that I can just ignore while I carry on playing at max settings with no problems.

The only thing I did was cap my frames at 60 to keep the GPU heat/noise down. I was getting around 76-82 fps maxed out at 3440x1440 with ray tracing on and DLSS in quality mode, but my room was like an oven and I could hear the fans through my open-back headphones.
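For anyone wondering why a cap helps so much with heat: a frame limiter just makes the GPU sit idle for whatever is left of each frame's time budget, so it draws less power. A minimal sketch of the idea (the render_frame stub and its ~12 ms timing are made-up placeholders, not measurements):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 fps

def render_frame():
    # Placeholder for the real render work; pretend the GPU
    # finishes a frame in ~12 ms (roughly 80 fps uncapped).
    time.sleep(0.012)

for _ in range(600):  # 10 seconds of capped "frames"
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # The idle wait is where the heat/noise saving comes from:
        # the GPU isn't rendering during this slice of the frame.
        time.sleep(FRAME_BUDGET - elapsed)
```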
 
Easy: just do some undervolting. I got my 3090 to a faster core clock and 12°C lower temps by doing this.

The 3090 sits at a constant 2000 MHz now, at 850 mV and 68°C,

vs.
averaging 1850 MHz (jumping to 2000 MHz) at 1025 mV and 78-80°C before.


It also saves 50-90 W of heat :)
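Those numbers roughly check out against the usual dynamic-power rule of thumb, P ∝ f·V². A quick back-of-the-envelope (the 350 W stock board power is my assumption, purely for illustration):

```python
# Rough sanity check of the undervolt saving using the common
# dynamic-power approximation P ~ f * V^2. This is an idealisation:
# real board power includes memory and static draw that don't scale.

stock = {"mhz": 1850, "mv": 1025}      # averaging 1850 MHz at 1025 mV
undervolt = {"mhz": 2000, "mv": 850}   # constant 2000 MHz at 850 mV

ratio = (undervolt["mhz"] / stock["mhz"]) * (undervolt["mv"] / stock["mv"]) ** 2
print(f"relative core power: {ratio:.2f}")        # ~0.74

BOARD_POWER_W = 350  # assumed stock 3090 board power (illustrative)
print(f"estimated saving: ~{BOARD_POWER_W * (1 - ratio):.0f} W")  # ~90 W
```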
 
Not a 'fair' comparison, but the RTX 3070 or RTX 2080 Ti with DLSS (Balanced) gets about double the minimum FPS of the RX 6800 at 4K ultra:

[benchmark chart image]


AMD needs to get their s*** together and work with game devs to support an alternative to DLSS.

I think it's pretty terrible for PC gamers that there isn't some non-proprietary option supported by both AMD and Nvidia.

RT still not worth the performance hit in my view...
 

Actually, that's a great idea. I'll give the undervolt a try, thanks!
 
Does the game combine temporal AA with DLSS?

Or does it disable the built-in temporal AA when DLSS is on?

Personally, I like the temporal AA, but I'm playing on an R9 390 at less than 1080p :D
 
Hopefully, CDPR can bring DLSS to the Witcher 3 next year too.
Does it need it?

Even old cards (a 1080 Ti, say) can run it quite well at 4K.
However, if it's an easy addition, as Nvidia suggests, then I don't see why not.

I actually tried it today: 4K, everything maxed, on the 3090, 148 fps.

I'm redownloading all the visual mods to go through it again, as I've got 490 hours in W3.
 

The sad thing is that it's always up to AMD to bring a non-proprietary feature so everyone can use it. Nvidia does not entertain such a thing at all. AMD should really start playing that game too.
 
DLSS textures have a very washed-out/noisy look to them, especially with MSAA enabled. Not a very good fudge (the upscaling) at all.
DLSS is TAA-based, with many of the issues of TAA. DLSS Quality is sharp and detailed even on a 2060; Performance mode is another story.
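For anyone unfamiliar with what "TAA-based" means here: both TAA and DLSS-style upscalers lean on temporal accumulation, blending each new jittered frame into a history buffer. A toy sketch of just that blend step, assuming a static camera so the motion-vector reprojection both techniques actually need can be left out (the alpha value is illustrative):

```python
import numpy as np

def temporal_accumulate(history, current, alpha=0.1):
    # One accumulation step: keep most of the history, mix in a
    # little of the current frame. Real TAA/DLSS first reproject
    # the history buffer using per-pixel motion vectors.
    return (1.0 - alpha) * history + alpha * current

# Toy 4x4 "frames": noisy samples of a constant 0.5 signal converge
# over time - the same mechanism behind TAA's ghosting/smearing.
rng = np.random.default_rng(0)
history = np.zeros((4, 4))
for _ in range(60):
    current = 0.5 + 0.05 * rng.standard_normal((4, 4))
    history = temporal_accumulate(history, current)
print(history.round(3))  # settles near 0.5
```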
 

Nice PR statement. Sad that some companies innovate and others cry "proprietary". There are two companies in the RT race, AMD and Nvidia, and it's not Nvidia's fault that AMD is behind. Nvidia worked out that you needed something like DLSS and created it for their cards. No company is going to help a competitor get ahead. AMD is crying about open source because they have nothing to match DLSS; the whole talking point can be traced back to an AMD PR representative in a YouTube video.

It's MS that is creating super resolution, by working with AMD for the Xbox. DirectML allows any company to create a GPU and then run THEIR DLSS-like feature via DirectML; you have to create your own DLSS feature yourself for your GPU. DirectML does not come with DLSS or super resolution. It's just a DX12 Ultimate API that allows you to create one.

If AMD have not spent the development money to create something that matches DLSS 2.1, they should face the consequences. They have watched Nvidia working on DLSS for years, and MS wants a DLSS feature for the Xbox. How could AMD not get the message? AMD have no tensor cores or any other way to speed super resolution up over the slower compute method, and that's because a DLSS-like feature was never the plan on AMD hardware.
 

DLSS is non-proprietary to the extent that it uses DirectX (DirectML)... Nvidia worked with MS when implementing it. Nvidia named it "DLSS" for their cards, but when AMD finally add support it will use the same DX calls as Nvidia. The only difference is in how AMD need to implement it, because they have no Tensor cores. All the work Nvidia did with MS is there for AMD to use. The lack of Tensor cores means it is unlikely to perform as well on AMD's current hardware, but that's the fault of AMD for not including separate cores to offload the work from the graphics pipeline.

So Nvidia worked with MS to build something that AMD can also use, and you call Nvidia the bad guy because they're the ones who decided to innovate and increase performance... That boggles my mind :rolleyes:
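To make the DirectML point concrete: it's a generic DX12 compute path for running ML models, not an upscaler in itself. A minimal sketch of the kind of vendor-neutral route being described, using ONNX Runtime's DirectML execution provider (the model file and tensor names are hypothetical placeholders, and this is not DLSS, just the same class of plumbing):

```python
# Requires: pip install onnxruntime-directml numpy
import numpy as np
import onnxruntime as ort

# "upscaler.onnx" stands in for whatever super-resolution model a
# vendor might ship; DirectML runs it on any DX12-capable GPU, with
# a CPU fallback listed second.
session = ort.InferenceSession(
    "upscaler.onnx",
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

low_res = np.random.rand(1, 3, 540, 960).astype(np.float32)  # 960x540 frame
(high_res,) = session.run(None, {"input": low_res})
print(high_res.shape)  # e.g. (1, 3, 1080, 1920) for a 2x upscale
```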
 
Except that's nothing like the reality. DLSS requires game devs to submit ultra-high-quality artwork to Nvidia's own deep-learning network; that's where DLSS comes from, and it's proprietary. DirectML is part of DX12 and is not the same thing.
 