The RT Related Games, Benchmarks, Software, Etc Thread.

Status
Not open for further replies.
My final play settings after much testing:

[Screenshot: final play settings]

Notes:
In game, turned off:
Screen-space reflections (we have RT instead)
Film Grain
DoF
Motion Blur
Chromatic aberration
Lens flare

NVCP settings:
DLDSR 1.78x
Smoothing at 90%
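For anyone wondering what DLDSR 1.78x actually renders at: the factor scales the total pixel count, so each axis scales by roughly the square root of 1.78. A quick sketch of the maths (assuming a 3440x1440 ultrawide panel, which is my assumption; NVCP's exact figures may be rounded slightly differently by the driver):

```python
import math

def dldsr_resolution(width, height, factor):
    """Approximate the DLDSR render resolution for a given total-pixel factor.

    DLDSR factors (1.78x, 2.25x) multiply total pixel count, so each axis
    is scaled by sqrt(factor). Nvidia rounds to driver-friendly values, so
    the exact numbers shown in NVCP can differ by a few pixels.
    """
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

# Roughly the internal resolution for DLDSR 1.78x on a 3440x1440 panel.
print(dldsr_resolution(3440, 1440, 1.78))
```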

There's a bug with the HDR where the measured nits level comes out higher than what's set in the settings. For my AW3424DW, using 800 nits eliminates the clipping and provides more detail in the highlights. Thus my HDR settings:
Max luminance = 800 nits
Mid point (gamma/EOTF) = 1.0
Paper white (HUD) = 100 nits
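To illustrate why matching the max-luminance setting to the panel's real peak kills the clipping: HDR10 signals encode absolute luminance with the ST 2084 (PQ) curve, and any value the game tone-maps above the display's actual peak has nowhere to go. A minimal sketch of the PQ encode (constants are from the SMPTE ST 2084 spec; how this particular game's slider interacts with it is my assumption):

```python
def pq_encode(nits):
    """ST 2084 (PQ) inverse EOTF: map absolute luminance (cd/m^2)
    to a 0..1 signal value. Constants per SMPTE ST 2084."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# If the game tone-maps to 1000 nits but the panel peaks around 800 nits,
# everything between pq_encode(800) and pq_encode(1000) lands in a signal
# range the display cannot reproduce and gets clipped to the panel's peak.
print(pq_encode(800), pq_encode(1000))
```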

Frame gen below a 60 fps base is definitely not recommended. 60-80 is greatly improved, and 80 fps onwards is just smooth.
 
I think you kind of missed the joke, as in, if nvidia weren't about, we would be stuck with amd and never evolving onto better things, whether we like it or not, nvidia are always first and then amd come along "oh hang on guys, we also have a solution, we just need to start working on it and it'll be ready in 2 or 3 years and then take another few years to get as good as the comp."
Nvidia’s technologies are more about increasing their own profits though. Raster performance was becoming less of an issue, so let's roll out RT to give people more reason to upgrade. DLSS lets them sell lower-tier GPUs as higher tier for more money, with frame gen a continuation of that, and now we have ray reconstruction to push even more RT effects to make cards like the 4090 seem weak, so if people want these effects they'll need to upgrade again next gen.
 
The ONLY reason we have raytracing at all is because they needed to give the tensor cores something to do, you talk like this was some grand elaborate scheme nvidia set in motion to evolve the gaming landscape. When in reality it's just a by-product of nvidia's AI hardon.
 
Raster was/is a problem for developers because of the amount of time it takes to get good results. Nowadays, especially in the development space, you simply can't let developers take years to get there; that's largely why ray tracing was brought in in the first place (and not just for gaming but for the media industry, where it has been used for yonks, although gaming is, to my knowledge, the first place it's real time). Of course, you could say it's purely to drive more money and force us to upgrade to the new shiny shiny, but can you name me a company that doesn't want this? If you don't want to be stuck in a cycle of upgrading, simply turn off the settings and/or make do with a lesser experience; no one is holding a gun to people's heads to force them to buy into this.

Don't think that AMD going open source is from the bottom of their heart either; it's simply because they have zero choice when they're last to market with a lesser experience/solution.

Also, there are plenty of games released recently that are raster only and they still run like ****, in some cases even worse than full path tracing... see Starfield for a prime example.

Another point is that raster has been around for yonks. As Bryan put it perfectly, people bang on about upscaling and frame gen being fake, well so is every single frame of raster effects lol... If anything, frame gen and DLSS combined with path tracing is probably more "real" than raster native, and that's not a joke either. Developers have had decades of learning how to "hack" around raster methods; give ray tracing the same time and it simply won't even be a raster vs ray tracing thing.

DLSS is running on the tensor cores too:


To say the "only" reason is just silly; you're ignoring everything that ray tracing can and will provide over dated methods, and this isn't just made-up crap about "oooh shiny puddles", as proven/backed up by people who work in the industry. Rome wasn't built in a day.

And well, the results speak for themselves tbh.

People really need to stop viewing ray tracing as a nvidia thing....
 
Oh, so we have one performance-tanking method running on Tensor cores, and also the performance-uplift method running on the same cores; put briefly, they're basically intertwined. One wouldn't exist without the other, or rather Nvidia could never have launched RTX without having some sort of solution to the performance tanking when using it.

And going by that article, they make it seem that Nvidia using the cores for DLSS is only a recent thing, as I don't recall it being mentioned before, even in extensive articles on how DLSS/ray tracing works.
 
A large part of the reason we have "tanking" performance in most ray tracing titles is that the RT has been tacked on, rather than the game being developed from the get-go to get the best from it. Again, see Metro EE for a perfect example, where it runs better than the raster-plus-tacked-on-RT approach; even consoles run it extremely well at 60 fps. CP 2077's path tracing actually runs very well in the grand scheme of things, considering it's a huge, dense open world with a lot going on compared to something like Portal.

And well, yes, of course we need some other way to get usable performance rather than relying purely on hardware to address how incredibly intensive ray tracing is; the hardware-only approach simply isn't there and won't be for a long time, and if you wanted it as of now, expect to pay even more than what a 4090 costs.

Or maybe it's just the first time someone has actually taken a proper look into it? Either way, the silicon is used, so it's not quite the "waste of sand" that some like to make out; we have no idea whether, if Nvidia were to offload it elsewhere, the results would be as good. It's the same way Google had a separate image-processing chip for the camera on their Pixel phones: it simply did a better job than running on the main CPU chipset.



In other news, someone has already got ray reconstruction working on normal ray tracing:

 
Tried the game at 720p with DLSS Quality - it runs above 30FPS. @KompuKare - looks like I was right! :cry:
Well, an RTX 3050 is a bit slower:
[Screenshot: RTX 3050 benchmark result]

I guess if I had that game, I would try it at sub 5 FPS just to see what all the fuss is about but it would be a slideshow.

Did have a look at a few screenshots. Especially with PT. The lighting did look better most of the time, and unlike the last time I looked the PT wasn't as dark in what I saw.
Aside from not being able to run it, my problem remains that I don't like the art direction of Cyberpunk 2077 in the first place. And I keep feeling that everything has fancy lights but the models seem so-so. Like low-polygon, lots of fancy lights.
 
GeForce Now Ultimate life :p

[Screenshots: GeForce Now Ultimate]
Just lovely this:

- zero cpu and gpu usage on my end
- better fps
- better latency even with frame gen and streaming
- better visuals

[Screenshot]
 
But how good is it really though?

In the meantime, fancy a bit of a laugh at some crazy numbers?


:D

And for some DLSS 3.5 + Frame Gen + Buttery smooth visuals, here's 2077 2.0:


The HDR option apparently takes a few days to process on YouTube, wtaf is that all about!
 