
The RT Related Games, Benchmarks, Software, Etc Thread.

Minecraft, chap. Pretty popular!

Speaking of minecraft, can't see a difference tbh:


:cry:


Glad it works nicely.
The second one is performance, I can tell by Geralt's armor. But it still looks great, and a stable 60 is worth it for me. It helps that I am playing on a 1080p plasma, so the image being upscaled to 1440p and then downscaled looks great.

And yeah sharpening on high is too noisy, low is ideal.

Spot on. DLSS performance has come a long way over the last couple of months; it never used to be usable except at 4K, and even then only when sitting further back.
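For context, here's a rough sketch of the internal render resolutions behind those modes (using the per-axis scale factors commonly documented for DLSS 2.x; individual games can expose slightly different ratios):

```python
# Rough sketch: approximate internal render resolution per DLSS mode.
# Scale factors are the commonly documented per-axis values for DLSS 2.x;
# individual games may differ slightly.
DLSS_SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(f"{mode:<12} 1440p -> {internal_res(2560, 1440, mode)}   "
          f"4K -> {internal_res(3840, 2160, mode)}")
```

Performance at 1440p reconstructs from roughly 1280x720, whereas at 4K it starts from around 1920x1080, which is why it used to only hold up at 4K output.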





 
When PS5 games were shipping 30fps 'quality' modes, people were like 'omg 30fps lol peasants'. Now we are spending a grand, cranking up some settings on ancient games and rocking 30fps like kings.

Who's playing at 30fps? Well, let me rephrase: who with Nvidia is playing at 30fps? 30fps would be generous for AMD; it's closer to 3fps than 30fps.
 
Started playing Darktide again since it appears to have had quite a few patches, and performance is much better and more consistent for me now. Really impressed with dlss performance now:


Sharpness setting:


It's a rather nice-looking game world tbf. As Alex said, screenshots really don't do it justice though; with the fog and lighting in motion, it looks very atmospheric (these were with dlss balanced).

[Darktide screenshots]
 
Really like ComputerBase's approach to their Witcher 3 benchmark, where they let people upload their own results. It's good as you get a variety of hardware (and not always just the best hardware all round being tested):


[ComputerBase community benchmark charts]
 
As I suspected...
There is. [I analysed a frame using NVIDIA's Nsight Graphics GPU profiling tool, and the actual raytracing work (the light purple bars) only accounts for maybe 15-20% of the total frame time at native 1440p on a 3090.](https://i.imgur.com/RGadSCR.png) 35-40% is taken up by the raytracing scene building (the orange bars) which doesn't scale with screen resolution, and 95+% of that work is known to be inefficient in design (to the point where I believe the typical suggestion is to do it as infrequently as possible via instancing).
The rest is (seemingly) unrelated graphics or compute work (though I didn't delve too far into that, since I saw the 35-40% spent on scene building and was just flabbergasted). Raytracing isn't the only problem either: [I saw two passes that dwarfed everything else in CPU time](https://i.imgur.com/RZZfdM0.png) (the cursor showing 133.897ms is the same cursor showing 1.96134ms in the GPU-time screenshot I linked above, just so you can see where all the other work ends up in terms of CPU time). But something is *definitely* wrong with their raytracing, whether it be a bug, an oversight or bad design.
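To put rough numbers on why that resolution-independent chunk matters, here's a purely illustrative calculation reusing the 35-40% share from the capture above (nothing else in it is measured):

```python
# Purely illustrative: the frame-rate ceiling implied by a chunk of the frame
# that does not scale with resolution. The 35-40% share is the rough figure
# from the capture above; everything else here is just arithmetic.
def max_speedup(resolution_independent_share: float) -> float:
    """Upper bound on the gain if all resolution-dependent work became free
    (e.g. via very aggressive upscaling)."""
    return 1.0 / resolution_independent_share

for share in (0.35, 0.40):
    print(f"{share:.0%} fixed per frame -> at most {max_speedup(share):.2f}x faster")
```

So while that scene-building cost stays fixed, even an aggressive drop in internal resolution can't buy more than roughly 2.5-3x.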
 
Great post @Calin Banc :D :cool:

That is one of my biggest annoyances now with raster: the light leaking/bleeding through solid objects. You can see it in some of my Witcher 3 screenshots too. That, and reflections disappearing/distorting just because you move the camera angle slightly, is annoying as well :mad:

Well... the devs also have to actually implement everything. For instance, in Cyberpunk and TW3, from what I can see, a fire pit or a car on fire (in Cyberpunk, of course) does not cast a shadow.

Plenty of games, RT or rasterization, it doesn't really matter, apparently only use a handful of shadow-casting lights, even though DX12/Vulkan/Mantle should have solved some of the performance issues of the older APIs. So, without devs actively putting in that work... not all problems will be solved.




Also, speaking of newer techniques: stuff such as HFTS (which is done in rasterization), at least in Watch Dogs 2 (in The Division the day cycle changes too fast to properly test it in some specific scenes), is a bigger performance hog than RT shadows in Cyberpunk (at least on my RTX 2080). On the other hand, Doom 3 had working mirrors, while in Cyberpunk you have to activate them manually + your character doesn't get reflections all the time :))



As I suspected...


Well, if the trees are moving, doesn't that mean you have to recalculate their state? Same if you have particles or any moving object for which you want dynamic shadows and reflections?
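For what it's worth, the usual way engines keep that per-frame cost in check is to handle rigid movers purely through instance transforms in the top-level structure and only refit the per-mesh structures that actually deform. A minimal sketch of that scheduling logic, with hypothetical stand-in functions rather than any specific engine's or API's calls:

```python
# Minimal sketch of per-frame acceleration-structure scheduling. build_blas,
# refit_blas and build_tlas are hypothetical stand-ins for whatever the real
# API (DXR, Vulkan RT, ...) provides; stubs are included so the scheduling
# logic itself runs.
from dataclasses import dataclass

@dataclass
class MeshInstance:
    mesh_id: int            # instances sharing geometry share one BLAS
    transform: tuple        # per-instance transform, cheap to update
    deformed: bool = False  # e.g. skinned or wind-animated geometry

def build_blas(mesh_id):    # expensive per-mesh BVH build (stub)
    return {"mesh": mesh_id}

def refit_blas(blas):       # cheaper in-place update of an existing BVH (stub)
    blas["refitted"] = True

def build_tlas(entries):    # small per-frame instance-list build (stub)
    return {"instances": len(entries)}

blas_cache: dict[int, dict] = {}

def update_scene(instances):
    for inst in instances:
        if inst.mesh_id not in blas_cache:
            # Full build ideally happens once per unique mesh, not every frame.
            blas_cache[inst.mesh_id] = build_blas(inst.mesh_id)
        elif inst.deformed:
            # Deforming geometry gets a refit rather than a full rebuild.
            refit_blas(blas_cache[inst.mesh_id])
    # The TLAS (instance transforms) is rebuilt each frame, but it is tiny
    # compared with rebuilding every BLAS from scratch.
    return build_tlas([(blas_cache[i.mesh_id], i.transform) for i in instances])

# Two swaying trees sharing one mesh: a single BLAS build + refit, with the
# different positions handled purely by TLAS instance transforms.
trees = [MeshInstance(1, (0, 0, 0), deformed=True), MeshInstance(1, (5, 0, 0), deformed=True)]
print(update_scene(trees), len(blas_cache))
```

Particles and skinned meshes still cost something every frame, but it's the refit/instance-update path rather than a full rebuild.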
 
Not RT related but decent comparison of FSR and dlss in witcher 3:


Interesting bit on the frame generation:

In other DLSS Frame Generation supported games we've tested, all of them had issues with the in-game on-screen UI, which had a very jittery look—the DLSS Frame Generation implementation in The Witcher 3: Wild Hunt does not have this issue.
With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect more than doubled performance at 4K and 1440p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with the input latency.

Pretty sure techpowerup complained about latency when dlss 3 first came out?
 
Seems FG/dlss 3 latency has been improved/reduced somewhat:
Certainly not for that example - input latency is (slightly) worse (than without frame generation) at around 50fps-ish/59ms whereas the input latency for an *actual* 115fps in this example would be more than halved to 28ms or less.
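Rough arithmetic behind those numbers, as a sketch only (the 59ms and 28ms figures are the ones above; a real latency chain also includes CPU, driver queue and display time):

```python
# Sketch of why frame generation's "115 fps" doesn't feel like a real 115 fps.
# The 59 ms / 28 ms figures are the ones quoted above; a real latency chain
# adds CPU, driver-queue and display time on top, so treat this as illustrative.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

rendered_fps = 115 / 2                      # FG interpolates: ~57 real frames/s
print(frame_time_ms(rendered_fps))          # ~17.4 ms per rendered frame
print(frame_time_ms(115))                   # ~8.7 ms per frame at a true 115 fps

# Input is only sampled on rendered frames, and FG holds a frame back to
# interpolate, so latency tracks the ~57 fps render rate (~59 ms measured)
# rather than the ~28 ms you'd expect from an actual 115 fps.
```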
 
Certainly not for that example - input latency is (slightly) worse at around 50fps-ish/59ms whereas the input latency for an *actual* 115fps in this example would be halved to 28ms.

It is a noticeable improvement compared to the Spider-Man and Cyberpunk figures we saw at launch though, iirc; in those games there was a 10+ms increase with FG on, although obviously it's not like for like and they're different game engines (which, as DF showed, also have a big impact on default latency). Of course it will never match true native 100+ fps latency, but as it stands there is no way around the severe CPU bottleneck issue other than FG, so it's a case of pick your poison. Well, not even really that imo...

- stick with <60 fps at 56ms of latency

Or:

- get the smoother motion/fluidity of 100+ fps with pretty much the same latency; I can't see any real downsides to this option

Personally, I know which I would rather have when gaming on a 175Hz screen.
 