The RT Related Games, Benchmarks, Software, Etc Thread.

This is going to make me want dlss 3/40xx badly :(


Think this could be the worst-performing title we've had yet. Don't know about anyone else, but it always looks stuttery/low fps to me....

The first game was very performance intensive. It released just as I got my 2080 Ti, and it ran at 4K around 50fps, and that was without ray tracing.
 
When undervolting a 30-series card in Afterburner with a set voltage against a set boost clock, does a factory overclock on the boost get added on top, or is it taken into account by whatever you set?

Say you set 1875MHz at X voltage, capped to go no higher: have you actually set X voltage at 1925MHz if the factory OC is +50MHz on the boost?

Personally, no. On its own my card will boost to 2040MHz; I set it to 1980MHz at 0.9V and it stays there. I also have a second profile at 0.8V/1600MHz for when I'm playing non-demanding games, and it also just stays there.
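A quick conceptual sketch of what's being asked, in case it helps. This is just a model of the question, not Afterburner's actual internals, and the +50MHz offset and clock numbers are the hypothetical ones from the posts above; the reply above suggests the factory OC is already baked into the default curve you edit, so the flattened point is the cap you actually get:

```python
# Conceptual model of the undervolt question only -- not Afterburner's
# actual internals. Numbers are the hypothetical ones from the posts above.

FACTORY_OC_OFFSET = 50  # MHz, hypothetical factory boost offset

def effective_clock(curve_point_mhz: int, offset_baked_into_curve: bool) -> int:
    """Clock cap you end up with after flattening the V/F curve at a point."""
    if offset_baked_into_curve:
        # The factory OC is already part of the default curve being edited,
        # so the flattened point is the cap you get (matches the reply above).
        return curve_point_mhz
    # If the offset were applied on top afterwards, it would stack.
    return curve_point_mhz + FACTORY_OC_OFFSET

print(effective_clock(1875, offset_baked_into_curve=True))   # 1875
print(effective_clock(1875, offset_baked_into_curve=False))  # 1925
```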
 
I find it curious that nVidia didn't provide benchmarks for any 'showcase' RT games for the 40xx series other than Cyberpunk (which was tested with a build that specifically targets the 40xx series RT features) - where was Metro: Exodus Enhanced? Dying Light 2? Instead, the only game with RT they showed was Resident Evil Village which had a nice uplift (1.75x?) but was far from 2-4x performance.
 
LMAO, the video was uploaded at 30 FPS and people are judging the game's FPS by the video.
Whatever next!

True that. However, it still looks noticeably more stuttery than other 30fps gameplay videos, especially when you factor in the way they have captured/shot the trailers and gameplay scenes, i.e. either incredibly slow panning shots of the landscapes, slowed-down footage of intense scenes, a lot of cinematic footage, or fast action with a **** ton happening on screen to distract from the performance.

If this runs well on anything but the 40xx, I'll be amazed.

I find it curious that nVidia didn't provide benchmarks for any 'showcase' RT games for the 40xx series other than Cyberpunk (which was tested with a build that specifically targets the 40xx series RT features) - where was Metro: Exodus Enhanced? Dying Light 2? Instead, the only game with RT they showed was Resident Evil Village which had a nice uplift (1.75x?) but was far from 2-4x performance.

Agreed, it would have been nice to see more RT title comparisons.

Was the 2-4x not in reference to when using dlss 3?
 
Was the 2-4x not in reference to when using dlss 3?
Jensen never explicitly stated it, but yes, the 'up to' 4x performance increases are going to be DLSS 3. Also interesting to note on the benchmarks: all DLSS-capable games (2.x *and* 3.x) were run in 'Performance' mode, so expect the results to be less impressive if you prefer DLSS Quality or Balanced.

*Edit* Just thought of another one: Minecraft, one of the biggest games in the world, and its path-traced iteration is pretty darn heavy. I suspect that taking advantage of the new RT improvements requires some re-architecting, hence why the only 'real' game demoed was Cyberpunk (Portal was made with nVidia's RTX Remix and the RC game was nVidia's Omniverse, I think).
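For anyone wondering why 'Performance' mode flatters the numbers, here's a rough sketch of the internal render resolutions behind each DLSS mode at a 4K output. The per-axis scale factors are the commonly cited ones; individual games can deviate slightly:

```python
# Rough sketch: internal render resolution per DLSS mode at a 4K output.
# Per-axis scale factors are the commonly cited values and may vary per title.
SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

out_w, out_h = 3840, 2160
for mode, s in SCALE.items():
    w, h = round(out_w * s), round(out_h * s)
    pixel_share = (w * h) / (out_w * out_h)
    print(f"{mode:>17}: {w}x{h} internal ({pixel_share:.0%} of output pixels)")

# Performance mode renders roughly a quarter of the output pixels
# (about 1920x1080 -> 3840x2160), which is why it shows the biggest uplift.
```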
 
HDR and PC gaming don't mix - Here's why



Worth a watch and explains some of the issues with HDR on PCs.


Thought I'd stick it here as it's relevant to RT and the other things in this thread. Basically chasing better visuals.
 
HDR and PC gaming don't mix - Here's why



Worth a watch and explains some of the issues with HDR on PCs.


Thought I'd stick it here as it's relevant to RT and the other things in this thread. Basically chasing better visuals.

Decent video, and it kind of sums up my thoughts on when people say "HDR sucks on PC": you then ask them what monitor they are using and it turns out it's a **** one with some useless HDR400/600 certification and no dimming tech of any kind :cry: IMO, if you want to see what HDR can do, it has to be viewed on an OLED/self-emissive pixel display; even full-array local dimming pales compared to OLED for HDR.

When possible and with proper native HDR support, I'll always use HDR on either my AW QD-OLED or LG E7, as it completely changes the visuals of a game to the point of it even looking "next gen". I couldn't figure out why COD looked so bad one day, and it turned out HDR had deactivated; it was literally like a PS3 to PS5 difference.

Sadly, a lot of games' HDR is **** poor though. Some of the best implementations are Sony's ports, i.e. Spider-Man, Days Gone, God of War, HZD; in terms of cross-platform titles, the main ones would be the likes of Division 2, Mass Effect Andromeda, Assassin's Creed Odyssey, Guardians of the Galaxy and SWBF 2.
 
It's not necessarily the HDR quality (though I agree that is a big issue on weaker panels), but also how games register whether HDR is on or not. As an example, if I'm playing F1 22, I'll enable HDR on my monitor, start the game, and it will look normal. After a random period of time, BAM, HDR kicks in. I've not yet found out what triggers it, but you immediately see the difference in picture quality.
 
HDR and PC gaming don't mix - Here's why



Worth a watch and explains some of the issues with HDR on PCs.


Thought I'd stick it here as it's relevant to RT and the other things in this thread. Basically chasing better visuals.
Thanks, that informed me quite a bit. So what's HDR1400, the best? Are TVs still the go-to even with the new Alienware 34-inch and the Asus screen that is coming out? Nvm, googled it!
 
It's shilling time.



This tells me all I need to know about DLSS3...


[Image: Nl5ULzl.jpg]


Basically we will artificially neuter DLSS 2 performance and pretend DLSS 3 is amazing as we did with DLSS 2 before..

Just look at the numbers on that and then look at the numbers from games before they introduced DLSS 3. Also not liking the fake-frames rubbish; the clear side effect is the soap opera effect, but "we will slow videos down to half frame rate to show you how amazing it looks at half speed because of YouTube's 60fps limit"... Blah blah, really... OMG, do they really think we are that stupid, especially people my age who have been computer enthusiasts longer than Alex has been alive? Not falling for this one Nvidia, sorry; fake frames = fail to me. We did all this with TVs, and any TV enthusiast knows to turn those features off, but now Nvidia wants to turn them back on if you game on a TV and, worse, add that horrible effect to monitors too.
:rolleyes:
 
Basically we will artificially neuter DLSS 2 performance and pretend DLSS 3 is amazing as we did with DLSS 2 before..
Given that frame generation ostensibly doubles the frame rate of whatever the DLSS 2.x upscaler is producing, did you notice that Cyberpunk (without the frame-doubling) was only 25% faster than the 3090Ti..?

The Spider-Man DLSS numbers are really odd - I get a much better uplift than DF showed (could be my CPU but still..?)

I like DF but this still feels more like marketing than journalism - they should've just waited until the embargo lifted.
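Back-of-the-envelope version of that point, treating frame generation as roughly doubling whatever the upscaler is already delivering. It's a simplification, the baseline numbers below are purely illustrative rather than measured results, and generated frames don't help input latency:

```python
# Simplified model of the DLSS 3 marketing maths. Frame generation is
# treated as ~2x whatever the DLSS 2-style upscaler already produces.
# Baseline numbers are illustrative placeholders, not benchmarks.
fps_3090ti_dlss2 = 60.0                   # hypothetical last-gen baseline
fps_4090_dlss2 = fps_3090ti_dlss2 * 1.25  # ~25% faster without frame generation
fps_4090_dlss3 = fps_4090_dlss2 * 2.0     # frame generation roughly doubles it

print(f"Upscaler-only uplift:  {fps_4090_dlss2 / fps_3090ti_dlss2:.2f}x")  # 1.25x
print(f"With generated frames: {fps_4090_dlss3 / fps_3090ti_dlss2:.2f}x")  # 2.50x
```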
 
DLSS 3 looks like a successful new version tbh. Not quite so excited yet because I need to test it myself first, but regardless of how it turns out it was a necessary next step in the pursuit of greater motion clarity (like BlurBusters' dream of 1000hz). Clearly besides Doom Eternal no one's really coding games with really high framerates in mind so someone needed to step in and push it forward if we are to ever get there. Hopefully Nvidia pushes adoption hard because as cool as it might be it sucks that these new additions require dev integration again rather than Nvidia handling it through their sdk somehow. Most of the older dlss titles are never going to get it, I imagine, and that's a shame.
 
DLSS 3 looks like a successful new version tbh. Not quite so excited yet because I need to test it myself first, but regardless of how it turns out it was a necessary next step in the pursuit of greater motion clarity (like BlurBusters' dream of 1000hz). Clearly besides Doom Eternal no one's really coding games with really high framerates in mind so someone needed to step in and push it forward if we are to ever get there. Hopefully Nvidia pushes adoption hard because as cool as it might be it sucks that these new additions require dev integration again rather than Nvidia handling it through their sdk somehow. Most of the older dlss titles are never going to get it, I imagine, and that's a shame.

A good number of current/older games have already been confirmed to get dlss 3:


More than the number of games that have FSR 2, let alone FSR 2.1 :p
 
A good number of current/older games have already been confirmed to get dlss 3:


I'm always up for games getting DLSS, but The Witcher 3? You can run it at 4K max settings and get around 100FPS with a current-gen card... unless it's for the supposed "Next Gen" update coming soon?
 
Decent video, and it kind of sums up my thoughts on when people say "HDR sucks on PC": you then ask them what monitor they are using and it turns out it's a **** one with some useless HDR400/600 certification and no dimming tech of any kind :cry: IMO, if you want to see what HDR can do, it has to be viewed on an OLED/self-emissive pixel display; even full-array local dimming pales compared to OLED for HDR.

When possible and with proper native HDR support, I'll always use HDR on either my AW QD-OLED or LG E7, as it completely changes the visuals of a game to the point of it even looking "next gen". I couldn't figure out why COD looked so bad one day, and it turned out HDR had deactivated; it was literally like a PS3 to PS5 difference.

Sadly, a lot of games' HDR is **** poor though. Some of the best implementations are Sony's ports, i.e. Spider-Man, Days Gone, God of War, HZD; in terms of cross-platform titles, the main ones would be the likes of Division 2, Mass Effect Andromeda, Assassin's Creed Odyssey, Guardians of the Galaxy and SWBF 2.


I haven't been using HDR much on my QD-OLED. The biggest problem I have is that the monitor defaults to a max of 1000 nits brightness in HDR and you can't change it in the monitor settings. Do you know how bright 1000 nits is when the screen is in your face and you're in a dark room? It's freaking stupid. So why not just lower the brightness in the game settings menus? That would be great, except many HDR games have **** poor HDR settings options for this.
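For anyone curious how those nit figures relate to what the game is actually sending, here's a small sketch of the ST 2084 (PQ) transfer function HDR10 uses, with the constants from the spec. It shows how much of the signal range common peak-brightness levels occupy, which is also why HDR400/600 panels end up clipping or tone-mapping so much:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: maps absolute luminance in nits to the
# normalised 0..1 signal value HDR10 carries. Constants are from the spec.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (100, 400, 600, 1000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")
# 1000 nits already sits around 0.75 of the code range, so a panel that tops
# out at 400-600 nits is losing a big chunk of what the game is mastering for.
```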
 