
The RT Related Games, Benchmarks, Software, Etc Thread.

Status
Not open for further replies.
It does indeed, so developers need to use the power on hand, but most are too lazy. PC has plenty of horsepower, but brute force is the PC way.
And that's exactly what's been happening since forever on PC. The Xbox was miles ahead of PC at the time, then came mGPU; I had many SLI/CrossFire setups to brute-force fps back in the day, and the devs have always got the blame, since way back then. Nothing's changed.

If Sony glued and fine tuned a 4060Ti to the PS5, it would probably kick a 4090's ass!:o
 
Think it's £20 a month now, Bill. Going up from 1st Nov.
As your 3090 has never so much as farted in the VRAM department, you can still play ultra textures on anything, so I suppose @Nexus18 gets a pass on £20 a month for a blockbuster title streamed, unless, just like other forum users, you'd just disable RT?
 
So here's the funny thing

Games are all built on PCs, and the console versions get tested on devkits, so you have a PS5 devkit in the office, you run your code on it, and then you tweak to get performance to an OK level.

But the PC version of the game is going to run pretty much how it runs on the developer's PC, and here is where the problems start: game developers are all using gaming laptops or desktops with RTX 4090s in them, and therefore games are optimized for that hardware. That's why games tend to run pretty well on a 4090: they were built on 4090s.

So while most gamers spend hours worrying about settings, a 4090 owner just turns the game on, selects the Ultra graphics preset and watches the FPS go brrrr. I can think of at least half a dozen games I've played this year where at release there were major complaints about poor performance, but I just opened the game, set ultra settings and got great performance, with none of the issues mentioned in the media.

I know that's not very helpful, but that's the reality: developers are optimizing games for the hardware they developed the game on, which is almost always the best money can buy. And that's not new; that's how PC games have always worked.
 
And some may wonder: if games have always been optimized for the best hardware, then why are there so many more complaints of poor performance these days?

The answer is simple: the difference between the top and bottom GPU has never been bigger. The performance gap between a 60-series and a 90-series card has never been wider than it is right now. That's why gamers with average GPUs feel like performance is getting worse, while gamers with top GPUs, like myself, feel like nothing has changed.
 
And some may wonder: if games have always been optimized for the best hardware, then why are there so many more complaints of poor performance these days?

The answer is simple: the difference between the top and bottom GPU has never been bigger. The performance gap between a 60-series and a 90-series card has never been wider than it is right now. That's why gamers with average GPUs feel like performance is getting worse, while gamers with top GPUs, like myself, feel like nothing has changed.
Fair enough. As I stated above, brute force is the way to go on PC. I agree with your point that top-to-bottom is widening at an alarming rate; it's looking like a good few years off for entry/mainstream RT, as NV/AMD's greed has no ceiling. :(
 
So here's the funny thing

Games are all built on PCs, and the console versions get tested on devkits, so you have a PS5 devkit in the office, you run your code on it, and then you tweak to get performance to an OK level.

But the PC version of the game is going to run pretty much how it runs on the developer's PC, and here is where the problems start: game developers are all using gaming laptops or desktops with RTX 4090s in them, and therefore games are optimized for that hardware. That's why games tend to run pretty well on a 4090: they were built on 4090s.

So while most gamers spend hours worrying about settings, a 4090 owner just turns the game on, selects the Ultra graphics preset and watches the FPS go brrrr. I can think of at least half a dozen games I've played this year where at release there were major complaints about poor performance, but I just opened the game, set ultra settings and got great performance, with none of the issues mentioned in the media.

I know that's not very helpful, but that's the reality: developers are optimizing games for the hardware they developed the game on, which is almost always the best money can buy. And that's not new; that's how PC games have always worked.

LOL, if you think devs are all on 4090s you really haven't a clue. I have seen devs using older-than-20-series laptops/desktops to develop for PS4/5 and Xbox.

Check this: https://www.techpowerup.com/review/lords-of-the-fallen-performance-benchmark/5.html

At 4K with no RT the 4090 can't even hit 60 fps (54.6 fps), and with RT on it's 45.2 fps... The game is not going brrrr, more like the PC is going yikes, and it doesn't even look that good. Then you have the racing game Motorsport 2023 hammering GPUs too, with more to come.

The 4090 is already dead for 60 fps at 4K, which is exactly what the card was highlighted to do. People are buying high-refresh monitors to go with it and can only get under 60 fps now; what is the point of 120 Hz+ monitors, and now even 500 Hz+, when an almost £1,600-£2,000 graphics card can't do 60 fps? It's all got silly, and all the added bloat in these games is just creating a never-ending GPU upgrade cycle: for the first 6-12 months cards work well, and after that we are back at 60 fps...

It's like the so-called 120 fps consoles that are now at 30 fps in quality mode and 60 fps in performance mode (they made people go out and upgrade their 4K TVs to 120 Hz panels for no good reason, as their previous 60 Hz TVs would have been fine)... yet they were sold as 8K consoles, it's even all over the PS5's box. Yes, 8K at 5 fps... That's why they need a Pro model for the PS5 and Xbox, and if they don't have one they'd better bring out the next generation earlier (PS6 and a new Xbox), as all new AAA games will end up at 30 fps in performance mode too.

(Digital Foundry said we don't need a PS5 Pro, and that we don't need more storage on the PS5, because Richard knew it was only going to be 1 TB, up from the 825 GB built into the new models coming out. It should have had 2 TB of NVMe added and another NVMe slot, without the storage soldered to the mainboard, which will in time fail, as all solid-state storage does.) Storage is so cheap now that Sony could have added 2 TB, and the cheek of it: the new model is cost-cut to make, and they raised the price too. Ooh look, we gave you 200 GB to make up for it... which is still soldered to the mainboard and will in time kill the unit, given the never-ending huge updates to games and firmware. Planned obsolescence right there.


Also, Intel did an Nvidia the other day with their CPUs: they changed the 3 to a 4 and upped the price for the same thing. 13900K to 14900K, the same chip with a slight OC and more power use, for a 1% difference, just to keep the price high. Nvidia did the same with the mid-range GPUs, and some 40-series cards were even worse than their 30-series counterparts, again at higher prices for hardly any difference, or worse performance. Seems to be the new game now: change some numbers in the range and hope the customer doesn't realise. They're all showing their true colours now, even Sony, and I will not even get into Microsoft's new games on Windows, the coming Windows 12, and all the AI bloat and data collection they are now doing on everyone's PC and on any MS online services you use. Like I said before, the industry has hit an all-time low and seems to be run by greed now, with zero respect for their customers' privacy or data.
 
^^ learn to use spacebar

As for the claim that game developers would be using old entry-level laptops: something like a mobile 2060 is under the required spec for developing on Unreal Engine 5, so either you're talking about some tiny studio working on old tech, or that was years ago and you're just looking for a fight. I think we're all talking about AAA boundary-pushing games, as that's where the performance complaints come from; no one is complaining about the performance of Counter-Strike 2.
 
Consoles aren't quite as well "optimised" as people are making out here....

- they are using adaptive resolution and sometimes even drop to a silly res like 480p (IIRC in Forspoken); yes, PC gamers are using upscaling and technically running at a lower res too, but the console is using a much lower base res, so it's not quite all smooth sailing for any platform in the "native" res department
- they cut back on RT effects big time; Spiderman 1 shows this well with how much the devs could dial up the effects for PC
- they cut back on all graphical effects, as per DF comparisons; most of the time they are using a mix of medium, high and low, and at times settings even lower than low

And all whilst more often targeting 30 fps, or 60 fps if you're lucky.

The biggest con right now with PC is the lack of a proper DirectStorage implementation; it is an afterthought, and it is holding PC back compared with what the consoles are capable of.
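The adaptive-resolution behaviour described above can be sketched as a toy feedback loop (all numbers are hypothetical, not any real engine's code): each frame, the internal render scale is nudged up or down so the frame time tracks a 30 fps budget, with a floor that corresponds to the 480p-class resolutions mentioned.

```python
# Toy dynamic-resolution controller (hypothetical numbers, not any real engine's code).
# Each frame we nudge the internal render scale so frame time tracks a 30 fps budget.

TARGET_MS = 1000.0 / 30.0          # 33.3 ms frame budget for a 30 fps target
MIN_SCALE, MAX_SCALE = 0.25, 1.0   # floor roughly matching the 480p-class drops above

def next_scale(scale: float, frame_ms: float) -> float:
    """Proportional controller: shrink the render scale when over budget, grow it when under."""
    error = (TARGET_MS - frame_ms) / TARGET_MS  # negative when the frame was too slow
    scale += 0.5 * error * scale                # gentle proportional step
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [45.0, 40.0, 36.0, 33.0, 30.0, 28.0]:  # a heavy scene easing off
    scale = next_scale(scale, frame_ms)
    print(f"frame {frame_ms:5.1f} ms -> render scale {scale:.2f}")
```

A real engine would smooth the measurement over several frames and quantise the scale to supported resolutions, but the shape of the loop is the same: miss the budget, drop the internal resolution; beat it, climb back towards native.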
 
And it could be argued the hardware is ready for mainstream RT, as proven by the likes of Metro EE and now Spiderman 2 on the PS5 (no option to disable RT), but devs aren't willing to make the full switch over yet.

- they are using adaptive resolution and sometimes even drop to a silly res like 480p (IIRC in Forspoken); yes, PC gamers are using upscaling and technically running at a lower res too, but the console is using a much lower base res, so it's not quite all smooth sailing for any platform in the "native" res department
- they cut back on RT effects big time; Spiderman 1 shows this well with how much the devs could dial up the effects for PC
- they cut back on all graphical effects, as per DF comparisons; most of the time they are using a mix of medium, high and low, and at times settings even lower than low
So RT isn't yet ready for mainstream. Metro EE and Spiderman 2 therefore work, but only because they're massively cut back. Native resolution is not all smooth sailing, but the boat at least floats.

Couple of generations time and we may be there, but we'll have moved on to the next next-gen technique by then.
 
So RT isn't yet ready for mainstream. Metro EE and Spiderman 2 therefore work, but only because they're massively cut back. Native resolution is not all smooth sailing, but the boat at least floats.

Couple of generations time and we may be there, but we'll have moved on to the next next-gen technique by then.

Again, see Metro EE and Spiderman 2..... :cry: Metro EE is definitely not heavily cut back; outside of the path-tracing titles, it is by far and away in the lead for RT usage. 4A Games put it well when they said there is a lot of untapped power in the consoles for RT, but again, it comes down to the devs to get the best from said hardware.

RT is just simply a graphical feature and shouldn't really be viewed as a separate thing anymore, IMO, the same way we don't really view things like AO, tessellation and SSR as separate graphical settings, especially when we now have games with no RT that run even worse than games with RT......
 
And I doubt we will have moved on to a "new" technique either; only the effects will be dialled up as hardware improves and/or more efficient methods are found, much like what happened with raster.
 
Idk how RT isn't ready. I must have been imagining things when I played CP2077 path-traced at 3440x1440 at 100+ fps. Seemed pretty playable to me.

Soon Alan Wake 2 will also be path-traced, same res, probably same fps.

Seems normal? You pay $500 and either get a PS5/Xbox that doesn't even do path tracing and barely does RT while upscaling from 720p to 4K in UE5 games and in most games apart from first-party titles, or a PC GPU that does the same while at least offering more options (better upscaling via DLSS, FG); or you buy top end and get to enjoy next-next-gen now, instead of limited RT-can't-even-notice-the-effects-but-at-least-it-works-on-AMD.

Also, native is dead. It has been for a while, and honestly? It doesn't matter at all. AI upscaling is better because it allows the part of graphics that actually matters to be pushed forward. You could have a real-life photo in 1080p vs an in-game non-PT shot in 16K; which would you rather see in a game? You can accept it, or go on doing the 'old-man-yelling-at-cloud' impression.
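A rough illustration of why upscaling buys so much (pure pixel-count arithmetic; it ignores the upscaler's own overhead, which in practice eats into the savings): rendering 4K from a 1440p internal resolution shades well under half the pixels, and the 720p-to-4K case mentioned above shades barely a ninth of them.

```python
# Pixel-count arithmetic for upscaling (ignores the upscaler's own cost).
def pixels(w: int, h: int) -> int:
    return w * h

native_4k = pixels(3840, 2160)       # 8,294,400 pixels shaded per native frame
internal_1440p = pixels(2560, 1440)  # 3,686,400 pixels at a 1440p internal res
internal_720p = pixels(1280, 720)    # 921,600 pixels, the console floor mentioned above

print(f"1440p->4K shades {internal_1440p / native_4k:.0%} of native pixels")  # 44%
print(f"720p->4K shades {internal_720p / native_4k:.0%} of native pixels")    # 11%
```

Since most per-pixel shading cost scales roughly with pixel count, that 44% / 11% ratio is the headroom that gets spent on the expensive effects (RT, PT) instead of raw resolution.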
 
@tommybhoy, if you know you know. It's still doing a great job, thanks for asking.

At 4K with no RT the 4090 can't even hit 60 fps (54.6 fps), and with RT on it's 45.2 fps... The game is not going brrrr, more like the PC is going yikes... It's all got silly, and all the added bloat in these games is just creating a never-ending GPU upgrade cycle: for the first 6-12 months cards work well, and after that we are back at 60 fps...

Depends on your resolution and games, but you have to complement it with a CPU that's up to the job too. There's only so much you can blame on the game devs, but if people keep demanding the rays on a finite timespan, it's never really a good recipe.

So RT isn't yet ready for mainstream.
...
Couple of generations time and we may be there, but we'll have moved on to the next next-gen technique by then.

Seems like when we were posting this back in 2021. :)
 
Also, native is dead. It has been for a while, and honestly? It doesn't matter at all. AI upscaling is better because it allows the part of graphics that actually matters to be pushed forward. You could have a real-life photo in 1080p vs an in-game non-PT shot in 16K; which would you rather see in a game? You can accept it, or go on doing the 'old-man-yelling-at-cloud' impression.

I so loved Bryan's answer to the anti-fake crowd: "well, it could be argued raster is a bag of fakeness and more fake than DLSS+FG+PT+RR" :cry: This is the image in my head of them lot:

H5i4BZn.gif


:D

@tommybhoy, if you know you know. It's still doing a great job, thanks for asking.

Define "great job"? Where a 3080 is generally struggling with "ultra" settings even using DLSS and so on, a 3090 is struggling to the same extent, unless you consider <60 fps to be playable.... The only GPUs really doing well nowadays if you want "ultra" settings are the 4090 and 4080, and arguably a 7900 XT(X) in certain titles.

Idk how RT isn't ready. I must have been imagining things when I played CP2077 path-traced at 3440x1440 at 100+ fps. Seemed pretty playable to me.

Soon Alan Wake 2 will also be path-traced, same res, probably same fps.

AW2 will be an interesting one, since it is very closed-off/linear and not a huge, dense open world like CP2077, so I would expect it to perform better than CP, but alas, it more than likely won't.
 
I so loved Bryan's answer to the anti-fake crowd: "well, it could be argued raster is a bag of fakeness and more fake than DLSS+FG+PT+RR" :cry: This is the image in my head of them lot:

H5i4BZn.gif


:D



Define "great job"? Where a 3080 is generally struggling with "ultra" settings even using DLSS and so on, a 3090 is struggling to the same extent, unless you consider <60 fps to be playable.... The only GPUs really doing well nowadays if you want "ultra" settings are the 4090 and 4080, and arguably a 7900 XT(X) in certain titles.



AW2 will be an interesting one, since it is very closed-off/linear and not a huge, dense open world like CP2077, so I would expect it to perform better than CP, but alas, it more than likely won't.
Doubt it will perform better, tbh. CP is open world, yeah, whereas Alan is linear, but the amount of detail crammed into any one section of AW2 will be much higher from what we've seen so far, so it's more computationally expensive. Also consider it will have lots of small moving objects with all kinds of shadows/lights, so the denoising will be more expensive, especially considering that ray reconstruction's main weak point in CP2077 is exactly this (the small papers flying through the air and the bottles on the ground ghosting).

From NV's promotional video it was at 100 fps with DLSS and FG at 4K, so pretty much in line with CP2077 with path tracing.

Just finished Alan Wake Remastered again as a refresher, and man, even back in 2012 they had crazy stuff going on technically. Even more hyped for this; it's gonna be a technical marvel for sure.
 
Well, in Cyberpunk you can drive a tank over a puddle and it still won't move at all. So mirror is an accurate description. An impenetrable mirror, but a mirror still. :)

I'm not sure about the cars bit, but there is definitely water physics/interaction when walking and shooting in water, so I wouldn't say this was a limitation on the RT front (before it was added, was this possible with SSR?), but more that the devs couldn't be bothered and/or didn't have time to add those details.

Doubt it will perform better, tbh. CP is open world, yeah, whereas Alan is linear, but the amount of detail crammed into any one section of AW2 will be much higher from what we've seen so far, so it's more computationally expensive. Also consider it will have lots of small moving objects with all kinds of shadows/lights, so the denoising will be more expensive, especially considering that ray reconstruction's main weak point in CP2077 is exactly this (the small papers flying through the air and the bottles on the ground ghosting).

From NV's promotional video it was at 100 fps with DLSS and FG at 4K, so pretty much in line with CP2077 with path tracing.

Just finished Alan Wake Remastered again as a refresher, and man, even back in 2012 they had crazy stuff going on technically. Even more hyped for this; it's gonna be a technical marvel for sure.

Yeah, true that: not as open, but more detail.
 
Consoles aren't quite as well "optimised" as people are making out here....

- they are using adaptive resolution and sometimes even drop to a silly res like 480p (IIRC in Forspoken); yes, PC gamers are using upscaling and technically running at a lower res too, but the console is using a much lower base res, so it's not quite all smooth sailing for any platform in the "native" res department
- they cut back on RT effects big time; Spiderman 1 shows this well with how much the devs could dial up the effects for PC
- they cut back on all graphical effects, as per DF comparisons; most of the time they are using a mix of medium, high and low, and at times settings even lower than low

And all whilst more often targeting 30 fps, or 60 fps if you're lucky.

The biggest con right now with PC is the lack of a proper DirectStorage implementation; it is an afterthought, and it is holding PC back compared with what the consoles are capable of.
I don't know if you are correct or not, but I have completed so many games on console over many years. If they are doing as you say, it's very well done, as sitting back on my couch, to my eyes I have not noticed. Most games I play on PS5/Series X are SP RPGs, and usually they have a 60 fps mode as well. I don't sit and analyse every single frame; I play the game, and most have looked great. I think gamers like you forget the art of gaming is to enjoy the experience. Go play GoW Ragnarok and stop looking for issues with a game's visuals. I play on all systems and can tell you most PS5 exclusives look better than most PC games while rarely ever crashing. Even load times on both top consoles are top-notch.
 