
What do gamers actually think about Ray-Tracing?

Right, so I'm trying out Boosteroid, which is a cloud gaming service like GFN. It's kind of neat actually: it uses AMD Epyc CPU + 7900 XT GPU VMs, so it supports up to 4K 120fps. The streaming codec used is only H.264 though, when IMO it should be AV1 for better quality and lower bandwidth use at the same time.
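For a rough sense of what AV1 could save, here's a back-of-the-envelope sketch. The 0.07 bits-per-pixel-per-frame constant (a common rule of thumb) and the ~40% AV1 saving over H.264 are assumptions for illustration, not Boosteroid's actual encoder settings:

```python
# Rough streaming-bitrate estimate: width * height * fps * motion * 0.07,
# a common rule-of-thumb formula. AV1's ~40% saving over H.264 at similar
# quality is an assumed figure for illustration.

def estimate_bitrate_mbps(width, height, fps, codec="h264", motion_factor=1.0):
    bits_per_second = width * height * fps * motion_factor * 0.07
    if codec == "av1":
        bits_per_second *= 0.6  # assume AV1 needs ~40% fewer bits than H.264
    return bits_per_second / 1e6

h264 = estimate_bitrate_mbps(3840, 2160, 120)
av1 = estimate_bitrate_mbps(3840, 2160, 120, codec="av1")
print(f"4K120 estimate - H.264: ~{h264:.0f} Mbps, AV1: ~{av1:.0f} Mbps")
```

Under these assumptions a 4K 120fps stream needs roughly 70 Mbps with H.264 vs roughly 42 Mbps with AV1, which is why the codec choice matters at this resolution and framerate.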

Here is Silent Hill f running off my Steam account (it syncs your library across all platforms) via Boosteroid vs the same scene on my RTX 4090 (hardware Lumen enabled):

Biggest win is it's playing via a web browser lol.


[Screenshot 1: Silent Hill f streamed via Boosteroid]


[Screenshot 2: Silent Hill f on the local RTX 4090, hardware Lumen enabled]


You can feel some mouse latency, but with a controller it's a really good experience; one could easily enjoy gaming this way. It's a smooth-framerate experience too, especially given it's UE5, as long as you can hack FSR :p


Are the settings the same? The top one looks really bad: not just low resolution, but the colours look washed out too. It looks like a 720p image, maybe lower.
 
Whilst that is true, with AMD you will only get those on newer GPUs, not on older ones like you do with RTX, where Ray Reconstruction and all the latest DLSS upscaling tech are supported by all RTX cards.

It doesn't matter if it technically works when in practice it is unplayable with all those features enabled. 2000-series RTX GPUs are just not viable options for PT games, and anything below a 3080 struggles. Even a 3080 is only viable at 1080p with balanced upscaling in PT games.

Considering how poor AMD's previous gen are at RT, and especially PT, trying to support them would be pointless. FSR4, yes, totally agree.

It's about getting PT and decent upscaling working on current and next-gen GPUs, but even more vital is getting it into next-gen consoles. At that point RT will be approaching the 9-10 year mark and finally at what many of us consider "mainstream".
 
Yes, by 2030 I'm sure PT will be mainstream but by then we won't have to worry because there won't be consumer PC hardware anymore unless you sell a house to pay for it.
 
So true. Ironically we may be relying on AMD's lack of real traction in AI to be the saviour of gaming, because it is increasingly obvious Nvidia are focusing on the AI bubble.
 
Yeah this is mega annoying, I've seen tech tubers do the same thing - talk about "4K" gaming but actually they're using DLSS from 1080p or w/e
Like I've said, it doesn't really matter anymore. Besides, having "native" 4K but lower-quality effects, or effects missing altogether, doesn't seem like a truly better option.
The top image is probably with software RT or no RT at all; the bottom is probably with hardware Lumen.
Metro Exodus Enhanced Edition ran at around 60fps on consoles. It wasn't path traced as Cyberpunk is, for instance, but the RTGI was great. I gave it a go in the starting section yesterday and I'm amazed at how good it looks and runs. It seems miles better than software Lumen in older versions of UE5 (like Stalker 2) in both performance and looks.

Next-gen consoles will for sure run at least a version of RTGI by default. It makes life so much easier for game devs.
 
@Calin Banc viewed it FS @1440p. Your screenshot comparison is still terrible IMO for the reasons I stated: yes, the gun/buttons stand out, but more details are lost to blurring than the few that do. There are much better examples showing how RT/PT beats native, like the @Dicehunter example.

@mrk can you teach @Calin Banc how to take some decent comparison screenshots plz? :p

@TNA you know who spammed the RTM button, but deleted posts have fallen off a cliff in the gfx sub since Nexus got red-carded from the GPU section. :p
Leaving aside a natural bloom, I don't see what you're seeing.

However, look at the 2nd set of comparisons there, with the lady. I'm expecting someone to say she's too blue and dark with PT :))
 
I think streaming is going to be the future. There’s no point in developers optimising games to work on hardware that only the rich can afford and the latency issues should be nominal with a controller.

Personally I don’t feel like I’m missing anything streaming 4K content for films etc., probably because I’m not pixel peeping.

I’ve realised that I feel motion sick really quickly on first person games when using a keyboard and mouse… so I can’t make best use of ultra high fps with a mouse anyway d’oh!
 

Resistance is futile? :p

I shall resist!! :D

I do think eventually it will go in that direction. But I also feel that, with the help of AI, smaller teams will be enough to make very good indie games that will still run on older hardware.

I can then dip in and play the must-play stuff now and then on streaming if there is no other way. But I can't see myself having a recurring monthly sub like Netflix for gaming.
 

You're probably right, and I hate it......

No matter how good the image quality and latency may get (and I have no doubt that GeForce Now is already better than what @mrk posted), the fact remains you don't own it; you're depending on someone else's computer to provide your rendering power over an internet connection.

"You will own nothing, have no privacy and be happy" No **** off!

Besides that, if you rent an RTX 4080 for 4 hours per day over 5 years, that RTX 4080 will cost you nearly $2,000, and the price will only ever go up, especially as the service gains users. But that's OK: in a few years you can choose to pay either £3,500 for an equivalent GPU or £2,500 with in-game advertising......
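The arithmetic behind that figure, with the monthly subscription price as an assumption (roughly $33/month for a top tier, picked to land near the ~$2,000 claim; real pricing varies by service and region):

```python
# 5-year cost of renting top-tier cloud GPU time vs buying the card outright.
# The $33/month subscription price is an assumption for illustration.

monthly_sub = 33.0              # assumed top-tier sub price, USD
years = 5
total_cost = monthly_sub * 12 * years
gaming_hours = 4 * 365 * years  # 4 hours/day, as in the post
print(f"5-year cost: ${total_cost:.0f} (~${total_cost / gaming_hours:.2f} per gaming hour)")
```

At that assumed price the 5-year total comes to $1,980, in the ballpark of the figure above, and unlike a bought GPU it has no resale value at the end.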

"We interrupt you lining up that marvellous headshot to bring you this message from our sponsors: Betfred, Bet365 and 24/7 Babes XXX"

Mark my words.

 
Most games are already a service these days; it's only a matter of time until the hardware follows the same path. With all these monopolies in place across the chain, from ASML and TSMC to Nvidia etc., I just can't see when the situation will improve. If Intel and Samsung get their stuff together, that might help by giving the designers more options, but that's a big if.
 
Devs optimise for consoles first; anything else is usually low-hanging fruit or is done because it will bring extra revenue.

If streaming quality and latency aren't issues, then you can also keep your PC longer, since you're not limited by the catalogue of games, performance, mods, etc.

The AI bubble will probably pop next year or the year after. If not, and AGI is reached, we'll have bigger problems. Well, the bubble popping will be a huge thing too, but it is what it is.

After crypto and Covid I thought people were kinda used to this type of mess...
 
I'm guessing with the cloud gaming services you can't use mods?
Might as well get a console. Which I guess will have a total of 2GB RAM/VRAM next gen as all the RAM will be used for AI and cloud gaming services...
 
Note that on these cloud-based services you cannot mod the game files to enable additional features. The in-game settings are the same, but the local version on RTX has hardware Lumen enabled, which delivers much better dynamic contrast, plus Lumen AO and Lumen reflections. I just wanted to highlight the best of streaming games (and consoles lol) vs the same game on PC hardware!
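For anyone on a local install wondering how hardware Lumen usually gets forced on in UE5 games, it's typically done via engine console variables in the user-level Engine.ini. This is a sketch of the common approach, not a guarantee for Silent Hill f specifically; whether a given game honours these overrides is game-dependent, and the exact config path varies per title:

```ini
; User-level Engine.ini override (path varies per game, e.g.
; %LOCALAPPDATA%\<GameName>\Saved\Config\Windows\Engine.ini)
[SystemSettings]
r.Lumen.HardwareRayTracing=1
r.Lumen.Reflections.HardwareRayTracing=1
```

These are standard UE5 Lumen console variables; on a cloud service you have no access to this file, which is the point being made above.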
 