Ah... When people tell me they are getting 60 FPS at 4K, I'm assuming that's 4K, not upscaled from 1080p.
Yeah this is mega annoying, I've seen tech tubers do the same thing - talk about "4K" gaming when actually they're using DLSS from 1080p or w/e
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Right, so I am trying out Boosteroid, a cloud gaming service like GFN. It's kind of neat actually - it uses AMD Epyc CPU + 7900 XT GPU VMs, so it supports up to 4K 120fps. The streaming codec used is only H.264 though, where it should be AV1 IMO for better quality and lower bandwidth use at the same time.
Here is Silent Hill f running off my Steam account (it syncs to your library on all platforms) via Boosteroid vs the same section on my RTX 4090 (hardware Lumen enabled):
Biggest win is it's playing via a web browser lol.
You can feel some mouse latency, but with a controller it's a good experience really - one could easily enjoy gaming this way, and it's a smooth framerate experience too, especially given UE5, as long as you can hack FSR.
Whilst that is true, with AMD you will only get those on newer GPUs, not on older ones like you do with RTX, where Ray Reconstruction and all the latest DLSS upscaling tech are supported by all RTX cards.
Like I've said, it doesn't really matter anymore. Besides, having "native" 4K but with lower-quality effects, or effects missing altogether, doesn't seem to be a truly better option.
The top image is probably with software RT or no RT at all; the bottom is probably with hardware Lumen. Are the settings the same? The top one looks really bad - not just low resolution, but the colours look washed out too. It looks like a 720p image, maybe lower.
Metro Exodus Enhanced Edition ran at around 60fps on consoles. It wasn't path traced as Cyberpunk is, for instance, but the RTGI was great. I gave it a go in the starting section yesterday and I'm amazed at how well it looks and runs. It seems miles better than software Lumen in older versions of UE5 (like Stalker 2) in both performance and looks.

It doesn't matter if it technically works when in practice it is unplayable with all those features enabled. 2000-series RTX GPUs are just not viable options for PT games, and anything below a 3080 struggles. Even a 3080 is only viable at 1080p with balanced upscaling in PT games.
Considering how poor AMD's previous gen GPUs are at RT and especially PT, trying to support them would be pointless. FSR4, yes, totally agree.
It’s about getting PT and decent upscaling working on current and next gen GPUs but even more vital is getting it in next gen consoles. At that point RT will be well approaching the 9-10 year old point and finally at what many of us consider “mainstream”.
Leaving aside a natural bloom, I don't see what you're seeing. @Calin Banc viewed it fullscreen @1440p - your screenshot comparison is still terrible IMO for the reasons I stated. Yes, the gun/buttons stand out, but more details are lost to blurring than the few that stand out. There are much better examples showing how RT/PT beats native, like the @Dicehunter example.
@mrk can you teach @Calin Banc how to take some decent comparison screenshots plz?
@TNA you know who spammed the RTM button, but deleted posts have fallen off a cliff in the gfx sub since nexus got red carded from the GPU section.
I know... I can read, you know. Top is the AMD GPU and bottom is Nvidia.
Mostly.


I think streaming is going to be the future. There’s no point in developers optimising games to work on hardware that only the rich can afford and the latency issues should be nominal with a controller.
Personally I don’t feel like I’m missing anything streaming 4K content for films etc., probably because I’m not pixel peeping.
I’ve realised that I feel motion sick really quickly on first person games when using a keyboard and mouse… so I can’t make best use of ultra high fps with a mouse anyway d’oh!
Devs optimise for consoles first; anything else is usually low-hanging fruit or is done when it will bring extra revenue.
If not and AGI is reached
