
What do gamers actually think about Ray-Tracing?

Is RT stuff restricted to DX12/Vulkan or is it doable in DX11 too?
Now I realise I'm not really up-to-date on games, but the few games I play that have DX12 or Vulkan renderer options seem not to recommend them. They seem to be experimental or for Linux. I'm just wondering if the uptake of things like RT is affected by the uptake of DX12/Vulkan?
 
Ray tracing via the DirectX Raytracing (DXR) API doesn’t work with DirectX 11; DXR is part of DirectX 12 only.

DirectX 12 is the newer, actively developed version of the API, so a given game's DX12 renderer can still be experimental depending on how far along it is, and it is required for newer features like DXR.

Here’s a developer blog that explains some of the reasons why DirectX 12 is important.

 
Is RT stuff restricted to DX12/Vulkan or is it doable in DX11 too?
Now I realise I'm not really up-to-date on games, but the few games I play that have DX12 or Vulkan renderer options seem not to recommend them. They seem to be experimental or for Linux. I'm just wondering if the uptake of things like RT is affected by the uptake of DX12/Vulkan?
You can do at least some of it in DX11, too. The Crysis remasters' software ray tracing works in DX11.
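On the DX11 point: the DXR API is DX12-only, but ray tracing itself is just intersection maths that any compute path can evaluate, which is how software RT like the Crysis remasters' gets by without DXR. A minimal illustrative sketch (not engine code) of the core operation, a ray-sphere intersection, in Python:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    origin/direction/center are 3-tuples; direction is assumed normalised.
    Solves |o + t*d - c|^2 = r^2, a quadratic in t.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * k for d, k in zip(direction, oc))
    c = sum(k * k for k in oc) - radius * radius
    disc = b * b - 4.0 * c  # the quadratic's 'a' is 1 for a normalised direction
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None  # hits behind the origin don't count

# A ray fired down -z hits a unit sphere centred at z=-5 at distance 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # → 4.0
```

Real renderers fire millions of such tests per frame against triangles rather than spheres, which is why dedicated RT hardware makes such a difference.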
 
What happened? Who was it?

Even I feel RT is the direction of travel. (The more I play Avatar, the more I realise we can’t turn it off!?) I’ll be buying a new GPU once RT is predominant in games.

Basically he couldn't tell the difference between RT on and off. Which is an utter shame, as he would have seen the reflection of the ban hammer coming down and potentially dodged it ;p
 
Doom Eternal really is the benchmark for how ray tracing is supposed to be done, everything metal looks metal, every shiny metal surface reflects everything in high detail, there's no RT lighting delay either in those reflections from rockets/explosions and stuff, and the framerate remains high at all times. It was over 100fps on my 3080 Ti at 3440x1440, and it's ~200fps on 4K on the 4090.

It's just supremely excellent and looks great in motion.


[Screenshot: wejGuuf.jpeg]


[Screenshot: 9LmkBdJ.jpeg]


The specular highlights and reflections off flesh are nice too, with a sort of subsurface-scattering look:

[Screenshot: 9FUyTkw.jpeg]
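Those highlights come down to standard specular shading maths. As a toy illustration only (this is a generic Blinn-Phong term, not id Tech's actual shading code), in Python:

```python
import math

def blinn_phong_specular(normal, light_dir, view_dir, shininess):
    """Blinn-Phong specular intensity: (N . H)^shininess.

    All vectors are 3-tuples and assumed normalised.
    """
    # Half vector between the light and view directions.
    h = tuple(l + v for l, v in zip(light_dir, view_dir))
    h_len = math.sqrt(sum(x * x for x in h))
    h = tuple(x / h_len for x in h)
    # Clamp to zero so surfaces facing away get no highlight.
    n_dot_h = max(0.0, sum(n * x for n, x in zip(normal, h)))
    return n_dot_h ** shininess

# Light and viewer both head-on to the surface: full-strength highlight.
print(blinn_phong_specular((0, 0, 1), (0, 0, 1), (0, 0, 1), 64))  # → 1.0
```

Raising the shininess exponent tightens the highlight; lowering it spreads the "shine" out.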


DF's tech look at RT in Eternal from 3 years ago:


It's kind of funny how no other game since then has managed this level of optimisation; the id Tech team are masters in this area.

id Software is a studio with plenty of experience, which probably lets it attract and retain the best talent out there. If we go by the video posted about "greed is killing your favorite games" (or something like that), the "life expectancy" of a dev in the gaming industry is around five years. No wonder you don't have a huge number of people who are very, very good at their job at your average studio. Perhaps they're not forced to launch incomplete products, either!

They also build their own engine, so support plus R&D is continuous and in house. You know ahead of time what the engine will be and can plan accordingly. Jumping on the UE5 bandwagon without a proper list of dos and don'ts will lead to issues, especially when you're pushed to release a product that has no business being on the market at that stage of development. I'd say the above are the most critical reasons why Doom ran that well, and why it's impossible for the mainstream studios to do the same: yes, even EA, Ubi, whatever, CDPR included.

Not least, Doom is relatively simple compared to a city like in CP2077. On top of that, I think it only does RT reflections combined with SSR and that's about it. The closest that comes to mind is Metro Exodus EE, which runs pretty well, and again, an in-house engine. Funnily enough, running just RT reflections and adding Ray Reconstruction causes a significant loss in performance in CP2077.

BTW, Alex from DF has said a few times that the extreme number of polygons UE5 can deliver does pose problems for RT: it's not feasible to trace against, and the way the state of the world changes so often has its own headaches.
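For context on why polygon counts matter: GPU ray tracing walks a bounding volume hierarchy (BVH) of axis-aligned boxes, and that structure has to be rebuilt or refitted whenever geometry changes, so more primitives means more build work every frame. The box test at the heart of the traversal is the cheap part; a minimal Python sketch (illustrative only) of the classic slab test:

```python
def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?

    inv_dir holds 1/d per axis, precomputed once per ray so the
    inner loop needs no divisions.
    """
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv
        t2 = (hi - o) * inv
        # Shrink the ray interval to the overlap of all three slabs.
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

ray_origin = (0.0, 0.0, 0.0)
inv_dir = (1.0, 1.0, 1.0)  # ray direction (1, 1, 1), inverted per axis
print(ray_aabb_hit(ray_origin, inv_dir, (1, 1, 1), (2, 2, 2)))        # → True
print(ray_aabb_hit(ray_origin, inv_dir, (-2, -2, -2), (-1, -1, -1)))  # → False
```

The test itself is a handful of multiplies and compares; the expensive part is keeping the hierarchy of millions of such boxes up to date as geometry animates, which is exactly where extreme polygon counts hurt.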

Anyway, the main reason for "faulty" performance will be the maximization of profit over maximization of quality.

Reflections do come in handy, if you can see them :p
Well, he did like raster, ergo SSR blindsided him. :D
 
Shame AMDMatt isn't around in this forum anymore, not as interesting without him, IMO.
 
Doom Eternal really is the benchmark for how ray tracing is supposed to be done, everything metal looks metal, every shiny metal surface reflects everything in high detail... [quoted post and screenshots trimmed]

How far am I from these scenes?

This is how far I got; some reflections here but not that much... still getting about 170 FPS, and 1440p didn't seem to make any difference to performance.

Also... Auto HDR tone mapping :(

 
I'm 5.7 hours in at that point. I can't quite remember where the scene above is, foggy memory, but I think you're probably about an hour behind or so if the timings align :o
 
Shame AMDMatt isn't around in this forum anymore, not as interesting without him, IMO.
Some say this person's identity is shrouded in secrecy and mystery never to be mentioned.
He's banned from this subsection; for whatever reason, how some others still have access is beyond me.
Despite that, he still got his access removed, so that says a lot.
 
Still playing with the latest DLSS files, and found another game that now runs nicely at 4K with ray tracing.
Same scene without ray tracing.
I think the concept of low and high doesn’t really work with ray tracing for this game either; higher levels seem only to make things more shiny.

Also quite like the way this game uses mostly mesh and alpha texture based models. Works well with ray tracing, and seems to be quite common in next-gen games using Unreal Engine.

I’m not overly interested in very sharp textures anymore, especially when using ray tracing. I’d much rather have a stutter-free experience. :p

Edit.

Hogwarts Castle
 
Bringing this conversation here from the RDNA4 rumour thread, as my reply was deleted for going off topic (fair enough), so I'm airing my thoughts in this relevant thread instead.

Very true. I regretted getting a used 4080 for £900, let alone the £1270 they were looking for them new.

Trying my son’s 7900 XT at 1440p with RT on in Wukong with FSR and FG is actually not a terrible experience, and for him it is perfectly playable. My 4080 is obviously a lot faster at that res and settings without FG. Having said that, RT on vs off is still not worth the FPS hit on either GPU IMHO. Anyone convincing themselves RT is just amazing is delusional IMHO. I have run on vs off back to back for well over a few hours and always come away concluding that RT off, with better FPS and no FG, is a much more fluid experience, even when capped at 60 FPS.

Oh, and FSR is perfectly usable now; not as good as DLSS, but hardly another night-and-day difference.

So for the majority the Nvidia tax for the “premium” features is just pure BS IMHO. Both Nvidia and AMD need to get realistic with their prices.
Not everyone wants to run RT; some do. RTX is gospel in here, no problem with that, but hardly anyone runs 4080/90s. Screenshots in full glory are all well and good, but it's just a screenshot at the end of the day. I'm playing a game, not interested in shiny for the massive performance cost, but that's just me.

Before I go on, I can't stress this enough as a user of both vendors' GPUs: DLSS is far superior tech to FSR. But when it takes tech vids slowed to ~25% playback (I've even seen 5%), using performance mode at whatever zoom they choose, to show you how bad it looks, that's priceless.

My 4070 runs CP better with the FSR3 FG mod; it has less ghosting than DLSS3 with finely tuned settings, but the game stutters unless you use the mod, as Nvidia FG's VRAM impact sends it over the edge.

The reason, imo, DLSS is so popular is that it's now a requirement for anything under a 3090, due to the (imo planned-obsolescence) VRAM allocation provided at the time. My 3070 was toast and the 3080 was just as bad. It's not a dig, but DLSS is extending the life of the low-VRAM cards, and most Nvidia users positively have to use it if they want to run RT.

The difference is real, but it's nowhere near as bad as claimed. Most of the gaming community are in the game; they aren't grinding to a halt because they saw a shimmering cable and wish they'd bought that Nvidia GPU. They are engrossed, they are the game, and that's why we are PC gamers: play it the way you want to.

If it's full RTX fine, if not that's just as fine too.
 
You're forgetting that whether someone prefers RT or not, they simply have zero choice. UE5 games currently releasing all use RT via Lumen, whether software or hardware doesn't matter, RT is on by default and you cannot turn it off.

Also, fact of the matter is that gamers need to rewire their brains and forget the past, some folks cannot let things go.

Watch DF's latest video from 1:59:04 onwards for a professional insight into the whole scene around RT and upscaling.


RT was the future back in 2020; it is now the future we are living in, and from here onwards more advanced RT is the trend, which is where dedicated and efficient RT hardware comes into its own. So AMD/Intel had best get their catch-up shoes on, as the pace isn't going to slacken.
 
You are confusing people saying they prefer higher FPS with people saying they don't like RT. What these people (including me) were arguing all those years ago was that the RT tax was not worth the FPS hit and that it would take years for RT to become mainstream.

If RT is baked into a game and comes with limited impact, then great. In fact most RT games are like this, and turning on RT isn't sending frame rates plummeting. The majority weren't arguing that RT is not the future or that games should never have it baked in. So the narrative that RT is no longer the future but here to stay is hardly the big predictive “win” you think it is.

In fact, what DF seem to be saying (indirectly) about RT and the PS5 Pro is that AMD's approach of gradually building up RT performance to be ready for this “future” has worked well timing-wise. The majority always claimed it would take years for RT to become mainstream, and most current-gen AMD cards are coping OK with “normal” levels of RT. RT hardware GPUs entered the scene in September 2018, six years ago, and some would argue current mainstream cards are still a bit short on RT performance.

Wukong is a perfect example: it comes with RT baked in that is more than playable on mid-range PCs, even from AMD. It can be tweaked to run well at 1080p or 1440p with little graphical impact, and depending on your hardware may need upscaling to do it well. So getting there, but not quite there yet.

Where the RT tax comes in is when you enable the more extreme RT that, when you are actually gaming rather than looking at screenshots, just kills FPS. This is where the preference for FPS over “moar RT” eye candy comes in. Wukong with PT enabled at high wrecks FPS, but the difference is arguably not worth it and a matter of preference.
 
I always find it funny when people say they prefer higher fps, then refuse to use upscaling and similar tech even though those technologies literally provide higher fps for little to no hit in IQ, meanwhile they can't see a difference with RT on/off and/or don't think it is worthwhile....

So the narrative that RT is no longer the future but here to stay is hardly the big predictive “win” you think it is.

It wasn't that long ago we had a few folk saying there were hardly any RT games and that it would be a decade or so until RT would be in more games. I think the silence on this front from the usual suspects is very telling now, and from reading AMD reddit etc. there has been a big awakening, especially with the recent games show.
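On the upscaling point above: the fps gain is mostly pixel arithmetic, because the game shades far fewer pixels internally and reconstructs the rest. Assuming a "quality"-style scale of about 67% per axis (a typical figure, though exact ratios vary by vendor and mode), a quick Python check of the shaded-pixel fraction at 4K:

```python
def internal_resolution(out_w, out_h, scale):
    """Internal render resolution for a given per-axis upscaling scale."""
    return int(out_w * scale), int(out_h * scale)

out_w, out_h = 3840, 2160                        # 4K output
w, h = internal_resolution(out_w, out_h, 0.667)  # "quality"-style scale
ratio = (w * h) / (out_w * out_h)                # fraction of pixels actually shaded
print(w, h)             # → 2561 1440
print(round(ratio, 2))  # → 0.44
```

Under half the pixels get shaded per frame, which is where most of the headroom for higher fps, or for enabling RT at all, comes from.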
 
It wasn't that long ago we had a few folk saying there were hardly any RT games and that it would be a decade or so until RT would be in more games. I think the silence on this front from the usual suspects is very telling now, and from reading AMD reddit etc. there has been a big awakening, especially with the recent games show.
A decade? Don't remember that, but happy to be corrected. Thought most people were saying 2 generations or so, and that was about the 2000/3000 (Nvidia GPU) changeover time. It's getting to about that sort of time now, and we need to wait to see the impact of the 5000 series to see if the performance has filtered to the lower level of GPU for mass (or at least greater) adoption.
 
A decade? Don't remember that, but happy to be corrected. Thought most people were saying 2 generations or so, and that was about the 2000/3000 (Nvidia GPU) changeover time. It's getting to about that sort of time now, and we need to wait to see the impact of the 5000 series to see if the performance has filtered to the lower level of GPU for mass (or at least greater) adoption.

Exactly. This is nothing but a straw man argument that nobody actually made. Nobody said it would be a decade before RT was in “more games”, as that’s such a low bar.

The argument since 2018 was that it could be a decade before RT became the de facto standard and mainstream GPUs and consoles could viably run it. This was a pessimistic view, but the hope was sooner, of course.

That was six years ago, and it’s only now that mid-range GPUs are becoming a bit more affordable and viable for RT with upscaling. It’s getting there, just as predicted, but not quite there yet.

Edit. I know that what constitutes mainstream and viable is subjective. But a sub-£300 GPU with at least 12GB VRAM and 4070 levels of raster and RT would be a decent minimum standard. Some may even consider that low.
 
A decade? Don't remember that, but happy to be corrected. Thought most people were saying 2 generations or so, and that was about the 2000/3000 (Nvidia GPU) changeover time. It's getting to about that sort of time now, and we need to wait to see the impact of the 5000 series to see if the performance has filtered to the lower level of GPU for mass (or at least greater) adoption.

There were loads saying that, not so much this thread but in other threads to do with RT/dlss etc.

IDCP post from this very thread not that long ago though:

Not sure if accurate but I would say the PS5 is roughly the equivalent of a 6700 XT in raster and RT. The 7800XT and 7900 GRE are almost double that more or less at 4K (RT and raster). So if PS5 Pro is going to be used on 4K TVs with RT, I think they need to aim higher. If they don’t expect most games to still aim for mid levels of RT and CP2077 levels to still be the very rare exception.

In other words RT at meaningful levels could still be almost a decade away.

Define what you mean by meaningful levels? I would argue we are at, and have been at, meaningful levels for the past 2-3 years, both from a performance POV (if devs choose to do **** optimisation and/or still build for raster whilst shoehorning RT on, that is their fault rather than the tech's) and from a visual-difference POV (not every game has to be like CP2077 PT; even games with a little RT, such as Riftbreaker, see a meaningful impact to IQ).
 