
What do gamers actually think about Ray-Tracing?

Can't agree with the praise for PT in AW2; I thought he was joking at first. Besides RT reflections, the other changes are actually still very limited (which the devs talked about, so it's no secret). I'm also mixed on it in ME:EE, since it shifts the tone in multiple scenes for the worse imo (vs RT ultra), not to mention the disgusting & obvious temporal accumulation without which the mode could not exist (see gif), but I'm not gonna relitigate that. I'd also have to severely disagree with his assessment for several titles, and for some effects overall, particularly things like RTAO - once you notice the glowy nature of lighting in games without a very robust GI system (even raster), you'll pray to the gods for even a meagre RTAO implementation to give you at least some grounding for objects in the world. A recent release (on PC) which is otherwise quite good-looking but suffers harshly from a lack of GI is Ghost of Tsushima.

Overall the number of worthwhile RT titles is still quite low, but I would say that sometimes all it takes is one to make it worthwhile. For me, Cyberpunk 2077 alone would be enough to drive a decision towards a card with better RT (plus my playtime in that one is probably greater than all the other RT titles combined). But tbh there are plenty of other titles I greatly enjoy RT in and still play from time to time, like WD:L, Metro Exodus, Riftbreaker, TW3, The Ascent, and others that I play more rarely and which are somewhat controversial vis-a-vis their RT (f.ex. Shadow of the Tomb Raider).

Plus, a lot of the less noticeable RT implementations he talks about can be tweaked (via ini edits or unlockers) to be actually very noticeable. One obvious example that comes to mind is Doom Eternal, where the roughness cut-off is quite sharp, but once you unlock it, it actually becomes waaay more obvious in general (and correspondingly murders the performance); or one I don't really play - Hogwarts Legacy (but UE titles more generally, due to their openness).
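To make it concrete why cranking that cut-off murders performance, here's a toy sketch of the general idea (hypothetical names and numbers, nothing to do with any actual game or engine code): reflections typically only get traced on surfaces below a roughness threshold, so raising the threshold means far more pixels fire reflection rays.

```python
# Conceptual sketch only: how a roughness cut-off gates RT reflections.
# All names and values are made up for illustration.

def shade_reflection(surface_roughness: float, cutoff: float) -> str:
    """Decide how a pixel's reflection is produced for a given cut-off."""
    if surface_roughness <= cutoff:
        return "trace ray"           # expensive: cast a real reflection ray
    return "cubemap/SSR fallback"    # cheap: reuse screen-space or baked data

# A made-up spread of surface roughness across a frame's pixels.
pixels = [0.05, 0.12, 0.30, 0.45, 0.60, 0.75, 0.90]

for cutoff in (0.25, 0.75):  # e.g. a conservative default vs. an "unlocked" value
    traced = sum(shade_reflection(r, cutoff) == "trace ray" for r in pixels)
    print(f"cutoff={cutoff}: {traced}/{len(pixels)} pixels trace reflection rays")
```

Same idea for the UE-style config tweaks: the knob usually just widens which materials qualify for the expensive path.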


gclpy6X.gif
 
I think it runs at something like 60 fps on consoles, so some drawbacks are to be expected. Sadly, there's no option in the menu to tweak the settings further - e.g. to increase the quality.
Fortunately, those types of situations don't come up that often - smaller light sources are less "annoying", and the situation is kinda "resolved" in NVIDIA's implementation with Ray Reconstruction.

Overall, I'd rather have the Metro EE system in every game than plain raster.
 
Next-Gen GPUs: Pricing and Raster 3D Performance Matter Most to TPU Readers

Generational gains in raster 3D rendering performance at native resolution remain eminently desirable for anyone who has followed the PC hardware industry over the past few decades. With Moore's Law in effect, we became used to near-50% generational increases in performance, which enabled new gaming APIs and upped the eye candy in games with each generation. Interestingly, ray tracing performance takes a back seat, polling not third but fourth place, at 10.4% or 2,475 votes; third place goes to energy efficiency.

The introduction of 600 W-capable power connectors was an ominous sign of where power draw is headed with future GPU generations, as the semiconductor fabrication industry struggles to make cutting-edge sub-2 nm nodes available. For the past three or four generations, GPUs haven't been built on the very latest foundry node. For example, by the time 8 nm and 7 nm GPUs came out, 5 nm EUV was already the cutting edge, and Apple was making its iPhone SoCs on it. Both AMD and NVIDIA would go on to build their next generations on 5 nm, while the cutting edge had moved on to 4 nm and 3 nm. The upcoming RDNA 4 and GeForce Blackwell generations are expected to be built on nodes no more advanced than 3 nm, but these come out in 2025, by which time the cutting edge will have moved on to 20A. All of this impacts power, because the performance targets are badly misaligned with the foundry nodes actually available to GPU designers.

Our readers gave upscaling and frame-generation technologies like DLSS, FSR, and XeSS the fewest votes, with the option scoring just 2.8% or 661 votes. They do not believe that upscaling technology is a valid excuse for missing generational performance targets at native resolution, and take claims such as "this looks better than native resolution" with a pinch of salt.

All said and done, the GPU buyer of today has the same expectations of the next generation as they did a decade ago. This is important, as it forces NVIDIA and AMD to innovate, build their GPUs on the most advanced foundry nodes, and try not to be too greedy with pricing. NVIDIA's real competitor isn't AMD or Intel; rather, PC gaming as a platform competes with the consoles, which offer 4K gaming experiences for half a grand with technology that "just works." The onus, then, is on PC hardware manufacturers to keep up.


Also watched the posted HUB RT vid; it's pretty telling that only 3 games running RT/PT are game-changing, although only one of them is 'doable' without needing a 4080 or above.

If you can run it, yay, of course it's great, but that's not mainstream - a 3060/4060 can't get near it - which has been my POV on adoption all along.

Imo, we are still a gen or two away from RT being readily usable on mainstream hardware, and that'll be down to consoles providing the means.
 
It's obvious when you think about it.

The whole RT thing is driven by Nvidia, and they only do it to justify £1000+ GPUs, so it should be no surprise to anyone, really, that only about 10% of people have a firm interest in it.

It doesn't need to be like this. If RT is simply used for accurate reflections and to clean up some of what's wrong with rasterised lighting, it has no need to be so GPU-heavy, it really doesn't.

RT has been poisoned by Nvidia's incessant need to push profit margins relentlessly higher, poisoned by their need to use it as a marketing tool to push ever more overpriced GPUs.
That has been so effective that when something does come along that runs RT well on mainstream-level GPUs, it's dismissed as 'not proper RT' in the minds of most consumers, simply by virtue of a £500 GPU running it well.

RT will not become mainstream, a replacement for raster, for as long as Nvidia sees it as a cash cow, because for as long as that is true they will never stop injecting themselves into game studios for the purpose of RTX branding and turning RT up to 11 so as to shift more £1000+ GPUs.

For once Nvidia give us a ******* break will you????
 
It doesn't need to be like this. If RT is simply used for accurate reflections and to clean up some of what's wrong with rasterised lighting, it has no need to be so GPU-heavy, it really doesn't.
I'd really like to see a game where you can do reflections 1:1 and RTGI with ease. AMD hasn't "done" one yet (as in sponsored one), since it's in their own interest.

Thing is... once you start adding things up and increasing the quality, raster isn't that much faster anymore.
 
The whole RT thing is driven by Nvidia, and they only do it to justify £1000+ GPUs, so it should be no surprise to anyone, really, that only about 10% of people have a firm interest in it.
Nv is where they are now through innovation; they are always first rolling out the next step in making the best gaming experience money can buy, and leather jacket man (rightly so) is KING!


The evolution from PhysX/tessellation/GW's to RTX now is genius from an Nv POV imo; he has created an Nv juggernaut in the tech industry.

Yes, it's insane the prices commanded to run the heavy hitters, but the hype RTX has created in this sub over the years is incredible, although the hype has all gone a bit subdued lately. :thumbsup
 
Imo, we are still a gen or two away from RT being readily usable on mainstream hardware, and that'll be down to consoles providing the means.

Yes, it's insane the prices commanded to run the heavy hitters, but the hype RTX has created in this sub over the years is incredible, although the hype has all gone a bit subdued lately. :thumbsup

It was never mainstream once we witnessed the Turing hardware versus the promise. Not quite sure why people assumed a 2060 was going to cut it, but expecting that from that end of the stack was never really fair at the outset. However, as each generation comes out, it's quite clear it's not meant for the mainstream... at least not yet. I am eager to see how the PS5 Pro handles this with whatever AMD tech they've wrangled out, because let's face it, if it can pull off some of what it's meant to do, it may, just may, mean the time has come for some "mainstream" hardware to be able to showcase it!

Seems to have gone quieter in the past couple of months, which is kind of interesting. :)
 
The evolution from PhysX/tessellation/GW's to RTX now is genius from an Nv POV imo; he has created an Nv juggernaut in the tech industry.

Tessellation was invented by ATI, PhysX by 3DFX, modern Screen Space GI by AMD.

Ray tracing is as old as graphics rendering itself; it's older than Nvidia. What Nvidia did was a lot of development work to make it run in real time on consumer-grade GPUs, and it is very, very clever; take absolutely nothing away from them, they deserve a lot of kudos for it. By the same token, you also have to hand it to AMD and Intel for developing it for their own architectures in just a couple of years: it's no small feat for Nvidia to do what they did, and it's no small feat for AMD and Intel to follow up so quickly.
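For anyone who hasn't seen it spelled out, the core of the technique really is ancient and tiny; the hard part is doing it millions of times per frame, every frame. A toy ray-sphere intersection in Python (purely illustrative, nothing to do with any vendor's hardware or drivers):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest hit with a sphere, or None.
    Solves |origin + t*direction - center|^2 = radius^2 for t (a quadratic)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearest intersection
    return t if t > 0 else None

# Even a toy 1920x1080 image at one primary ray per pixel is ~2 million of
# these tests per frame, before any shadows, reflections or light bounces.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```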

I also think AMD deserve recognition for modern APIs: DX12 would not exist if not for the work AMD put in, and Vulkan, which is better, certainly wouldn't.
If you want to know just how good Vulkan is, look at Doom Eternal with the RT cranked right up; unlike DX12 it doesn't eat 60% of your performance. Vulkan, like Mantle, has some very clever resource and execution management: dynamic, smart code versus DX's very dumb code.

Both AMD and Nvidia, and some companies no longer with us, have all contributed to modern graphics rendering; I just wish Nvidia were less cynical about how they apply their incredible talents.
 

Yea RT was never really meant for real-time. It was for pre-rendered movie sequences. It's tech that's being forced to do something it's not designed for.

It's still quite noisy when done in real-time, as well as horribly slow.
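The noise and the temporal accumulation complained about earlier in the thread are two sides of the same trade-off. A very simplified sketch of the accumulation idea (illustrative only; real denoisers like Ray Reconstruction are far more sophisticated): with only a ray or two per pixel each frame, the result gets blended with previous frames, which smooths the noise but is also exactly the history that lags and smears when the lighting suddenly changes.

```python
import random

def noisy_sample(true_value: float, noise: float = 0.5) -> float:
    """Stand-in for a 1-ray-per-pixel lighting estimate: right on average, noisy per frame."""
    return true_value + random.uniform(-noise, noise)

def accumulate(frames: int, alpha: float = 0.1, true_value: float = 1.0) -> float:
    """Toy temporal accumulator: exponential moving average over successive frames."""
    history = noisy_sample(true_value)
    for _ in range(frames):
        history = (1 - alpha) * history + alpha * noisy_sample(true_value)
    return history

random.seed(0)
print("single frame   :", round(noisy_sample(1.0), 3))  # visibly noisy on its own
print("after 60 frames:", round(accumulate(60), 3))     # much closer to the true value
# That same slowly-updated history is what ghosts when a light turns off or the camera cuts.
```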
 
Tessellation was invented by ATI, PhysX by 3DFX, modern Screen Space GI by AMD.

AMD actually own quite a lot of the patents for CPUs and GPUs. They came up with the 64-bit x86 architecture we use, which is why you sometimes see it referenced as "AMD64".
 
Yea RT was never really meant for real-time. It was for pre-rendered movie sequences. It's tech that's being forced to do something it's not designed for.
It wasn't, because you didn't have the power. You have it now, to do it better than raster.
 
It wasn't, because you didn't have the power. You have it now, to do it better than raster.
Yes and no.

Top-end 4080/90s can (ish), but everything under that, forget it: too many artifacts/blur/wax when the (extremely good!) RTX software can't push enough throughput through the small bus/VRAM limits Nv enforced on anything under the 4080. :thumbsup
 
Top-end 4080/90s can (ish), but everything under that, forget it...

I think he mentions it earlier in this thread:

Basically only the 4080 and 4090 have enough juice to push 4K with some RT. 12 GB of VRAM for 4K is not really a thing either, since a card with only that much VRAM shouldn't really be used above 1440p (or even 1080p).

If you own a 4080 you can probably get away with it for now; generally it seems to be the people on those cards who end up on the positive side of these discussions.
 
AMD actually own quite a lot of the patents for CPUs and GPUs. They came up with the 64-bit x86 architecture we use, which is why you sometimes see it referenced as "AMD64".

AMD64 is the proper name for it; AMD agreed to Microsoft and Intel calling it x86_64, some might say so that Intel could save face.

Go to C:\Windows\WinSxS

That's my Intel laptop. Windows is built for an AMD architecture, and so is Linux, which calls it by its proper name: when you download Ubuntu it's labelled AMD64. i386 is no longer available; it's dead. Obsolete.

X2h8s1i.png
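If you want to see the naming split for yourself without digging through WinSxS, a couple of lines of Python will do it (output obviously depends on the machine: 64-bit Windows typically reports AMD64, while most Linux distros report x86_64 for the exact same hardware):

```python
import platform

# Prints the architecture name the OS uses for the AMD-designed 64-bit x86 ISA:
# typically 'AMD64' on Windows, 'x86_64' on Linux.
print(platform.machine())
print(platform.architecture())  # e.g. ('64bit', 'WindowsPE') or ('64bit', 'ELF')
```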
 