What do gamers actually think about Ray-Tracing?

RT being too heavy on GPUs is a given. It's not poor optimisation or lazy devs: the technology has been around for decades, and prior to the RTX 2000 series it was impossible to render in real time.

I can only refer you to previous posts here by me and others with actual examples, including actual devs of UE5 and MegaLights asking game developers to stop being lazy and follow the guidelines to get proper performance with hardware Lumen. So yes, there's a lot of lazy practice in game development that should be addressed to get proper performance. It's not a theory; it's been demonstrated in multiple posts and videos. That is the main issue currently. But Lumen in games isn't PT - they are 2 different approaches to RT in games.
 
Really depends on what games you’re playing and what you want from your gaming experience.

If you're someone who wants the best-looking game and you play at high resolution, then ray tracing makes a big difference visually in a lot of games.
But that's the point shown in the stats - people don't find it worth it. When it's much more affordable to have it running well, nobody will complain. Currently it's far from that. Nvidia's fix for it seems to be FG2 - more AI-generated frames (with higher latency - they can rubbish it all they want in marketing, but it will be higher latency, as it delays the real frame by 3 AI frames) - instead of actual improvements to RT acceleration. Maybe the 6k series will bring real improvements; the 5k series seems to have failed royally here.
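
To put a number on that latency point, here is a minimal back-of-envelope sketch (a toy Python model of interpolation-style frame generation; the 30 fps base rate and the one-real-frame delay are assumptions, not NVIDIA's published figures):

```python
# Toy model: interpolation-based frame generation has to hold the newest real
# frame back while the AI in-betweens are displayed first, so input-to-photon
# latency grows even though the on-screen frame rate multiplies.

def frame_gen_estimate(base_fps: float, ai_frames_per_real: int) -> dict:
    real_frame_ms = 1000.0 / base_fps
    return {
        "displayed_fps": base_fps * (ai_frames_per_real + 1),
        # Assumption: roughly one extra real-frame time of delay, since the
        # generator needs the *next* real frame before emitting in-betweens.
        "approx_added_latency_ms": round(real_frame_ms, 1),
    }

print(frame_gen_estimate(30.0, 3))
# {'displayed_fps': 120.0, 'approx_added_latency_ms': 33.3}
# i.e. "120 fps" on screen, but roughly a frame's worth more lag than a real 30 fps.
```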
 
But that's the point shown in the stats - people don't find it worth it. When it's much more affordable to have it running well, nobody will complain. Currently it's far from that. Nvidia's fix for it seems to be FG2 - more AI-generated frames (with higher latency - they can rubbish it all they want in marketing, but it will be higher latency, as it delays the real frame by 3 AI frames) - instead of actual improvements to RT acceleration. Maybe the 6k series will bring real improvements; the 5k series seems to have failed royally here.
Maybe, but I think you will need to wait for the reviews and testing to see how it actually performs in RT.
 
But that's the point shown in the stats - people don't find it worth it. When it's much more affordable to have it running well, nobody will complain. Currently it's far from that. Nvidia's fix for it seems to be FG2 - more AI-generated frames (with higher latency - they can rubbish it all they want in marketing, but it will be higher latency, as it delays the real frame by 3 AI frames) - instead of actual improvements to RT acceleration. Maybe the 6k series will bring real improvements; the 5k series seems to have failed royally here.

I agree - the new 5 series cards leave me feeling cold. The slated prices are utterly beyond what I'd be willing to pay.
 
Maybe, but I think you will need to wait for the reviews and testing to see how it actually performs in RT.
There are already indications on NVIDIA's own website (some of the graphs without FG enabled) from which to extrapolate results, but yes, full independent reviews will be great to see.
 
I can only refer you to previous posts here by me and others with actual examples, including actual devs of UE5 and MegaLights asking game developers to stop being lazy and follow the guidelines to get proper performance with hardware Lumen. So yes, there's a lot of lazy practice in game development that should be addressed to get proper performance. It's not a theory; it's been demonstrated in multiple posts and videos. That is the main issue currently. But Lumen in games isn't PT - they are 2 different approaches to RT in games.

If you're talking about individual games then there will always be lazy devs. But what's ironic is that RT is supposed to save them effort, not add to it, which is something we've also discussed at length in this thread. Whether it's the devs' fault, Epic's fault for poor documentation/knowledge transfer, or the publishers' is anyone's guess, but your references are towards UE5 games, of which I have personally played 2 - Senua and Wukong. I've played plenty of others not in UE5.

But that's the point shown in the stats - people don't find it worth it.

Let me remind you of the stats on the front page:

(image: SLpojqt.jpeg - the HUB poll results referenced above)

By these numbers, only 16% can be definitively categorised as "people who don't find it worth it". The top 3 categories are people who are playing with it on, and the final category is ambiguous - whether those people feel it is not worth it based on performance, or find it unplayable because of performance, cannot be decided, so it shouldn't be counted in a "not worth it" category.
 
If you're talking about individual games then there will always be lazy devs. But what's ironic is that RT is supposed to save them effort, not add to it, which is something we've also discussed at length in this thread.

It saves time for artists. It definitely doesn't save time for programmers - they still need to faff around with the engine, etc. And it never promised to save time for anyone but artists either. However, some of the performance issues are caused by programmers, and some are caused by artists who just don't know how it works and do silly things. Also, we need much faster GPUs than any currently existing ones to get to the point of "turn it on and it just works" - because we are nowhere near that performance, developers have to use a bunch of tricks, and those require special treatment. For example, overlapping too many light sources makes them converge too many rays over the same pixels, which generates so much noise that it overwhelms the denoiser - and fixing the quality then kills performance. Artists have to be aware of this and not do such things. UE provides tools for them to see exactly what's wrong and fix it, but they often don't use them anyway - and that's just lazy.
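
As an illustration of that light-overlap point, here is a toy Monte Carlo sketch (my own Python model, not UE or Lumen code; the 8-ray budget, the 50% visibility chance and the sampling scheme are all assumptions): once overlapping lights outnumber a fixed per-pixel shadow-ray budget, not every light can be tested, and the per-pixel error - the noise the denoiser has to fight - grows with the light count.

```python
import random

def error_variance(num_lights: int, ray_budget: int = 8, trials: int = 5_000) -> float:
    """Variance of the per-pixel lighting estimate versus the true value."""
    errors = []
    for _ in range(trials):
        # Random per-pixel visibility for each light (a stand-in for occlusion).
        vis = [random.random() < 0.5 for _ in range(num_lights)]
        truth = sum(vis) / num_lights  # true average lighting at this pixel
        if num_lights <= ray_budget:
            estimate = truth  # enough rays to shadow-test every light: noise-free
        else:
            # Budget exceeded: shadow-test only a random subset of the lights.
            picked = random.sample(range(num_lights), ray_budget)
            estimate = sum(vis[i] for i in picked) / ray_budget
        errors.append(estimate - truth)
    mean = sum(errors) / trials
    return sum((e - mean) ** 2 for e in errors) / trials

for n in (4, 8, 16, 64, 256):
    print(f"{n:>3} overlapping lights -> error variance {error_variance(n):.4f}")
# Zero noise up to the budget, then variance climbs as lights pile up -
# which is exactly why artists need to avoid stacking light sources.
```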
Whether it's the devs' fault, Epic's fault for poor documentation/knowledge transfer, or the publishers' is anyone's guess,
It's not - most devs and publishers seem to be switching to UE5 because it has by far the best support. It might not be the ideal engine, but it's flexible, and there are so many guides, so much documentation and such a community around it that it makes things much easier (and thereby cheaper to develop). Which is also why the main focus of my posts is on UE5 - it's the main engine that will be used in nearly all AAA games soon.

But your references are towards UE5 games, of which I have personally played 2 - Senua and Wukong. I've played plenty of others not in UE5.
I've played quite a few, including a few that were total financial flops that nobody bought, just to see how it works. Not great, in many places - stuttering and lower-than-expected FPS. Now, Senua is awesome visually - a walking simulator, yes, but well optimised and artistically really, really well made. Those devs know what they are doing; it's one of my benchmarks for how realistic games should look, and visually it's much better than the new Indiana Jones game (especially the faces). Wukong started badly with optimisation, but they are actively working on it, and it's always been quite scalable, so it's generally also a well-made game for a wide array of hardware. Sadly, these are some of the very few actually very good games on UE5 - most others are just awful.

Let me remind you of the stats on the front page:

You have a very curious way of reading stats. The last, main stat is clear to me, and I don't get how anyone can understand it differently - not worth the cost. If there were cheaper GPUs that could run it, or hugely better visuals that made the game, or any other reason that would make it worth it, people would use it. But as it is now, it's not worth it, so it's not being used.

Considering most users sit by far on xx60-class GPUs, and that never changes, it's not going to change till GPUs that run RT properly are in the xx60 price range. As that's also the huge majority of the gaming market, they will simply not buy games that don't run well on their hardware, and they will not go for higher-tier cards but will switch to consoles instead if forced, or will simply play other games (plenty come out each month) - as has always historically been the case.

In effect, publishers that do not make good games which also run well on such hardware will suffer financially - which is exactly what we've seen happening to AAA games for a while now, why Ubisoft is pretty much bankrupt, and why studios are getting closed left and right by other publishers, whilst reporting huge losses on such games. Something has to give, and it's not going to be the average Joe, it seems. That's why I don't believe enthusiasts are a large enough market to sustain it; instead, everything will be dragged back down to the mainstream, as that's the actual source of money. This is also where AMD is aiming with their coming GPUs (or so they claim).
 
It saves time for artists. It definitely doesn't save time for programmers - they still need to faff around with the engine, etc. And it never promised to save time for anyone but artists either.

If an artist works on a title at a dev house, does that dev house not save time if they do not need to faff around with baking in assets and illumination? I don't recall anyone arguing it saves programmers time.

It's not - most devs and publishers seem to be switching to UE5 because it has by far the best support.

No idea what you're arguing against here

You have a very curious way of reading stats. The last, main stat is clear to me, and I don't get how anyone can understand it differently - not worth the cost.

Re-read my post - I just explained to you how it could be understood differently. Let's say we live in your black-and-white world and it means what you believe: that means 48% actively using RT, and "52%" thinking it's not worth it. A far cry from the claim that "people don't find it worth it"...
 
If an artist works on a title at a dev house, does that dev house not save time if they do not need to faff around with baking in assets and illumination? I don't recall anyone arguing it saves programmers time.
Potentially, but often not by much. A lot of the time, by the time a game is being finished and close to release, the artists are working on DLCs, cosmetics, etc. - as they have had nothing left to do in the main game for months. So in that sense, it saves their time a bit so they can work more on paid extras for the company to make more money. None of that is any good for the end user though, nor have I seen any evidence of it speeding up games' production or reducing the cost of producing them.
No idea what you're arguing against here
Just saying it's not a problem with documentation or Epic not communicating properly - UE5 is very well documented, with tons of guides and materials. If something's not done right, it's on the devs alone.
Re-read my post - I just explained to you how it could be understood differently. Let's say we live in your black-and-white world and it means what you believe: that means 48% actively using RT, and "52%" thinking it's not worth it. A far cry from the claim that "people don't find it worth it"...
Agree to disagree - I don't see it like you do, and neither did HUB, apparently.
 
Agree to disagree - I don't see it like you do, and neither did HUB, apparently.

It's not an opinion; the numbers are plain to see. 48% use it, 52% don't. What you can disagree on is the rationale of the 36% in the stats who aren't using it, but I already gave the whole number to you...
 
It's not an opinion; the numbers are plain to see. 48% use it, 52% don't. What you can disagree on is the rationale of the 36% in the stats who aren't using it, but I already gave the whole number to you...

That’s some spin you’ve done there. That’s not how stats work, because what you did is an incredibly simplistic analysis. I work in IT, where statistics and data are used to evaluate and award multi-million-pound contracts and for workforce management. If I interpreted stats like that, I would be laughed out of the room.

It’s like running a survey asking how much alcohol you drink and concluding that the 80% of adults who have the odd alcoholic beverage at the weekend are technically the same as alcoholics.

The takeaway from that survey would be that a significant majority of the customer base cannot run RT at reasonable settings, or at all, due to performance.

As a dev, I conclude “we can’t rely on RT to sell games”. So they keep it at just enough to say “we have RT”. This is in fact the current reality of RT gaming.

As a GPU manufacturer, I conclude “our push to RT is still years away”.

Only now are we potentially getting £400-£500 GPUs at “mainstream” prices that can technically run decent CP2077 levels of RT with upscaling. It will still be next gen before that cost of entry drops to £300-£400.
 
That’s some spin you’ve done there. That’s not how stats work, because what you did is an incredibly simplistic analysis. I work in IT, where statistics and data are used to evaluate and award multi-million-pound contracts and for workforce management. If I interpreted stats like that, I would be laughed out of the room.

It’s like running a survey asking how much alcohol you drink and concluding that the 80% of adults who have the odd alcoholic beverage at the weekend are technically the same as alcoholics.

The takeaway from that survey would be that a significant majority of the customer base cannot run RT at reasonable settings, or at all, due to performance.

As a dev, I conclude “we can’t rely on RT to sell games”. So they keep it at just enough to say “we have RT”. This is in fact the current reality of RT gaming.

As a GPU manufacturer, I conclude “our push to RT is still years away”.

Only now are we potentially getting £400-£500 GPUs at “mainstream” prices that can technically run decent CP2077 levels of RT with upscaling. It will still be next gen before that cost of entry drops to £300-£400.

Rather than hurf blurf with word salad and self-promotion about your role in IT, deal with this problem statistically and put your money where your mouth is: break down each category and tell us how you are able to determine the nuance of whether each person thinks it is worth it or not, as against the literal meaning of the category.

Show me how it's not (quick tally below):
48% use it (think it's worth it some or all of the time)
16% don't use it (because they don't think it's worth it)
36% can't use it (either the perf hit is intolerable, so they can't use it even though they may still think RT is worth it, or the perf hit is tolerable but just not worth it over faster frames)
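
A quick tally of those percentages (48/16/36 are the numbers quoted in this exchange; the dictionary labels are just shorthand) shows how the headline flips depending on where the ambiguous 36% is placed:

```python
# Percentages as quoted in the posts above; labels are my shorthand.
poll = {"using_rt": 48, "explicitly_not_worth_it": 16, "perf_limited": 36}

# Reading A: only the explicit "not worth it" answers count against RT.
print("definitively not worth it:", poll["explicitly_not_worth_it"], "%")  # 16 %

# Reading B: everyone not using RT is lumped into "not worth it".
not_using = poll["explicitly_not_worth_it"] + poll["perf_limited"]
print("not using RT at all:", not_using, "%")  # 52 %
```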
 
Tell us you don’t understand statistics without telling us you don’t understand statistics.

You are spinning this from the incredibly biased and simplistic perspective of an RT fan. You need to look at it from the perspective of a bean counter who has to look at profits and decide where to invest resources, time and money. The majority of developers aren’t going to alienate the significant majority of their customer base by targeting super-duper levels of RT. They aim for the mainstream and call it a day.

I don’t need to “prove” anything - the proof already shows in how RT is implemented in the majority of modern games. It is an afterthought, a marketing gimmick, done at such mediocre levels because the tech just isn’t there yet at mainstream prices. Nvidia might have 90% of the discrete GPU market, but the percentage of that which can run RT at even half-decent levels is tiny. When we combine console and PC gamers, the portion that can run anything close to meaningful RT levels (let alone high ones) is tiny. That HUB survey supports this fact.

That is the reality, and no amount of your “spinning” and deliberately misinterpreting statistics will change it. Those statistics will only begin to favour RT when it reaches decent performance levels on mainstream sub-£400 GPUs and on consoles.

It’s getting there, but as many predicted when it first launched in 2018, it will take about a decade to become “mainstream”.
 
Enthusiasts on forums need to look at the Steam top 10 of dGPUs. The fastest card there is an RTX 4060 Ti, with the desktop RTX 3060 and the laptop RTX 4060 taking the top 2 spots.

Now look at the results of the poll in the OP and at the most popular games of the last couple of years. They are designed to scale down to weaker hardware to maximise sales, and these games make billions of USD a year.

It's not rocket science to realise that until mainstream cards and consoles can run it fine, no amount of optimisation will help with weak hardware. I mostly use mainstream hardware myself, and so do most of my mates - even if we want to run RT, the hardware compounds any issues on the software side.

But most mainstream hardware has barely been getting 20% improvements in that area each generation for the last three generations. You need at least 30% to 40% improvements each generation to see a doubling of performance every 4 to 5 years.
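
The arithmetic behind that claim checks out; a quick sketch (the generation counts of 2 and 2.5 are assumptions matching a 4-5 year window at roughly 2 years per generation):

```python
# Per-generation gain needed to double performance over n generations:
# 2 ** (1 / n) - 1 (compound growth).
for gens, window in [(2, "4 years"), (2.5, "5 years")]:
    gain = 2 ** (1 / gens) - 1
    print(f"double in {gens} generations (~{window}): ~{gain:.0%} per generation")
# ~41% over 2 generations, ~32% over 2.5 - versus the ~20% that mainstream
# cards have actually been getting, per the post above.
```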
 
Hurf Blurf

Why are you unable to discuss the numbers? As a self-proclaimed stats professional, you can surely dig into the numbers, discuss them at face value, and draw correlations as to whether users think it's worth it or not. But two posts in and you are reluctant. How you can fling "you don't know" and then refuse to show how you do know is par for the course for your posting history.

Enthusiasts on forums need to look at the Steam top 10 of dGPUs. The fastest card there is an RTX 4060 Ti, with the desktop RTX 3060 and the laptop RTX 4060 taking the top 2 spots.

Now look at the results of the poll in the OP and the most common games available. It's not rocket science to realise that until mainstream cards and consoles can run it fine, no amount of optimisation will help with weak hardware.

The Steam charts are irrelevant in combination with that survey. And mainstream cards can run it - "fine" is subjective and depends on other hardware and settings.
 
I have discussed them at face value - you even quoted my post where I outlined a brief explanation of how the stats would be interpreted by game developers. The fact you can't see that is not my problem. I tried telling you why your analysis of the survey was simplistic. I then went into some detail as to why you were wrong, and even used a deliberately ridiculous analogy (all people who drink are alcoholics) to demonstrate the fallacy.
 
There is no need for statistics.
How many sub-$300 GPUs can run RT without the monitor becoming a blurry or pixelated mess due to the excessive upscaling needed?

Answer that and you will get the realistic state of RT adoption.
 
RT was never going to run on low-end hardware - the absolute minimum is a PS5-class GPU, and in today's money, what was £300 then is not £300 now.
 
There is no need for statistics.
How many sub-$300 GPUs can run RT without the monitor becoming a blurry or pixelated mess due to the excessive upscaling needed?

Answer that and you will get the realistic state of RT adoption.

Well, that is exactly the point, but I was showing how the HUB survey results support this conclusion.

In very simple terms, game developers see that 85% of potential PC game customers cannot run RT at significant levels. Add the significant majority of gamers on last-gen (let alone current-gen) consoles, and the return on investment for high levels of RT becomes minuscule.

As a result, developers currently see higher levels of RT as not worth the investment. So we end up with the current reality of where RT is.
 