
What do gamers actually think about Ray-Tracing?

Quite an anti-DLSS sentiment here… even if you think DLSS makes the image look worse, the quality issues seem quite minor compared to the improvements RT brings to the table.

So, yes I always turn it on, and always turn on DLSS.
Not at all. I'll use DLSS if required; upscaling is fantastic, but even with it, ray tracing is punishing on non-top-end hardware.
 
It does look good, but if it's a choice of ray tracing at 60fps vs no ray tracing at 120fps, I'll take the extra frames in most game genres.
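The trade-off above is easier to see in per-frame time budgets. A minimal sketch, using the 60/120 fps figures from the post as given:

```python
# Convert the 60 fps vs 120 fps choice into per-frame time budgets.
def frame_time_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

rt_on = frame_time_ms(60)         # ~16.7 ms per frame with ray tracing on
rt_off = frame_time_ms(120)       # ~8.3 ms per frame without it
extra_per_frame = rt_on - rt_off  # ~8.3 ms extra per frame with RT enabled

print(f"RT on: {rt_on:.1f} ms, RT off: {rt_off:.1f} ms, delta: {extra_per_frame:.1f} ms")
```

Halving the frame rate doubles the per-frame time, so every frame arrives roughly 8 ms later; that is the latency cost being weighed against the visual gain.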
 
Nvidia pushes a very intensive graphics feature, then implements one of the worst upsells of the last decade (if not the worst) with the RTX 4000 series. Then AMD joins in for good measure. At least with Fermi, Nvidia pushed tessellation, but the GTX 460/GTX 560 series were decent mainstream cards.

Trash like the RTX 4060 Ti is not a good way to bring RT to the masses. It's not even a good way to bring extra rasterised performance to the masses.

If this were the Nvidia of over a decade ago, we would have had the RTX 4070 as an RTX 4060 Ti or even an RTX 4060. Not any more, so most of us plebs are stuck with a rubbish experience unless we throw money at it. I don't care enough to throw more money at RT currently.

It might change when we get decent cards at the mainstream level at an actual mainstream price, not the fantasy "mainstream" prices their accountants think they are at.

Don't disagree, but we don't just have Nvidia to thank for this; AMD also haven't made it a priority to push their RT to compete with Nvidia. As usual, if we had competition, things would be in a better state. Nvidia are so far ahead of AMD in the RT and feature-set department that they can price substantially higher, and AMD unfortunately think they can do the same, coming in maybe 10-15% cheaper for a lesser package overall. Thankfully Intel are doing pretty well with their RT performance and solutions, which I hope will mean AMD taking note so as to avoid being overtaken. But alas, AMD's focus is the console market (which is probably safer/better from their POV).

Again, this is where Nvidia have done incredibly well by making people think RT is an Nvidia thing...

Ray tracing will be good, provided that developers use it correctly and the hardware required to run it well isn't prohibitively expensive. At this stage, neither of these things are true. If it's an option to turn on, I will turn it on, but if it is impacting performance as much as it currently does, it's the first thing I turn off.

I said a couple of years ago that for me, it's a few generations away, and I still think that it is the case. Diminishing returns on GPUs are also extending that time frame.

This is basically it for me.

We have plenty of games where RT is used and runs well on entry-level hardware, even on consoles, at 60 fps (e.g. Metro Exodus Enhanced Edition). But yes, the primary reason is that devs need to do it well. This will only happen more as raster gets phased out and devs become accustomed to RT workflows and implementations, which will naturally happen as more of the market moves to RT-capable hardware and raster-only hardware is no longer supported.
 
This thread triggered déjà vu of discussions from the era of add-in PhysX cards.

A handful of games used hardware-accelerated physics, and it did look nice, but it was essentially a gimmick.

RT is not quite the same, but it is still only used full-time in a handful of games and is not available to 90% of gamers.
 
The two main hurdles to RT advancement are the weak console hardware and devs still supporting raster. I think the biggest step in RT progress will come with the next-gen consoles, which will have better RT hardware support and will probably use hardware RT, while current-gen consoles fall back to more software-based RT. IIRC, on Steam, isn't it something like 70-80% of hardware that supports RT now?

Consoles are meant to be silent entertainment appliances that cost as little as possible, to reach as wide a market as possible. They are surprisingly powerful hardware for what they are; they always are. Do NOT expect them to have powerful RT acceleration etc. anytime soon: that would increase cost and power use and generate noise, and all of that means they would simply not sell in a wide market, which is why it will just not happen. At least not until GPUs priced up to £200 and with low power use (below 200W) can run RT comfortably (at least with upscaling), which, again, is most likely not going to happen in the next decade.

The situation is similar with mobile, and mobile is the main gaming market out there, with consoles and PC together far behind. There is a reason the best-selling console, with the most games sold on it, is still the Switch and not XB/PS. The majority of gamers seem to care about the actual gameplay, not graphics (and definitely not RT), which is where most modern AAA games on PC fall short these days. For most people it is not a hobby, it's just entertainment, which should be accessible and cheap, and that is all they care about. RT simply doesn't fit that, yet.

Also, on the Steam stats: a 2060 GPU technically supports RT. Is it playable in new games? Hell no! Anything below a 3080 is pretty much too weak to run RT sensibly in new games. As HU said in the video, cards below that already have to run DLSS just for raster in many cases, so adding RT on top just kills performance, and DLSS doesn't help anymore at that point. RT is clearly aimed at hobbyists with deep wallets currently, which is far from mainstream.

I also think a large part of the hate/anti-RT sentiment is that Nvidia have played a rather clever game with all their RTX marketing, making people think RT is an Nvidia thing. Naturally, when Nvidia are first to something or leading the way, it gets hated on, or rather it's "not an important feature/advantage to have". But then, when/if AMD catch up or overtake, it's the next best thing, and vice versa when AMD are first and Nvidia are catching up; just look at the upscaling and frame-gen situation.

It gets hated because it's way too expensive to get into, and people are being "attacked" by NVIDIA's adverts about it all the time. They can't have it, so instead they hate it; it's just NVIDIA's marketing backfiring, as it's aimed at wealthy gamers, not the mainstream, so it just annoys mainstream people. That's all there is to it really, IMHO. AMD has exactly the same pricing issues but doesn't rub RT in every person's eyes all the time, yet, so gets less flak for it.

Ultimately I think people just need to accept that raster is on its way out. The number of games using RT (since it became a thing in the gaming scene) is only growing and growing, most game engines have RT built in now, UE5 (which a lot of games are using going forward) uses software RT with support for hardware RT, console games are using RT and some provide no option to disable it, and AMD and Intel are fully on board with it too, not to mention chipset makers.

It's really not on its way out and won't be for many, many years to come. A few AAA games introducing RT for the top few per cent of rich gamers doesn't make raster go away. You're falling for NVIDIA's marketing again. :) I'd say 95%+ of games coming out these days are still full raster, with no sign of that changing. Loud AAA releases are just a minuscule part of the whole gaming market. It will happen only when consoles and GPUs can run it well and cheaply, with low power use; currently it's simply a gimmick on such hardware. Of all the games I play these days, only The Riftbreaker and CP2077 have RT; the rest either have none or it's a small optional (and largely irrelevant) add-on, and most of these are new games from 2023. But I like to play games with good gameplay (as in, fun games), not live-service abominations or bad stuff like Starfield (which sadly describes most AAA games these days: looks over substance).
 
Raster graphics will always be required, I'm sorry to say. A ray tracer or path tracer does not create models, geometry and materials; it reacts to them. It's a final render path, so to speak. You will still need fast raster graphics cards to display the models, geometry and materials quickly so the ray tracing part can react to them.

I'll give another example: Toy Story 1, a relatively simple (not ray-traced, by the way) CGI film, took between 20 and 30 minutes to render a frame back when it was originally made. Using today's technology and ray tracing, they re-rendered the same frames and it still took 15 seconds per frame on hardware much more powerful than even a 4090. That is an amazing reduction, no doubt about it, but to get similar graphics into a game that needs a minimum of 30 FPS? Sorry, we're not there yet.
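As a rough sanity check on those numbers (treating the quoted render times as given, not independently verified):

```python
# Back-of-envelope arithmetic for the Toy Story render-time comparison.
# All figures come from the post above and are assumptions, not verified data.
ORIGINAL_SEC_PER_FRAME = 25 * 60   # mid-point of the quoted 20-30 minutes per frame
MODERN_SEC_PER_FRAME = 15          # quoted modern ray-traced re-render time
GAME_BUDGET_SEC = 1 / 30           # a 30 FPS game allows ~0.033 s per frame

speedup = ORIGINAL_SEC_PER_FRAME / MODERN_SEC_PER_FRAME   # how far we've come
shortfall = MODERN_SEC_PER_FRAME / GAME_BUDGET_SEC        # how far is left to 30 FPS

print(f"Speedup so far: {speedup:.0f}x; still {shortfall:.0f}x short of real time")
```

So even a hundred-fold improvement leaves a further factor of several hundred before film-quality ray tracing fits a real-time frame budget, which is the poster's point.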
 
Surely your nausea was caused by low fps and not ray tracing itself..?
I assume that is what he meant: RT on means low FPS and possibly DLSS FG with added latency. I've had moments where I had to stop playing CP2077 with PT and FG enabled because it caused the same for me after a while. And that's on a 4090; it's a fast action game (especially with a melee spec) and any added latency is easy to feel. I reckon if I ever play it again, I'd rather drop PT and just have better FPS and lower latency instead.
 
For me, the performance hit is such that raytracing is unsuitable for fast-paced games. But for more sedate games - Portal RTX, for instance - it makes a huge difference in visual quality.

When RT first came in with the 20 series I remember opining that it would take at least 3 generations to hit the mainstream. That would be the 40 series, but with the price hikes and numbering shenanigans we've seen, I'm putting it back a generation or two.
Bold of you to assume NVIDIA will make it cheaper next gen. ;)
 
What's your take on the chipset makers, Intel and AMD, supporting RT and working towards improving their versions? Thoughts on Spider-Man 2 and Avatar (Nvidia not involved at all) providing no option to disable RT?
Avatar sold very well it seems, but I put that down to its connection with the popular film, while not looking much worse. However, looking at review scores from gamers, it's a very average game. Which, again, is because of devs focusing on looks over gameplay. If not for the Avatar branding, I suspect it wouldn't have sold nearly as well as it did, RT or not.
 
That's like saying we're happy to stick with GTA SA/Vice City-level visuals, since you don't stop to think whether graphics look good or realistic... Like any advancement in graphics, RT or not, it's all about providing extra immersion or improving the overall visuals to enjoy as you play the game in the moment, e.g.

- driving through Night City when it's raining is quite an awesome, immersive experience regardless of RT, but with it on, it's taken to a whole new level that simply wouldn't be possible even with the very best raster implementation (and CP2077's raster methods are actually very well done too)

For me, one of the main perks of RT is simply not having the immersion-breaking artifacts of raster, where reflections disappear, weird haloing appears around objects in front of water surfaces, etc. I somewhat compare it to 144Hz vs 60Hz: in some cases it's not immediately noticeable, but once you've become used to it and go back to the old way, that's when you notice something doesn't look/feel right:
That's all well and good, but I shall point again at the fact that the best-selling console, with the most games sold, is still the Switch and not anything better. The majority of gamers seem to care very little about graphics, because it's entertainment. Which means cheap, accessible, fun. That's it. Of all the games I've ever played, I remember the good ones only by their gameplay, story and fun, never by their graphics (and some looked really awesome when they came out). Even with CP2077 with PT, whenever I go back to some event in the game in my mind, it's always story-related and never graphics-related. Graphics are just fluff: nice to have, but really not important if the game is good. Sadly, as I say often, most new AAA games have fine graphics but really aren't very fun for me to play, and I don't even remember them soon after I ditch them.
 
Quite an anti-DLSS sentiment here… even if you think DLSS makes the image look worse, the quality issues seem quite minor compared to the improvements RT brings to the table.

So, yes I always turn it on, and always turn on DLSS.
With modern games, most cheaper GPUs (the most popular ones) already have to turn on DLSS just to run raster. They have no DLSS headroom left to enable RT on top of that. It's not hate for DLSS; it's just a fact that DLSS no longer helps with RT in such cases.
 
I'm not completing any survey where I have to identify myself (to Google).

In the games I play, I don't see any difference with RT, so I usually leave it off. I am sure if I spent time examining the screen I might see some difference, but overall I am not impressed with it.

DLSS is different. I often turn that on if I can't achieve the performance I want.
 
Bold of you to assume NVIDIA will make it cheaper next gen. ;)

Yes, well, the RTX 4060 should really be called the RTX 4050, and they're charging xx60 Ti prices for it. For RT to hit the mainstream, it needs to be decently playable on an xx50-class card at xx50-class pricing. Right now that should be under £200. Will that come with the 50 series? Or will we have to wait for the 60 series to push second-hand prices down? Much, I think, depends on Intel's Battlemage. Leaks (or hype) suggest they are targeting performance exceeding the RTX 4070 for the flagship B770.
 
Consoles are meant to be silent entertainment appliances that cost as little as possible, to reach as wide a market as possible. They are surprisingly powerful hardware for what they are; they always are. Do NOT expect them to have powerful RT acceleration etc. anytime soon: that would increase cost and power use and generate noise, and all of that means they would simply not sell in a wide market, which is why it will just not happen. At least not until GPUs priced up to £200 and with low power use (below 200W) can run RT comfortably (at least with upscaling), which, again, is most likely not going to happen in the next decade.

The situation is similar with mobile, and mobile is the main gaming market out there, with consoles and PC together far behind. There is a reason the best-selling console, with the most games sold on it, is still the Switch and not XB/PS. The majority of gamers seem to care about the actual gameplay, not graphics (and definitely not RT), which is where most modern AAA games on PC fall short these days. For most people it is not a hobby, it's just entertainment, which should be accessible and cheap, and that is all they care about. RT simply doesn't fit that, yet.

Also, on the Steam stats: a 2060 GPU technically supports RT. Is it playable in new games? Hell no! Anything below a 3080 is pretty much too weak to run RT sensibly in new games. As HU said in the video, cards below that already have to run DLSS just for raster in many cases, so adding RT on top just kills performance, and DLSS doesn't help anymore at that point. RT is clearly aimed at hobbyists with deep wallets currently, which is far from mainstream.



It gets hated because it's way too expensive to get into, and people are being "attacked" by NVIDIA's adverts about it all the time. They can't have it, so instead they hate it; it's just NVIDIA's marketing backfiring, as it's aimed at wealthy gamers, not the mainstream, so it just annoys mainstream people. That's all there is to it really, IMHO. AMD has exactly the same pricing issues but doesn't rub RT in every person's eyes all the time, yet, so gets less flak for it.



It's really not on its way out and won't be for many, many years to come. A few AAA games introducing RT for the top few per cent of rich gamers doesn't make raster go away. You're falling for NVIDIA's marketing again. :) I'd say 95%+ of games coming out these days are still full raster, with no sign of that changing. Loud AAA releases are just a minuscule part of the whole gaming market. It will happen only when consoles and GPUs can run it well and cheaply, with low power use; currently it's simply a gimmick on such hardware. Of all the games I play these days, only The Riftbreaker and CP2077 have RT; the rest either have none or it's a small optional (and largely irrelevant) add-on, and most of these are new games from 2023. But I like to play games with good gameplay (as in, fun games), not live-service abominations or bad stuff like Starfield (which sadly describes most AAA games these days: looks over substance).

Of course hardware alone won't ever cut it, at least not until we get to the next stage of hardware, whatever form that may take; "Moore's law is dead" is true to an extent. It's all about finding more efficient methods while companies figure out the next big thing, which is partly why we have upscaling, frame gen and no doubt other new tech to increase FPS and handle workloads such as RT more efficiently. But alas, if console manufacturers stick with AMD, we probably won't see a real breakthrough here, given the PS5 and Xbox still have **** all games using FSR, let alone FSR 3/FG (although that wouldn't be good anyway where the base fps is 30, as is the case for a lot of games).

RT and other visual advancements should not be tied to a game's quality either; those are completely different things. You can have both, regardless of whether a game is great or not.

A 2060 etc.? Is it playable in poorly done RT games? Nope... How does it handle games where RT is done very well? Pretty damn well; see Metro EE again. Anyone who expects to play the likes of CP2077 and AW2 on a five-year-old mid-range GPU is delusional.

No one is holding a gun to people's heads to enable it, so if people are buying into it and not liking it, that's their own fault. And if people don't want RT, well, AMD to the rescue with their cheaper GPUs, right?

It's on the path to becoming a deprecated feature. It won't happen any time soon, but that's the ultimate ending for it. The fact that UE5 (which uses software RT) is becoming far more common, and other games/engines are going RT-only, shows what the fate of raster is.

Nvidia marketing? Of what? Again, why do you think RT is an Nvidia thing? What about all the industries getting on the RT bandwagon? What about non-sponsored titles getting the RT treatment?

95%? Did you even look at Nvidia's list? Or are you still factoring in games from years ago, when RT wasn't a thing or was just new? Nowadays it's easier to list the games which don't have RT. If you don't play games with RT, that's fine, but just because you aren't playing them doesn't mean there isn't a good chunk of RT games out there now.

Using terms like gimmick just further reinforces this point:

Not to be that guy... but the biggest issue is a massive lack of knowledge and awareness of what RT actually is, what it sets out to achieve, and who really benefits from it, as well as outright obliviousness to the advantages it offers over dated methods. The other problem is people who just look at something like BF5 or Tomb Raider, go "RT sucks!!!", and ignore every other game.
 
Avatar sold very well it seems, but I put that down to its connection with the popular film, while not looking much worse. However, looking at review scores from gamers, it's a very average game. Which, again, is because of devs focusing on looks over gameplay. If not for the Avatar branding, I suspect it wouldn't have sold nearly as well as it did, RT or not.

It was more in reference to him making out that RT is only in Nvidia's best interest:

Nvidia simply slapped a feature on, sold it at twice the price, and people thought it got better.

Not specifically game-wise; i.e. I used Avatar and Spider-Man as examples because Nvidia are not involved there, so it's not just "Nvidia slapping a feature on".
 
That's all well and good, but I shall point again at the fact that the best-selling console, with the most games sold, is still the Switch and not anything better. The majority of gamers seem to care very little about graphics, because it's entertainment. Which means cheap, accessible, fun. That's it. Of all the games I've ever played, I remember the good ones only by their gameplay, story and fun, never by their graphics (and some looked really awesome when they came out). Even with CP2077 with PT, whenever I go back to some event in the game in my mind, it's always story-related and never graphics-related. Graphics are just fluff: nice to have, but really not important if the game is good. Sadly, as I say often, most new AAA games have fine graphics but really aren't very fun for me to play, and I don't even remember them soon after I ditch them.

Again, RT and good gameplay are two completely different things. Visuals/graphics are there to aid the enjoyment, e.g. Palworld: a cartoony game, but one of the most fun games I've played recently. Yet with RT enabled in it, the visuals are enhanced and thus my enjoyment is improved further.

In the same way, I love the likes of Valheim and 7 Days to Die despite their graphics being pretty rubbish, but if they could get an RT enhancement (done well, to enhance the core art/visuals), it would probably increase the enjoyment.

It's the same with HDR too, btw. If a game doesn't support HDR, it's not a big problem, but having a true HDR experience vastly improves things.
 