
What do gamers actually think about Ray-Tracing?

Software Lumen is gash. Sure, it might mean more people can use it, and it's better than legacy lighting systems, but it isn't a patch on a proper PT implementation.
 
Of course if you compare it to the hardware one or to PT, it offers worse quality. There are no miracles; they can't beat thermodynamics, etc. I don't know why one would even expect it to be in the same range whilst using much less power and much-simplified methods to create the image. The question isn't whether it gives the same quality, but whether it's actually usable by most gamers and better than previous methods (e.g. full raster with SSR, etc.). It seems to me that it is, and as such it's still progress. Enthusiasts might not like it, but this is a mass-market solution, whereas PT and the like are niche and will stay niche till the mass market can run it without thinking - so likely for many years to come.
 
"Lumen uses Software Ray Tracing through Signed Distance Fields by default, but can achieve higher quality on supporting video cards when Hardware Ray Tracing is enabled." - from your own link, directly supporting what I just wrote. Most GPUs are of the xx60 class, as per Steam (and many other) stats. They either do not support hardware Lumen at all, or it's way too slow on them. Ergo, they can't use it, ergo it's not better for them. This whole RT vs no-RT discussion really goes down to - can majority of people use it? Answer is - no. They can't, their GPUs can't handle it or handle it poorly. And with ever-growing prices of GPUs, chances are they won't be able to use it for a while longer.

This.

You get a lot more light bounce by using Lumen for GI; it's a very different effect from standalone RT, because due to light-bounce limitations you're not actually getting anything like as much ambient colour bounce. Yes, you can manually adjust how many rays you bounce and how far to get better results; however, to get the same effect (admittedly to a higher standard) you can easily turn a 4090 into a slide projector.

Lumen cheats by simply reading surrounding colour maps and then, using a render-to-texture technique, projecting that where needed. It's still using light bounces to calculate that, but it can do it with far less work. The downside with Lumen GI is that if you have a lot of noisy detail, you get a lot of noise.

You can use it for reflections too, but it's not much better than screen-space reflections.
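To make that "far less work" point concrete, here's a toy C++ sketch - purely illustrative, not Lumen's actual surface cache, and all names made up. It bakes converged bounce lighting for a handful of patches once, then shades with a cheap lookup instead of re-tracing the bounces every time:

```cpp
// Toy illustration, NOT Lumen's real code: compare recursively gathered
// bounce lighting against a single lookup into values baked in advance.
#include <array>
#include <cstdio>

constexpr int kPatches = 4;
constexpr std::array<double, kPatches> kDirect{1.0, 0.2, 0.0, 0.5}; // direct light per patch
constexpr double kAlbedo = 0.5; // fraction of incoming light re-emitted

// The "honest" path: recursively gather light bounced off every other patch.
// Cost grows as 3^depth here - this is the work the cache avoids.
double gather(int patch, int depth) {
    if (depth == 0) return kDirect[patch];
    double bounced = 0.0;
    for (int other = 0; other < kPatches; ++other)
        if (other != patch) bounced += gather(other, depth - 1) / (kPatches - 1);
    return kDirect[patch] + kAlbedo * bounced;
}

int main() {
    // "Offline"/amortised step: bake near-converged values once.
    std::array<double, kPatches> cache{};
    for (int p = 0; p < kPatches; ++p) cache[p] = gather(p, 8);

    // "Run time": a cache read gives the same answer as the expensive
    // recursion, for the price of one array lookup.
    for (int p = 0; p < kPatches; ++p)
        std::printf("patch %d: traced=%.4f  cached=%.4f\n",
                    p, gather(p, 8), cache[p]);
    return 0;
}
```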

 
Honestly, I think sometimes people think they have to have RT because with it things look "proper". I think for some people it's ruined gaming, because they have fallen for the marketing so hard that they can't appreciate the graphics in a game unless it's running 'RTX Ti Super' graphics settings.

While ray tracing represents 'more' accurate lighting, it is still only a vague approximation, and more often than not it's used to enhance visual appeal - which is very different from visual quality. A lighting expert might look at Cyberpunk and laugh, saying light doesn't behave that way, not even close. RT is graphics embellishment, like any other.
 
All RT techniques use pre-calculated BRDF volumes - you don't need to bounce light rays unnecessarily if you have good BRDF coverage. It's a key Nvidia innovation; Lumen must be using the same as well.
 
True ^^^

I can't believe how old this is now, but this is what lighting can look like without RT when a top-drawer lighting artist puts their back into it.

Kind of a shame in a way, given they are now also starting to bring ray tracing into it. I hope they don't fire their expensive lighting artists in favour of a button monkey. I'm all for it - I've seen the changes it makes for this game and agree with a lot of it - but you still need good lighting artists.

 
I'll keep saying what I've been saying since the 20th century: give me denser voxels and a completely interactive world.
Gameplay matters more than looks. That's why Baldur's Gate 3 won the most awards and was waited for over 20 years, rather than Incoming - arguably the graphics king of the same year as the first Baldur's Gate.
 

There's no need for human input when computing BRDF volumes. These volumes are computed offline using traditional forward or backward ray propagation, and they can be interpreted as the sum total of all ray bounces at that point. Whenever a ray hits a BRDF volume, it can use the values embedded in the volume as a proxy for what would have resulted if the ray had been allowed to bounce infinitely at run time. Nvidia has worked extensively on the data structure, improving productivity and driving down game development costs.
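Taking the "sum total of all bounces baked into a volume" idea at face value, here's a toy sketch of what such a bake could look like - hypothetical names, not any real NVIDIA (or Lumen) data structure. It just uses the geometric series direct * (1 + a + a^2 + ...) = direct / (1 - a) as the "infinite bounce" value a run-time ray would read:

```cpp
// Toy sketch (hypothetical, not a shipping data structure): bake the
// converged "infinite bounce" radiance of each cell of a volume offline,
// then let a run-time ray read the baked value instead of bouncing further.
#include <cstdio>
#include <vector>

struct Cell {
    double direct;  // direct light arriving at this cell
    double albedo;  // fraction re-emitted per bounce (must be < 1)
    double baked;   // filled in by the offline bake
};

int main() {
    std::vector<Cell> volume = {{1.0, 0.5, 0}, {0.3, 0.8, 0}, {0.0, 0.4, 0}};

    // Offline bake: for a cell bouncing onto itself, the series
    // direct * (1 + a + a^2 + ...) converges to direct / (1 - a).
    for (Cell& c : volume)
        c.baked = c.direct / (1.0 - c.albedo);

    // Run time: a ray hitting cell i terminates immediately and uses the
    // baked value as a stand-in for the infinite bounce recursion.
    for (size_t i = 0; i < volume.size(); ++i)
        std::printf("cell %zu: direct=%.2f baked(infinite bounces)=%.2f\n",
                    i, volume[i].direct, volume[i].baked);
    return 0;
}
```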
 
Sure, but CD Projekt Red didn't just create a scene and then press a button - the lighting in Cyberpunk doesn't look as good as it does without someone designing it.
 
Up until recently I had no interest in ray tracing, but I'm currently enjoying its implementation in Metro Exodus Enhanced Edition.

Simple: it adds to the atmosphere of the game and doesn't kill performance.
Yeah, I've just started playing it and the RT is really well done. Got it set to extreme visuals and it looks really nice. Decent performance too.
 

Same - it ran like crap on my 2070S; now that I can run it maxed out with good frame rates, I am playing it. Self-report there on what I was saying earlier :D

Same with Cyberpunk - I'll wait until I have a GPU powerful enough that I can enjoy it with all its RTness. Haven't even bought it yet.
 
Most GPUs are of the xx60 class, as per Steam (and many other) stats.
Aren't Steam stats mostly from Chinese internet cafes etc. now?

I bet the generalised Steam hardware stats are very different from the Steam hardware stats of people who bought games like Dying Light 2, Cyberpunk, Stalker, etc.
 
But that's not how it works in reality - that's not how the human brain and eyes work. Unless you stop everything and stare at the thing carefully, you will never notice it as a human being; that's just not how our eyes and brains work. If you look at the scientific papers, in very simplified terms we only really perceive changes in contrast and movement well; we don't really perceive detail on most of what's in our view, only on the tiny fraction we're currently focused on. We also don't really see colours in the darkness, just contrast, with some detail (and much of it missing) - because of how our eyes are constructed. SOME very enthusiastic gamers are able to notice these things, having trained themselves to focus on them to near expert level - the average person would never notice. Like how I can see the slightest stuttering in games. Then again, I've seen blind tests of shadows in games like CP2077, max raster settings vs RT ones. Most people, even experienced gamers, weren't able to tell the difference, or which was which - because they look near identical in most cases.

What is really easy to notice, though, is crawling noise in darker areas, as it's moving and very visible - our brains perceive it well, sadly. Also lagging lights, where the light changes but the GI takes even a few seconds to fully catch up with it. Artefacts like that only appear with current RT and PT implementations in games - they're contrasty, they're moving, and (IMHO) they often make the image look worse than just turning RT/PT off. This is where mixing techniques does wonders. AI helps here, but most games don't use it, and even with it, it's not perfect. Standard denoising makes the image fuzzy, with reflections lacking detail, etc. None of that seems realistic to me. We've seen all these issues in CP2077 and many other games. We're simply not there yet with the tech and computational power; one way or another, you have to live with some compromises.
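The "lagging lights" part is easy to show with numbers. Temporal accumulation - a common building block of real-time denoisers - is basically an exponential moving average, and with a small blend factor it takes many frames to catch up after a light changes. A toy sketch (the blend factor is made up):

```cpp
// Toy sketch of temporal-accumulation lag: an exponential moving average
// (a common denoiser building block) chasing a light that switches on.
// The blend factor is made up; smaller = less noise but more lag.
#include <cstdio>

int main() {
    double history = 0.0;        // accumulated ("denoised") value
    const double blend = 0.05;   // weight given to each new frame's sample

    for (int frame = 0; frame <= 120; ++frame) {
        const double truth = (frame < 10) ? 0.0 : 1.0; // light turns on at frame 10
        history += blend * (truth - history);          // EMA update
        if (frame % 30 == 0 || frame == 10)
            std::printf("frame %3d: true=%.1f accumulated=%.3f\n",
                        frame, truth, history);
    }
    // At 60 fps this takes on the order of a second to get close to the
    // new value - the visible "GI catches up a moment later" effect.
    return 0;
}
```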


Again, if you go to a restaurant, do you ask the chef to show you exactly how everything is done, or do you just pay for the meal and expect it to be good?


This is not relevant to my argument at all.

Really? A random post on X with over 10k views and a whole 56 likes seems like evidence of more people not caring than caring, I would say. CDPR's response that there are no plans to do it got considerably more likes there. In the end, it's just X - hardly relevant to anything gaming-related.

And yet millions more people game on both of those than on consoles and PCs combined. And all the adverts about bringing RT to mobile don't seem to resonate with the general public at all. And why would they care how it's made or what tech is used to make such games? On a side note, there's a reason Hollywood constantly lies by claiming they don't use any CGI (ergo, RT/PT) for this or that effect in films - it seems to be more of a stigma than a pro.

I don't even understand how you connected those two things. There are almost always refreshes of products (not just consoles) after a while, to sell more units by advertising them as new and better (even if they're not really). It's not about graphics at all; it's about money. Even when the new version actually removes things, like the ability to use physical games.

The number of modern games that would get a 10 from me is minuscule, regardless of graphics - the huge majority are lacking in so many places that they could have PT from the future and I'd still treat them as tech demos at best. Gameplay first, art (both visual and sound) second, then graphics far behind, for me - nice to have, but definitely not what makes a game good.

If the product itself is bad, graphics improve it from a 2 to a 3, but it's still a bad product. That's why it comes last on my list - make a good game first, with a good story and gameplay. Then have good art, and then you can add graphical fluff on top, like paint on an already well-built house with good foundations. That said, just like paint, graphics are subjective - what looks good to one person, another won't like. I've seen in person people who consider games with fancy reflections and shadows etc. unplayable, because too much happens on the screen; such people prefer to play on low details. This is common in online games like War Thunder and shooters etc. - cut down the graphical fluff to gain a competitive advantage.

That's exactly when you actually engage creativity - when you encounter limits. That's when the best art is created, too, as has always been the case throughout the history of humanity. When things get easy, people become lazy.

Ergo, you don't care about the art; you just want to commission something exactly to your specification. You will get a product, but you won't get art with creativity that way. Also, that's not how the gaming industry works, unless you're filthy rich. :)

I'm aware of how the eyes work, even more so since one of my own is weaker. I stand by my point: it all comes together to form the final image, and as your sight moves across the screen and you move your mouse around, focus can fall on less obvious detail. Plus, at times you do explore, not just run and gun.

If it's a quick bite, fast-food style, I don't care much beyond the food being decent - aka phone gaming - but for something more, I care about the location, how it looks, the people serving, etc.

The point was that if graphics didn't matter, devs wouldn't waste time on graphics beyond what the Switch or phones do. They do matter, and even console people argue amongst themselves over which one does it better.

10 was an example; it could be anything. Point being, graphics, just like the other elements, elevate the experience for me. I figure they do for a lot of the "I don't care about graphics" bunch too; it's just that that only holds up to a certain point, which usually means whatever runs decently on the hardware they have.

You can't have a Dacia run like a BMW, Audi or Ferrari.

You also don't print your photos on regular cheap recycled paper; you use photographic paper for a reason.

Anyway, bottom line: if graphics didn't matter, people wouldn't lose time here or on graphics card forums. They'd just buy the cheapest card, set everything to low at the lowest resolution available, and game... which doesn't happen.

Aren't Steam stats mostly from Chinese internet cafes etc. now?

I bet the generalised Steam hardware stats are very different from the Steam hardware stats of people who bought games like Dying Light 2, Cyberpunk, Stalker, etc.

It doesn't matter either way. Stalker 2 is a killer for 8GB cards and yet... here it is. It didn't ask Steam and didn't care. Good on it - more power to that team!
But it's a killer only if you max out the settings. Since graphics don't matter, people should just lower settings to the lowest and enjoy. It shouldn't go over 8GB from what I've played - at 1080p, at least.
 
Aren't Steam stats mostly from Chinese internet cafes etc. now?

No. I get the survey almost monthly now, and I'm not in a Chinese internet cafe. :)

I bet the generalised Steam hardware stats are very different from the Steam hardware stats of people who bought games like Dying Light 2, Cyberpunk, Stalker, etc.

And you would be wrong - anecdotal evidence, granted, but a few people in my family have a 1060, a 1660 and a 3060 respectively, and all of them bought CP2077 and DL2. It would seem that in my whole family I'm the only one with a high-end GPU. Same with my co-workers (all IT professionals). Most people don't spend that much on gaming. High-end gaming GPUs, as per various sales stats in addition to Steam, are a tiny minority of all GPUs sold - that's not news, it's always been like that (even AMD recently claimed it's only about 1%, if I recall correctly), but it got worse once prices went up considerably. Hell, a lot of people seem to be holding on to their 1080 Tis, as everything still works okay-ish (the VRAM helps). :)
 
Over half of RTX 4060 cards on Steam are in laptops - that makes it the second most popular dGPU after the desktop RTX 3060 - and almost half of dGPU sales are in laptops.

It appears quite a lot of PC gamers now use laptops.
 

Tried this on a few occasions. It just isn't the same. If I ever quit desktop gaming, at least on the bleeding edge, I will just use handhelds like the Steam Deck.
 
My wife loves to play on the laptop, though she always streams games to the 4K TV and uses a controller. She also loves the Switch, though. Wild horses couldn't drag her in front of a PC - just not her thing; the desk is for work, the sofa is for entertainment.
 
This whole RT vs no-RT discussion really comes down to: can the majority of people use it? The answer is no. They can't; their GPUs either can't handle it or handle it poorly. And with the ever-growing prices of GPUs, chances are they won't be able to use it for a while longer.
This, all day long - and my point still stands: Nvidia has been watering down the GPU stack almost gen on gen for around a decade, and some high-end users who blame AMD users for RT negativity (despite AMD having a tiny share of the market) are blind to the fact that the majority of Nvidia users turn RT off because it's not doable.

Case in point: an 8GB 4060 can't run Stalker 2 at 1440p + DLSS because it runs out of VRAM. Nvidia can't even run software RT on a $300 GPU!

8GB isn't enough. "But what do you expect, it's a 4060!" - well, no: the 3070/3070 Ti also have 8GB and can't run software RT either. And how much was the street price on the 70s?

Most users in here paid £800-900+ for a 70/Ti, as the FEs were almost never on the 5-minute countdown sales.

Good watch covering a whole range of GPUs here; earlier in the vid it shows the pitfalls of Lumen (remember, it saves devs time, folks) where raster would have been the better choice - shipping both options to cater for low-end GPUs, IMO.

HUB shows the 8GB problems too.

Unless Nvidia go 16GB minimum on the 60/70 and at least 20GB on the 80, RT is going to be a long, long way off for the mainstream. Good luck with the heavily Nvidia-funded titles, guys - maybe they'll bring out one or two more PT titles, and if you're lucky, some more to showcase the 5090 running RTX in all its glory at 4K@60fps. :thumbsup

VRAM does matter - more so for HWRT - and it's extremely cheap considering the fleecing Nvidia has been getting away with, despite VRAM always being the first thing to fall short. If Nvidia's VRAM truly was "plenty", it wouldn't get talked about at all!
 