What do gamers actually think about Ray-Tracing?

WoW was, and still is to my knowledge, regarded as one of the worst games for RT implementation. I only recall shadows being implemented, but maybe more was added later on? I never played it, so I can't comment on it.

They didn't add more, but it's really horrible. It's the definition of a "bolted on top" RT effect, where none of the visible in-scene light sources match what it actually calculates, so the shadows make no sense at all. This also happens with the raster shadows though, so it's not an RT issue as such; RT just reveals it even better.

Been using the mod myself for Palworld and it's working just fine. I haven't noticed any issues, and it has in fact got rid of the artifacts associated with SSR, i.e. the haloing around objects when they're in front of pools of water.

Not playing PW myself so I can't really say - I only commented on what I've seen in comments under the mod. Not surprising to hear about SSR sucking, though - I don't like that tech at all; it took away (as I've said a few times already) my good reflections in games. :P

Shadows are more defined and so on. Unfortunately I have to use TSR, since DLSS is not available in the Game Pass version.

I heard the GP version is badly outdated, like a few patches behind the one available elsewhere - because of certification lags. Like 2 different games, according to comments I've seen.

But it looks better than native TAA and no RT (without TAA the grass has awful aliasing/jaggies), and FPS is about 55-80 depending on location and time of day. The game itself is incredibly buggy though, so again, even if there are problems for other people, it's not just an RT-specific problem.

That again matches what I read about the GP version being so outdated and therefore buggy etc. Then again, TAA is another typical culprit of bad IQ in games. :/ I'm happy to have recently discovered that I can convert all the games I currently play with DLSS to DLAA using just the NVIDIA drivers and profile manager. Works flawlessly so far. :)

In your previous posts you have been insinuating that games with fancy graphics don't do well, and then used Palworld as an example of this. I merely pointed out that graphics have very little to do with how well a game will do, and the two should not be compared, as they are different things and are also worked on by different teams/developers.

As I explained, sadly, in many cases fancy graphics mean the game is crap, as cost cutting means they focus on one thing and not the other. I blame publishers, cost cutting, rushing games etc. for that - not RT as such. It's more of an observation than blaming RT, and I've been saying that for quite a while now (not just in this and the other topic). The effect is, though, that a lot of gamers blame RT for that too - "gimmick added, so we don't get a good game" etc.

It actually depends on the game, I find, but yes, generally 60 fps is the rule before enabling FG. Even though Ark and CP2077 base FPS is below 60 on my end, they still look, feel and play far better with FG than without it. Are there some artifacts and an increase in latency? Absolutely yes. But is the overall experience better in terms of actually being playable and having better motion clarity and fluidity? Absolutely it is.

I responded to this in the other topic (about latency). :)
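To put rough numbers on the trade-off described in the quote above (smoother motion vs. added latency), here's a back-of-envelope sketch only. It assumes interpolation-based frame generation that has to hold back each rendered frame until the next one exists (which is how DLSS 3-style FG is commonly described); the 1 ms generation cost is a made-up placeholder, not a measurement.

```python
# Back-of-envelope only: interpolation-based frame generation, assuming the
# presented rendered frame must wait for the *next* rendered frame so the
# in-between frame can be generated. Numbers are illustrative, not measured.

def fg_estimate(base_fps: float, fg_cost_ms: float = 1.0):
    render_interval_ms = 1000.0 / base_fps
    output_fps = 2 * base_fps                            # one generated frame per rendered frame
    added_latency_ms = render_interval_ms + fg_cost_ms   # rough upper bound on extra delay
    return output_fps, added_latency_ms

for fps in (40, 60, 80):
    out_fps, extra = fg_estimate(fps)
    print(f"base {fps:>2} fps -> ~{out_fps} fps shown, ~{extra:.1f} ms extra latency")
```

The gist: the lower the base frame rate, the bigger the latency penalty in absolute terms, which is roughly why "60 fps before enabling FG" gets quoted as the rule of thumb.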

As we also discussed before, it's wrong to assume that the only way to increase RT perf is through better hardware. As shown before, there are ways to improve performance with better optimisation and by coming up with ways to cut corners. Just read the documentation from NVIDIA, AMD, Intel and Unreal Engine; they have several documents and tools that provide guidance on where and what to do in order to get better performance.

Pretty much all of them are just cheating and simplifications, often lowering IQ and using AI. Good enough for gaming, but they would never have been used in professional work, for example (except as a quick draft), because it's not the proper, full RT. By full I mean with all the effects implemented fully.

What we get in games so far (aside from the few with full PT) is more of an evolution of raster, which in itself - this is often missed in conversations like this - IS based on RT as well, just simplified to the absolute bare minimum and then with loads of shortcuts added on top. I've seen a few talks by 3D engine devs (including Carmack himself) discussing RT and raster and describing, with many examples, why raster (its lighting, shadows etc.) is heavily based on the original RT concept. These two technologies aren't that different, after all. What we're seeing now is the removal of raster's simplifications, letting it spread its wings again into more proper RT.

Anyway, to get back to where it should be, only faster hardware will help; all the other things are temporary crutches which will be forgotten the moment hardware becomes fast enough again (even if it's some new fancy holo tech with real AI of the future :) ).
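As a toy illustration of the point above - that RT shadows and raster shadow maps are answering the same underlying visibility question - here is a minimal shadow-ray test, the kind of query a shadow map pre-bakes an approximate answer to. The scene, light and single spherical occluder below are invented purely for the example.

```python
# Minimal shadow-ray sketch: "can this point see the light?" - the same
# visibility query a raster shadow map approximates with a pre-rendered
# depth buffer. Scene values are invented for illustration only.
from math import sqrt

def in_shadow(point, light, sphere_center, sphere_radius):
    """Cast a ray from the shaded point towards the light and check whether
    a single spherical occluder blocks it before the ray reaches the light."""
    d = [light[i] - point[i] for i in range(3)]          # unnormalised: t=1 is at the light
    o = [point[i] - sphere_center[i] for i in range(3)]
    a = sum(di * di for di in d)
    b = 2 * sum(oi * di for oi, di in zip(o, d))
    c = sum(oi * oi for oi in o) - sphere_radius ** 2
    disc = b * b - 4 * a * c
    if disc < 0:
        return False                                     # ray misses the occluder entirely
    t = (-b - sqrt(disc)) / (2 * a)
    return 1e-4 < t < 1.0                                # hit lies between the point and the light

# Occluder floating between the point and the light -> in shadow.
print(in_shadow((0, 0, 0), (0, 10, 0), (0, 5, 0), 1.0))  # True
# Occluder off to the side -> lit.
print(in_shadow((0, 0, 0), (0, 10, 0), (6, 5, 0), 1.0))  # False
```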

Raster had how many years for devs to learn to get the best from it, either by optimising or cutting corners? We're still arguably at the tip of the iceberg for RT and what can be done.

You can't keep supporting outdated tech... Heck, even a PS5 and Xbox Series X are considerably better than a 1060.
Agreed that the 1060 needs a kick, but upgrade to what? Everything that gives as much performance in modern games as the 1060 gave back when it was new is WAY more expensive. I reckon this is where people go for consoles or end up gaming on mobile instead. The number of 1060s is dropping, but that doesn't seem to be reflected in the number of new GPUs growing that much.
 
I don't agree with that. I've played games on release that dropped my jaw in the past - hell, even 3DMark 2001 did with the shaders scene; in my mind back then it literally was "best graphics ever, I can't even imagine it getting any better!" - and yet, every year after, it kept getting better. The point is, graphics quality is a subjective thing for gamers, and a lot of the time they simply don't need anything better, hence RT might seem like a gimmick (till it's just there and not even noticeable anymore), as they really don't need it and often can't run it properly. When one treats games as pure entertainment and not the main thing in life, it's largely irrelevant - all they want is a cheap PC and some fun. Enthusiasts expect more, but they're a tiny minority of the market. Personally I love technology and graphics, but as long as it's an expensive enthusiast-only thing, it might as well be a gimmick for the average gamer with an xx60 card (especially if it doesn't even support RTX).

Graphics quality is not subjective, it's objective and measurable. It's how DF stay in business. Your appreciation of graphics is what's subjective.

It's just as well we have enthusiasts that stump up money for new hardware, otherwise the market would stagnate. People will be griping about some other expensive advancement in future, while their games are rendered entirely with RT. Just like no one moans about AA or shadows any more.
 
I read it more like he was making a joke comparing the prices of these two items. AFAIK there is no good/versatile electric car that can be had for £10k new, and there are no GPUs that can run RT properly for £300 new.
Ah, if you look at it that way, it makes perfect sense. In both cases it should be way cheaper, and with cars it eventually will be (new battery tech being tested might be a big improvement there, as lithium is just very expensive and relatively rare). GPUs, on the other hand... well, so far NVIDIA seems hell-bent on the "too cheap!" line, even though the market pulls the other way (hence the Super cards dropping in price a bit).
 
Graphics quality is not subjective, it's objective and measurable. It's how DF stay in business. Your appreciation of graphics is what's subjective.

Which, in effect, means that how gamers perceive graphics is subjective, isn't it (nobody aside from reviewers sits by the screen with a magnifying glass comparing quality etc.)? Not the appreciation but the actual perception. For example, my younger brother is happy to play many games on minimum details, and each time I ask him why he doesn't increase them, he says he likes it this way - even when FPS isn't an issue. To him it looks good; he perceives it as good, even though I see it the opposite way.

It's just as well we have enthusiasts that stump up money for new hardware, otherwise the market would stagnate. People will be griping about some other expensive advancement in future, while their games are rendered entirely with RT. Just like no one moans about AA or shadows any more.
Enthusiasts do not really count that much anymore; it's a very different market compared to the one where AA slowdowns mattered, as is easy to see in the Steam hardware survey, where mid-range and higher GPUs are present in minuscule numbers. That is not where the money is. Mainstream usually dictates progress; NVIDIA (and AMD too) just seem to have forgotten that. And these days gamers do not dictate anything anyway - the AI/Pro/Enterprise markets do, for NVIDIA. We get whatever scraps fall from that high table, it seems. Also, I dare say AA is much more important in games than RT, especially as it's very easy to see on 1080p screens (still the huge majority of the gaming market), while RT in most games (the way it's done) is not, as the survey from the start of this topic reveals. Make the same survey about AA and the numbers would be way different, I bet.
 
Yeah, it's missing a few features compared to the version on Steam because Microsoft's certification sign-off is holding it back.

I think it's fair to say we have had a lot of **** games lately regardless of their visuals. Last year was actually quite a good year for me and gaming, though sadly launch day/month issues ruined a lot of them.

The DF video on Hitman was very interesting to watch, as he discovered that not only was ray tracing being used for the reflections but the raster pass was also still being applied, and thus the performance hit was massive. No idea if that ever got resolved, but it's another showcase of devs just doing a bad job rather than the tech itself being to blame.
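Purely as a hypothetical sketch of the "paying for both" pattern described above - not Hitman's actual code; the passes, structure and millisecond costs are invented - the waste comes from the old screen-space pass still running even though RT replaces its output:

```python
# Hypothetical illustration of the double-cost bug class described above.
# Costs are made-up numbers, not measurements from any real game.
SSR_COST_MS = 1.5            # screen-space reflections pass
RT_REFLECTION_COST_MS = 4.0  # ray-traced reflections pass

def reflection_cost(rt_enabled: bool, raster_pass_left_on: bool) -> float:
    cost = RT_REFLECTION_COST_MS if rt_enabled else 0.0
    # The bug: the raster/SSR pass keeps running even though RT overwrites its result.
    if not rt_enabled or raster_pass_left_on:
        cost += SSR_COST_MS
    return cost

print(reflection_cost(rt_enabled=True, raster_pass_left_on=True))   # 5.5 ms - both paid
print(reflection_cost(rt_enabled=True, raster_pass_left_on=False))  # 4.0 ms - RT only
```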

Sadly the days of getting GPUs for the prices we used to pay are gone; inflation and so on. I would say a 3060 12GB is a decent buy if you can get one for <£300. I can't imagine anyone pairing such a GPU with a 4K display - more likely 1080p/1440p. Ideally, going forward I would say a 4070 Super/7800 XT at a minimum for 1440p now.
 
Nope, remember RT is the future… the present is irrelevant. ;)
Almost no games that actually need one are going to be released during a 4090's lifetime.

It might even end up beating the 980 Ti in how long the card stays viable.
 
I quite like it now, but I do prefer higher FPS.
What annoys me, though, is that with some games (I'm looking at you, Cyberpunk) the RT is simply awesome but the basics are lost, e.g. a lot of the textures are utter crap (which can be solved by mods, but you really shouldn't have to resort to modding) and texture pop-in is terrible.

I just hope it doesn't continue in that fashion of being a distraction from, or a crutch for, basic IQ.
 
I think it's fair to say we have had a lot of **** games lately regardless of their visuals. Last year was actually quite a good year for me and gaming, though sadly launch day/month issues ruined a lot of them.

Sadly, that is correct. Everything feels rushed, unfinished and buggy, and just... not worth the ever-increasing prices of games.

Sadly the days of getting GPUs for the prices we used to pay are gone; inflation and so on. I would say a 3060 12GB is a decent buy if you can get one for <£300. I can't imagine anyone pairing such a GPU with a 4K display - more likely 1080p/1440p. Ideally, going forward I would say a 4070 Super/7800 XT at a minimum for 1440p now.
I don't mind inflation - that's a normal thing in the market and I had already accounted for it. But NVIDIA shifting everything one pricing shelf up is just not OK and doesn't help the adoption of new tech. Sadly, because gaming is a small minority of their income, they don't seem to care about this anymore. But we'll see for sure with the 5000 series what happens with tech and pricing. Agreed performance-wise; the 4070S also seems to be the best price-perf ratio this gen. Rumours say AMD will focus on mainstream gamers now, to bring them more perf for the buck, as they realised overcharging people isn't really that profitable long-term. But that's just a rumour; we'll see.
 
Sadly the days of getting GPUs for the prices we used to pay are gone; inflation and so on.

We need a good old-fashioned price war. Remember when new GPUs were launched every year? With Nvidia and ATI battling it out with mainstream GPUs at reasonable prices? It's hoping beyond hope but just maybe Intel will release Battlemage this year and Celestial next year.
 
We need a good old-fashioned price war. Remember when new GPUs were launched every year? With Nvidia and ATI battling it out with mainstream GPUs at reasonable prices? It's hoping beyond hope but just maybe Intel will release Battlemage this year and Celestial next year.
That would be ideal, but so far we've had a chase after the highest prices possible instead of a downward push. NVIDIA is currently a de facto monopoly in the PC gaming market, so we're starting to see the results of that, it seems. I can't see Intel making any dent in that either. But then at least AMD and Intel could compete against each other, ignoring NVIDIA.
 
That would be ideal, but so far we've had a chase after the highest prices possible instead of a downward push.

Sadly true.

NVIDIA is currently a de facto monopoly in the PC gaming market, so we're starting to see the results of that, it seems.

Also sadly true.

I can't see Intel making any dent in that either.

We can hope. AMD did it with Ryzen; let's hope Intel can do it with Arc.
 
No amount of cheerleading will change perception, since RT in its current form increases IQ at a cost, and that cost outweighs the demand.

Until the GPU price-setter reduces the buy-in cost and/or the performance penalty, it'll remain as it is - mostly bolted on as an afterthought.

This is what NV want by design because it helps sell their high end.
 
28 mph and 46 miles of range - that's a Qualcomm Adreno :cry:
With a 6000€ purchase price, less than 250€ for insurance, and maybe 50€ per year of electricity, you can call it a UNISOC and I still wouldn't care.
What sucks to me is that it's 2024 and I still cannot double my RX 590 performance for the same money...
 
This is what NV want by design because it helps sell their high end.
They've always relied on costly IQ to punt the high end.

Before RT there was PhysX, then GameWorks; they were both heavy on the FPS, with the latter also pushed as making life easier for devs.

The reality is they've all been mostly bankrolled additions to games.
 
This is what NV want by design because it helps sell their high end.
Don't underestimate NVIDIA's buying power though - they're richer than ever now and can afford to throw money at devs just to push their message across even more than before. And soon after, GPU prices increase again because they're "too cheap", as the CEO claimed a few times when talking about the 4000 series (including the 4090).
 
We can hope. AMD did it with Ryzen; let's hope Intel can do it with Arc.
I'd say Intel and AMD can have fun together as competition, but so far NVIDIA is just too far ahead - not even in speed and features (both are catching up quickly) but in brand recognition and production capacity. Intel and AMD would both have to invest a lot more in advertising, but also actually be able to produce a lot more graphics cards than they currently can. If everyone instantly switched to AMD and Intel graphics cards, both would likely sell out in the blink of an eye and then nothing would be in stock for months, as is currently the case.
 