What do gamers actually think about Ray-Tracing?

As for Ray Tracing, it's more difficult to pin down, but there are games where AMD's RT performs quite well and looks good too - Spider-Man Remastered and Watch Dogs: Legion, for example - and the RT performance on the 7800 XT is just as good as it is on the 4070.

Riftbreaker too - a perfect blend of RT and raster, imho, as that worked best for that specific game, and it runs very well on a wide range of GPUs. But they created their own engine for it.

I do think it's deliberate on Nvidia's part, but not necessarily to suffocate AMD; Nvidia just want you to keep upgrading and spend more on higher-end GPUs.
That's obviously true, and has been for many GPU generations now. I thought everyone knew it, as it's always been a big meme, but it seems some people like to ignore it.
 
None of this is really aimed at the average person buying a "gaming PC" in a shop, though. I've never seen any public-facing marketing for the latest RT games that talks about RT and its benefits to the average gamer; it's always in outlets and online spaces where gamers who know tech hang out. Joe Public doesn't care as long as they can play the game, which probably explains why the majority of average gamers are still on 1080p according to Steam.
This is exactly what I've been saying here for a while too. The number of enthusiasts who care and buy expensive hardware is minuscule. The question is: do they spend enough to fund this expensive industry now? With hardware, perhaps, but with AAA games, not really - it's been flop after flop for a while now. Something will have to give.
 
It's the non-tech average gamer who funds the flop games all too often: pre-order after pre-order, and then the Steam forums are awash with crying and moaning. Even with well-optimised games like Indiana Jones, just glancing at the Steam forums on launch day is enough to demonstrate this - people with potato PCs moaning that they can't run it on settings their systems just can't handle, and so on.

If it's not that, though, then it's the anti-??? brigade who come in crying about female characters not being attractive, or how dare the devs have a female lead, or whatever other nonsense they come out with.
 
It's the non-tech average gamer who funds the flop games all too often: pre-order after pre-order, and then the Steam forums are awash with crying and moaning.
True enough - pre-order habits need to die. It's not as if games run out on release day, so there's really no reason to do it most of the time. I only understand it when people do it for MMOs, just to have a few days' advantage before the rest come to play (but that's pure pay-to-win, which is all kinds of bad in itself). Then again, with Steam one can return a game within 2 hours of playing, or even later if it's really broken - Steam is making that easier all the time - so it makes me wonder how many of these pre-ordered games got quickly returned.

Even with well-optimised games like Indiana Jones, just glancing at the Steam forums on launch day is enough to demonstrate this - people with potato PCs moaning that they can't run it on settings their systems just can't handle, and so on.
The question then is: wouldn't that game have sold much better if it had been better adjusted to lower-end GPUs, which are the majority? As per the above, I wonder how many people returned it before even hitting 2 hours of playtime - the Steam played stats did not look good at all. Many people got it for "free" on Game Pass though (as in my case).

If it's not that, though, then it's the anti-??? brigade who come in crying about female characters not being attractive, or how dare the devs have a female lead, or whatever other nonsense they come out with.
That's a bad take - yes, such a "brigade" exists, but it's a tiny minority of loudmouths. The majority often complain about gameplay being sacrificed for either graphics or ideology - that's a very different thing, and it's mostly correct. The same happens in Hollywood and with TV series: bad design, bad writing, ideology being pushed on the viewer, and then we get gigantic flops from Disney and other companies. They need to be reminded that it's the customer who is always right and who lets companies evolve the industry, not the other way around. Voting with wallets really works, as witnessed often in 2024. Also, as history has shown, what really takes off and sells is not the best solution but the cheaper, "good enough" one.
 
The trash-tier mainstream hardware means devs can only optimise so far, given the lack of VRAM and tiny generational improvements. The RTX 4060 and RTX 3060 were 20% improvements over the previous cards. There is no indication the RTX 5060 will even match an RTX 4070; we will be lucky if the RTX 5060 Ti matches an RTX 4070. The same goes for AMD.
Meanwhile, they charge more and more each generation, so the cost per frame is either stagnant or getting worse, not better.
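
To put rough numbers on what I mean by cost per frame (completely made-up prices and frame rates below, just to illustrate the metric review channels use):

```python
# "Cost per frame" is simply price divided by average fps across a benchmark
# suite. The MSRPs and fps figures here are placeholders, not real data.
def cost_per_frame(msrp_usd: float, avg_fps: float) -> float:
    return msrp_usd / avg_fps

old_gen = cost_per_frame(329.0, 60.0)   # hypothetical xx60-class card
new_gen = cost_per_frame(399.0, 72.0)   # hypothetical successor, ~20% faster
print(f"old: ${old_gen:.2f}/frame, new: ${new_gen:.2f}/frame")
# If the price climbs as fast as the performance, $/frame barely moves.
```

When the price rises in step with the performance gain, the metric stagnates - which is exactly the complaint above.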

With the mainstream mass-market cards falling further and further behind the top, ultimately we can't just blame devs - their lives are made harder too. We also have to blame greedy companies like Nvidia and AMD, who are more worried about other markets now.
Part of it is physics itself - the cost of producing new chips grows exponentially, as they have to use ever more advanced (and expensive) materials in components, bigger chips, more layers in the PCB, etc. That often increases cost faster than performance. That doesn't mean these vendors aren't charging more for it all than they could and still be profitable. But this is why pushing new tech the market isn't ready for is NOT the solution IMHO - quite the opposite.

So, I am not sure what sort of watered-down RT experience people are experiencing, with the huge shrinkflation in the mainstream. People can barely run basic RT, let alone more intensive implementations.
Quite a large chunk of games could benefit from even just basic optimisations, it seems. But some things (like PT) can't really be sped up without sacrificing even more image quality/clarity, unless they go for more AI crutches. However, those have their own performance hit and might require new (even more expensive) hardware again, so they're not a real solution either.
 
(...) it seems RTX50 is going to make that front row with stuff like neural NPC interactions as part of DLSS
Seriously, DLSS = Deep Learning Super Sampling. What does that have to do with NPC interactions or any other AI crutches? They seem to be abusing the name now for stuff that has nothing to do with it. FG was already a stretch, but let's say it's just a helping crutch for DLSS. Anything more should be a different tech with a different name IMHO.

Personally, I want to see Frame Gen lose the performance overhead it has at resolutions above 1440p. There is approximately a 30% hit to frametime performance with a 4K output (upscaled or not) versus the same FG at 1440p. This points to a performance limitation at 4K output resolution for FG, so if Nvidia have been able to sort that out, it would be huge for further improving FG performance in games that use full RT (path tracing), and especially in UE5, as you then lose the latency issues at 4K which it has even without any hardware Lumen or PT.
The problem with FG is that it's NOT a performance-improving tech; it only improves the fluidity of the displayed image. There are other technologies that could potentially do a better job (like reprojection, which keeps camera movement and game responsiveness fluid regardless of FPS), but FG seems to have been easier to implement, even though it introduces latency and in some situations makes things worse, not better.

As it stands right now, playing a UE5 game with FG results in the base framerate dropping below 60fps, which brings the final fps to around 90fps for a 4K output. Without frame gen enabled, the base fps can easily be 60fps, which shows that merely enabling FG has a performance overhead that saps frametime and then leads to added latency.
Well yes, that's how this tech works - two input frames need a bunch of calculations done on them to generate the middle frame, adding latency and costing FPS, because it keeps the GPU busy with work other than rendering new frames, and that work is slower for bigger frames (like 4K). I don't see how they can speed it up without coming up with a completely new tech - it will always cost some performance; that's just how it works.
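
To put some very rough numbers on it (a toy model with guessed values, not how NVIDIA actually implements FG):

```python
# Toy model of 2x frame interpolation - all figures illustrative.
# Two rendered frames are needed before the in-between frame can be
# generated, so the GPU does extra work per real frame and the real
# frame has to be held back before it can be displayed.
def frame_gen_estimate(base_fps: float, interp_ms: float):
    """Return (displayed_fps, added_latency_ms) under this toy model."""
    render_ms = 1000.0 / base_fps
    # With 2x FG, roughly one generated frame accompanies each real frame,
    # so the per-real-frame GPU budget grows by the interpolation cost.
    effective_ms = render_ms + interp_ms
    real_fps = 1000.0 / effective_ms
    displayed_fps = real_fps * 2
    # The real frame waits for its successor (one frame interval) plus the
    # time needed to produce the generated frame before it is shown.
    added_latency_ms = effective_ms + interp_ms
    return displayed_fps, added_latency_ms

# Example: 60 fps base with a hypothetical 5 ms interpolation cost at 4K
fps, lat = frame_gen_estimate(60.0, 5.0)
print(f"displayed ~{fps:.0f} fps, ~{lat:.1f} ms of extra latency")
```

With those made-up numbers the base rate drops into the mid-40s and the output lands around 90fps - the lost frametime and the added latency come from the same overhead, which is the point.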
 
PhysX is still used in today's games, and Control is proof that PhysX works - look how great it runs with all those particle physics and destruction. Again, this is purely down to optimisation/implementation by the devs. Don't blame the technology, blame the implementation of it.
PhysX runs on the CPU these days, which is the opposite of what NVIDIA tried to push so hard in the past. They advertised it heavily, they gimped the CPU path, they pushed it into games, but it had a heavy FPS hit (as the GPU was busy calculating physics instead of frames) and the market said no, we don't want it. So they had no choice but to release a properly CPU-optimised version - and that is what we see in games these days. It's not so much about implementation by the devs here; it's on NVIDIA. However, it can't be used to advertise GPUs anymore, so nobody hears about it anymore. That's just what NVIDIA does.
 
This is exactly what I've been saying here for a while too. The number of enthusiasts who care and buy expensive hardware is minuscule. The question is: do they spend enough to fund this expensive industry now? With hardware, perhaps, but with AAA games, not really - it's been flop after flop for a while now. Something will have to give.

The number of people who buy expensive hardware is proportionally in line with the number of people who buy expensive products in any category. Tesla initially built a market by selling just enough luxury Roadsters to finance and build the Model S, then enough Model S to fund the Model 3. It's a pyramid of consumerism. You constantly try to downplay the role of enthusiasts in this thread, belittling our opinions and interests as if we do not matter, when in actual fact we are playing exactly the consumerist roles the vendors predict. There have been more shortages of hardware in the last 5 years than there has been overabundance.
 
Meanwhile, they charge more and more each generation, so the cost per frame is either stagnant or getting worse, not better.


Part of it is physics itself - the cost of producing new chips grows exponentially, as they have to use ever more advanced (and expensive) materials in components, bigger chips, more layers in the PCB, etc. That often increases cost faster than performance. That doesn't mean these vendors aren't charging more for it all than they could and still be profitable. But this is why pushing new tech the market isn't ready for is NOT the solution IMHO - quite the opposite.

The issue is that it's quite clear the increases in production cost are outstripped by the actual price rises. The margins of tech companies are generally going up. Even with AMD - take out the consoles (which are lower margin) and the old stock, and I expect the same. After all, during the pandemic lots of other parts were far more expensive than they are now.

Then you have Nvidia/AMD skirting around each other with prices, trying to prop it all up. Almost like a cartel of some sort.

Also, at the same time, gamers seem to be considered mugs in general by companies.


Quite a large chunk of games could benefit from even just basic optimisations, it seems. But some things (like PT) can't really be sped up without sacrificing even more image quality/clarity, unless they go for more AI crutches. However, those have their own performance hit and might require new (even more expensive) hardware again, so they're not a real solution either.

Hardware companies want to push more overpriced hardware, so they don't care. If anything, they are doubling down on locking these so-called "solutions" to new generations, and various game developers are quite happy to go along with it. It's so wasteful in terms of the environment too - it makes me laugh when companies talk about environmental initiatives while also promoting planned obsolescence.

In the end it's the companies who make the games that have to take ultimate responsibility, but as I said above, they consider gamers mugs with deep pockets.
 
The number of people who buy expensive hardware is proportionally in line with the number of people who buy expensive products in any category. Tesla initially built a market by selling just enough luxury Roadsters to finance and build the Model S, then enough Model S to fund the Model 3. It's a pyramid of consumerism. You constantly try to downplay the role of enthusiasts in this thread, belittling our opinions and interests as if we do not matter, when in actual fact we are playing exactly the consumerist roles the vendors predict. There have been more shortages of hardware in the last 5 years than there has been overabundance.
Bad take - you seem to have ignored the fact that high-end GPUs are not primarily used for gaming anymore. Most 4090s sold didn't end up with gamers at all and weren't even priced for gamers. The same will happen with the 5090, which is one of the reasons my bets are on much higher prices - these are prosumer GPUs, not gaming GPUs (same as the Titan cards in the past, but even more so now). The xx80-class GPUs are considerably cheaper and slower, and since the 4000 series they haven't sold well according to all the statistics I've seen (neither gamers nor prosumers wanted them in big numbers). So what you say about enthusiasts is a really old image - it hasn't been true for a while now. In other words, high-end pricing is what it is because it's not primarily gamers buying those cards anymore, and enthusiasts have little to no influence on GPU vendors (since gamers aren't even their primary market anymore). All the recent shortages have been caused by either mining (not gamers) or AI (not gamers).
That said, bringing up Tesla, who sell by far the smallest number of cars compared with other producers (and the numbers have been dropping recently)... that's not helping your argument either.
 
The issue is that it's quite clear the increases in production cost are outstripped by the actual price rises. The margins of tech companies are generally going up. Even with AMD - take out the consoles (which are lower margin) and the old stock, and I expect the same. After all, during the pandemic lots of other parts were far more expensive than they are now.

I fully agree.

Then you have Nvidia/AMD skirting around each other with prices, trying to prop it all up. Almost like a cartel of some sort.

Almost like a cartel? :) It's pretty much a monopoly by now, though.

Hardware companies want to push more overpriced hardware, so they don't care. If anything, they are doubling down on locking these so-called "solutions" to new generations, and various game developers are quite happy to go along with it. It's so wasteful in terms of the environment too - it makes me laugh when companies talk about environmental initiatives while also promoting planned obsolescence.

Again, I fully agree. Though with waste it's not even about energy use (we live on a planet with a pretty much unlimited energy source - the sun - that's just heavily underutilised) but about wasting limited natural resources and generating e-waste. That applies to nearly all companies though, not just the GPU ones.

In the end it's the companies who make the games that have to take ultimate responsibility, but as I said above, they consider gamers mugs with deep pockets.

Again, I agree :) Though in the case of gaming publishers, they've been hit back hard by gamers, so they should start learning the lesson soon enough. Hopefully for the better for us, not the worse.
 
That said, bringing up Tesla, who sell by far the smallest number of cars compared with other producers (and the numbers have been dropping recently)... that's not helping your argument either.

You've missed the point. Their market share is irrelevant; it was their GTM plan that is important - they deliberately targeted the high end first.

Bad take - you seem to have ignored the fact that high-end GPUs are not primarily used for gaming anymore. Most 4090s sold didn't end up with gamers at all and weren't even priced for gamers. The same will happen with the 5090, which is one of the reasons my bets are on much higher prices - these are prosumer GPUs, not gaming GPUs (same as the Titan cards in the past, but even more so now). The xx80-class GPUs are considerably cheaper and slower, and since the 4000 series they haven't sold well according to all the statistics I've seen (neither gamers nor prosumers wanted them in big numbers). So what you say about enthusiasts is a really old image - it hasn't been true for a while now. In other words, high-end pricing is what it is because it's not primarily gamers buying those cards anymore, and enthusiasts have little to no influence on GPU vendors (since gamers aren't even their primary market anymore). All the recent shortages have been caused by either mining (not gamers) or AI (not gamers).

On the contrary, the Titan range was marketed for mixed usage, not purely gaming. Back then the uptake was not what it could have been, so they rebranded it into the x090 and there's been an upturn in sales since. Mining was a factor across all mid-to-high-end cards during the 3000 series, but the AI boom came into play later in the 4000 series release cycle, when local LLMs became more prolific in late 2023 and then through 2024, so you are just reverse-engineering that as a reason. Regardless, whether they are sold for AI or not, their intended feature set is to satisfy gaming, and those numbers are being met. They will continue to target the high end as they have and tick the box that these cards hit the mark for gamers.
 
On the contrary, the Titan range was marketed for mixed usage, not purely gaming. Back then the uptake was not what it could have been, so they rebranded it into the x090 and there's been an upturn in sales since.

Both the Titan and the 2080 Ti had a bad price-to-performance ratio - they were only a bit faster than the tier below, whilst being crazy expensive (for those times). They also didn't have much to offer outside gaming, so prosumer users were a small group. Currently (for a few generations now) the xx90 cards are aimed primarily at everything but gaming, with a huge amount of VRAM for AI and other non-gaming uses, ECC on the memory, etc. The number of people using them for non-gaming work is hugely larger than in the past. A very different situation now, isn't it?

Mining was a factor across all mid-to-high-end cards during the 3000 series, but the AI boom came into play later in the 4000 series release cycle, when local LLMs became more prolific in late 2023 and then through 2024, so you are just reverse-engineering that as a reason. Regardless, whether they are sold for AI or not, their intended feature set is to satisfy gaming, and those numbers are being met. They will continue to target the high end as they have and tick the box that these cards hit the mark for gamers.

I don't agree with any of that, and the sales numbers confirm what I say. And yes, as I wrote earlier, the 3090 was barely any faster than the 3080 Ti, for example, whilst being much more expensive for gamers; it only had advantages outside gaming, because of the VRAM. The 4090, and soon the 5090, make it even clearer, with a huge price difference and non-gaming use cases. That said, sure, enthusiasts will buy them, but with their minuscule numbers they might as well not exist - there's a reason such cards are called halo products: they exist to exist and look good in PR materials, but that's about it. They have no pull on the actual gaming market (their numbers are too small to matter when games need to sell tens of millions of copies to make money these days), where actual money talks, not PR - and gamers have already started to show publishers with sales numbers that they chose poorly.

That said, I find it funny how you said: "You constantly try to downplay the role of enthusiasts in this thread, belittling our opinions and interests as if we do not matter" - you might have missed it, but I'm an enthusiast with a 4090 etc., and you seem to be trying to belittle my opinions here. ;) On a more serious note, that was a miss too, as that's not what I'm doing here at all.
 
Both the Titan and the 2080 Ti had a bad price-to-performance ratio - they were only a bit faster than the tier below, whilst being crazy expensive (for those times). They also didn't have much to offer outside gaming, so prosumer users were a small group. Currently (for a few generations now) the xx90 cards are aimed primarily at everything but gaming, with a huge amount of VRAM for AI and other non-gaming uses, ECC on the memory, etc. The number of people using them for non-gaming work is hugely larger than in the past. A very different situation now, isn't it?

The Titan was the best rendering and simulation GPU you could get on a budget, for those who couldn't afford the workstation cards, which were priced stupidly in comparison to consumer ones. There was a market for them outside of gaming. As for who the x090s are aimed at... you only have to look at their product page. I'll give you a clue - it's gaming. And it's why they've stagnated at 24GB: up the VRAM further and they lose out on sales of their AI/datacentre ranges.
Again, these are gaming cards; Nvidia's revenue in the gaming product range increased by 56% last year. They know what they are doing when it comes to consumer gaming, given it's been their bread and butter since the company was founded. Capitalising on sales for crypto and AI is a pure bonus for them.

I don't agree with any of that, and the sales numbers confirm what I say.

You have sales numbers that show exactly why top end cards were purchased? I think not.

And yes, as I wrote earlier, the 3090 was barely any faster than the 3080 Ti, for example

You talk like gamers had a choice between the two when the 3090 was released...

That said, sure, enthusiasts will buy them, but with their minuscule numbers they might as well not exist - there's a reason such cards are called halo products: they exist to exist and look good in PR materials, but that's about it. They have no pull on the actual gaming market (their numbers are too small to matter when games need to sell tens of millions of copies to make money these days), where actual money talks, not PR - and gamers have already started to show publishers with sales numbers that they chose poorly.

Hand-waving rubbish. Once again, they are gaming cards and Nvidia chalk the sales up as gaming cards. As far as they're concerned, the top-end cards are the success they expected them to be.

That said, I find it funny how you said: "You constantly try to downplay the role of enthusiasts in this thread, belittling our opinions and interests as if we do not matter" - you might have missed it, but I'm an enthusiast with a 4090 etc., and you seem to be trying to belittle my opinions here. ;) On a more serious note, that was a miss too, as that's not what I'm doing here at all.

You're not the only one in this thread with a top end card that's swiping at RT.
 
The Titan was the best rendering and simulation GPU you could get on a budget, for those who couldn't afford the workstation cards, which were priced stupidly in comparison to consumer ones. There was a market for them outside of gaming. As for who the x090s are aimed at... you only have to look at their product page. I'll give you a clue - it's gaming.
OK, I open the NVIDIA page for the 4090 and I see "Experience ultra-high performance gaming, (...) unprecedented productivity, and new ways to create." - so we have three primary things listed: one is gaming, two are productivity (and one additional one that seems to be VR?). Then you look at the price and the feature set (24GB of VRAM, ECC on the memory, 2 video encoders, Studio drivers - all of that largely useless for gaming) and you know whom it's really aimed at.
And it's why they've stagnated at 24GB: up the VRAM further and they lose out on sales of their AI/datacentre ranges.
Ahem, the 5090 is not 24GB of VRAM; it's 32GB. The stagnation happened in the lower tiers, and it has nothing to do with the datacentre - it's all about better margins and forcing upgrades sooner rather than later.
Again, these are gaming cards; Nvidia's revenue in the gaming product range increased by 56% last year. They know what they are doing when it comes to consumer gaming, given it's been their bread and butter since the company was founded. Capitalising on sales for crypto and AI is a pure bonus for them.
Any graphics card can be classified as a "gaming" card - even the datacentre ones that have no monitor output and yet are used for cloud gaming, for example. You'd hardly call those primarily gaming cards though, would you?

Now, I don't know where you got those numbers from, but on NVIDIA's website, in the annual report for 2024, we see it's not a "gaming product range"; it's the Gaming market segment, which includes their cloud gaming infrastructure along with other products that aren't graphics cards. They also don't have a "prosumer" segment there - that's all rolled into Gaming, which means whatever GPU isn't strictly datacentre, Quadro or automotive gets pulled into Gaming. They only list four markets: Gaming, Data Center, Automotive and Professional Visualization. And it's not 56% either; it's a 15% year-on-year increase for the Gaming market, 1% for Professional Visualization, 21% for Automotive and 217% for Data Center (which is their primary source of income these days).
You have sales numbers that show exactly why top end cards were purchased? I think not.
OK, let's look at the Steam stats. After the 4000 series release, for quite a while we saw a very small number of 4090s there (still higher than 4080s). After the initial release spike (it was the first 4000 series GPU, after all, and people are very impatient), there wasn't much growth. Recently (in the last months of 2024) growth has been rather bigger. In Dec 2024: 1.16% 4090, 0.63% 3090, 0.41% 2080 Ti. It still shows that considerably more gamers bought the 4090 than the 4080 (or 4080S) and play Steam games with it, but that says more about how badly the 4080(S) was priced for the more typical gamer. Also, the number of 4090s in the Steam stats mostly grew after people started selling their used 4090s cheaply with the 5000 series release looming. Meanwhile, for a long time after release, the 4090 was sold out in most places, or horribly expensive (way above MSRP) - and NVIDIA was selling as many as they could produce. Where are those cards, if they don't show up in the Steam stats (until the recent growth, which is still relatively small)? We've seen and read about many AI start-ups that bought as many as they could, for example.
You talk like gamers had a choice between the two when the 3090 was released...
When they released, they got sold out to miners - all of them: 90, 80, 70, 60... :) The 3080 Ti came when the mining boom was mostly over, plus it had a mining limiter built in. At that point it was an easy choice.
Hand-waving rubbish. Once again, they are gaming cards and Nvidia chalk the sales up as gaming cards. As far as they're concerned, the top-end cards are the success they expected them to be.
Almost ALL GPUs are gaming cards, as I said above; that doesn't mean they're primarily gaming cards. The Steam stats tell a story, and so does NVIDIA's website (as mentioned). NVIDIA calls them Titan-class cards for a reason, plus all the things mentioned earlier. Fun fact: HUB also said clearly in their newest video that these are just masked as gaming cards (and so is the 5090), but they really aren't aimed at gamers as the primary buyers.
You're not the only one in this thread with a top end card that's swiping at RT.
What's your point? This thread started with HUB's survey showing most gamers think RT is too heavy on GPUs. We've discussed here why: partially it's bad optimisation and lazy devs (or, more likely, publisher cost-cutting), and partially it's things like PT being pushed too early, on hardware that can barely handle it with so-so quality even on the top GPU (the 4090). If you have a problem with any of these arguments, feel free to post counter-arguments.
 
It really depends entirely on the game. For some I'm happy to take the hit for better graphics - mostly RPG/story-based games. For others I refuse to use it, as it adds nothing and takes a lot away.
 
What's your point? This thread started with HUB's survey showing most gamers think RT is too heavy on GPUs. We've discussed here why: partially it's bad optimisation and lazy devs (or, more likely, publisher cost-cutting), and partially it's things like PT being pushed too early, on hardware that can barely handle it with so-so quality even on the top GPU (the 4090). If you have a problem with any of these arguments, feel free to post counter-arguments.

RT being too heavy on GPUs is a given. It's not poor optimisation or lazy devs; the technology has been available for decades, and prior to the 2000 series it was impossible to render in real time - you would get Cinebench levels of fps back then. Every advancement since the 2000 series has been about alleviating the burden of RT, i.e. it's not a case of something being developed from the ground up and poorly optimised. You can throw brute power at it, or you can employ software workarounds to mitigate the performance impact, and this is what we see as the main focus through DLSS, FG, RR etc., because the underlying principle of RT is unchanged. Do we have a long way to go? Yes, and we've discussed that many times in this thread. Is it a fad? Absolutely not. Will they stop putting focus on improving it because the perception is that "no one wants it" bar a "minuscule amount of enthusiasts" at the top end? No.
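
Some back-of-the-envelope arithmetic shows why brute force alone doesn't get there (the sample and bounce counts below are illustrative, not any specific game's settings):

```python
# Rough ray budget for real-time RT at 4K - all counts are illustrative.
pixels_4k  = 3840 * 2160      # ~8.3 million pixels per frame
spp        = 1                # primary rays (samples) per pixel
bounces    = 2                # secondary rays per sample (reflections, GI...)
target_fps = 60

rays_per_frame  = pixels_4k * spp * (1 + bounces)
rays_per_second = rays_per_frame * target_fps
print(f"{rays_per_second / 1e9:.1f} billion rays per second")  # ~1.5 billion
```

That is before any denoising or shading, and path tracing multiplies the sample counts further - which is why so much of the effort goes into the software workarounds rather than raw ray throughput.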
 
My anecdotal experience is that non-enthusiasts really don't care much for ray tracing; they just care whether they enjoy the game. When I show them 4090 path tracing, again, they think it's nice but they don't seem that impressed.
The way I see it is that for a long time the jump in graphics per generation was immediately noticeable even to casuals: NES-SNES-PS1-PS2-PS3-PS4, you can immediately see the difference. PS4-PS5-PS5 Pro, the difference is a lot less obvious. I suspect that is why the PS5 Pro hasn't sold like first expected.
I see people using Crysis as an example, but Crysis really did look way ahead of its time - graphics, physics, destructible environments. It was obviously ahead of its time, plus 30fps on a CRT is like today's 60. Today's torture tests don't look that much better to many casuals for the performance loss.
 
Really depends on what games you’re playing and what you want from your gaming experience.

If you’re someone who wants the best looking game and playing at high resolution then Ray tracing makes a big difference visually in a lot of games
 