
What do gamers actually think about Ray-Tracing?

Game devs need to take into consideration the number of people who can run their games unless they want poor sales. So even if RT is permanently on, it needs to run on something at the level of an RTX 2060 at low/medium settings at 1080p with a six-core CPU. The reality is that shrinkflation in the mainstream will slow down adoption of more intensive RT features like PT.

But if you look at the biggest PC games in terms of revenue and concurrent players, they either have no RT or are playable without RT. There are huge games such as LoL, D4, WoW, Fortnite, etc. which are designed to scale down to less powerful hardware.

Just because a few games push the boat out doesn't change that - it's like saying Crysis was typical of what most games were like in tech/hardware terms when it was released. Tech demo games have always existed, going back to the original Unreal, which hammered hardware at the time.
 


They can, and an RTX 2060 can play Indiana Jones at 1080p on low using DLAA and it's still playable even in the jungle; look how good it looks even on low:


The problem isn't RT at all, not a single bit. The problem is poorly optimised games, regardless of whether RT is on or off. Look how Starfield launched: trash-tier performance with no RT in sight. Then months later they "found" at least 34% performance with a single patch, not long after Todd told the world that all they needed to do was upgrade their PCs. Meanwhile many gamers laughed it off and said things like "oh that's just Bethesda, it's what they do lol" - many in this very forum. Some of you let things like that slide and give devs a pass because it's easier to blame a technology.

Optimisation is the key word.

These gamers are pointing fingers at RT and upscaling and whatever, but rarely pointing the same fingers at devs for releasing games in a rubbish state of performance. In fact it's the opposite: they are pointing at the tech and the GPU vendors instead of the devs when games run poorly with the higher settings enabled - case in point, this thread.

Hopefully MachineGames paves the way to a better state of affairs here, as Disney want more games like Indiana now and the Motor version of id Tech seems to be a big hit.
 
It's PhysX or Hairworks all over again: marketing spiel that the casual buyers of cards can latch onto. "DLSS", "RT", gotta catch 'em all, whether they make the reflections look slightly more shiny or not. It's all about the marketing hype. Very few people will have a card that can run RT without slowing their fps to a crawl, which is where DLSS comes in - gotta polish that turd hard.
 
PhysX is still used in today's games, and Control is proof that PhysX works; look how great it runs with all those particle physics and destruction effects. Again, this is purely down to optimisation/implementation by the devs. Don't blame the technology, blame the implementation of it.
 

That is because instead of using DLSS/FSR/XeSS to improve performance, they are using them as a way to avoid optimisation.

The trash-tier mainstream hardware means devs can only optimise so far, given the lack of VRAM and tiny generational improvements. The RTX 4060 and RTX 3060 were 20% improvements over the previous cards. There is no indication the RTX 5060 will even match an RTX 4070. We will be lucky if the RTX 5060 Ti matches an RTX 4070. The same goes for AMD.

With the mainstream mass-market cards falling further and further behind the top, ultimately we can't just blame devs - their lives are made harder too. We also have to blame greedy companies like Nvidia and AMD, who are more worried about other markets now.


The RT performance on my RTX 3060 Ti isn't very hot in a few newer games, but it's still quicker than the RTX 3060/RTX 4060, which have 16x more representation on Steam than the fastest card.

So I am not sure what sort of watered-down RT experience people are getting with the huge shrinkflation in the mainstream. People can barely run basic RT, let alone more intensive implementations.

Both companies need to be giving us 30% to 40% improvements every generation, so that performance can be doubled every 4 to 5 years. It's quite clear this is possible, but they've got to grow those margins (whilst most of their own consumers don't have that privilege).
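
As a rough sanity check on that maths (a minimal sketch assuming a roughly two-year generation cadence, so 4 to 5 years is about two generations):

```python
# Compounding check for the "30-40% per generation doubles performance
# every 4-5 years" claim, assuming a ~2-year generation cadence
# (so 4-5 years is roughly two generations).

for per_gen_gain in (0.30, 0.35, 0.40):
    after_two_gens = (1 + per_gen_gain) ** 2
    print(f"{per_gen_gain:.0%} per gen -> {after_two_gens:.2f}x after two generations")

# 30% per gen -> 1.69x after two generations
# 35% per gen -> 1.82x after two generations
# 40% per gen -> 1.96x after two generations
```

So 30% per generation gets you roughly 1.7x over two generations, and 40% gets you basically the full doubling.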
 
Yeah, that is another matter entirely, but like above, lower-end cards can run modern games with great graphics that match or beat a PS5 fairly evenly; there are plenty of games that prove this, and one has already been demonstrated above. Devs just need to optimise for the most common hardware, that's it, but all too often it seems they playtest on the highest-spec machines and it takes months if not years to get everything below that performant.
 
Someone didn't read the OP. The HUB poll shows the top answer is people saying they don't use it because it puts a bigger strain on their GPU, which aligns with the Steam hardware survey results at the time as well - and these are gamers who know tech, especially if they're watching outlets like HUB and following them on socials. These are not your average gamers off the high street buying off-the-shelf PCs. My comment is in the context of average gamers who do know tech, not the average high-street gamer who just buys a game and a computer or console and wants to dive right in.
You seem to run around in circles and then eventually agree the developers need to do more. You also seem to think anyone not bothered is some kind of "hater" of RT/Nvidia.

It's the developers and the hardware that will drive the tech forward. Everyone else just tags along, occasionally upgrading at a price point that suits them.
As time goes by more and more gamers will have access to some kind of RT hardware, but the developers still need to scale to the lowest possible configuration (even on consoles).

So I'm not sure what you expect the "average gamer who knows tech" to do?

Should they spend all their money on new hardware every year (just because an unoptimised game releases and the internet says the RT/PT is unbelievable and can't be missed)?
Should they rent a GPU from the cloud?
Should they make their own GPU?
Should they make their own games?
 
There are no circles; I've always said games need to be optimised, and I'm often the first to point out launch issues with games, as you'd know unless you've been living under a rock.
 
Perhaps read again, but this time in the context of the exact discussion instead of whatever it is that you want to read.
 
Why not start fresh? What point are you trying to make, or how much do you think average gamers who know tech should care about RT? And what should they do about it?
 

The companies should be optimising for baseline performance without upscaling, then using upscaling as a way to be able to add in RT features. There is also talk of how companies are just using standard presets in UE5, etc. to save money instead of doing proper optimisation. The issue here is that these companies are more worried about their margins than about providing finished products to their customers.

However, as a person who uses mainstream cards (and helps out with a lot of those kinds of builds), a lot of the lower-end cards around when the PS5 launched were far worse. For example, in 2020 when it launched, the average cards were the GTX 1060, GTX 1650 and GTX 1660 series, with the RTX 2060 lower in the Steam rankings. Most of them were obviously in laptops and pre-built systems.

The PS5 GPU is around RX 6600 XT to RX 6700 level performance with a Ryzen 7 4700G-equivalent CPU, and the Xbox Series X is probably closer to an RX 6700 XT. So for the price it wasn't that bad, especially as the digital version has dropped under £350 a few times.
 

That field is changing now too, as there's less reliance on raw performance and more on what the AI cores can do, and it seems RTX 50 is going to put that front and centre with stuff like neural NPC interactions as part of DLSS (per a recent Q&A on Reddit with an Nvidia person), which is what Nvidia were hinting at last year anyway - where one NPC is "real" and the rest are rendered by AI, so the CPU/GPU has to do less work to compute them. How this works in reality we will have to wait until after CES to find out, but it's clear that raw horsepower isn't really that important any more and that there is only one big halo card. Maybe this was the end goal all along, and it's only now that the hardware has reached a level where it can be used.

Like upscaling, as long as the final result is an improvement and a boost in performance then how it gets the output doesn't matter, only the output does.

Personally I want to see Frame Gen lose the performance overhead it has at resolutions above 1440p; there is approximately a 30% hit to frametime performance with a 4K output (upscaled or not) versus the same FG at 1440p. This points to a performance limitation at 4K output resolution for FG, so if Nvidia have been able to sort that out it would be huge for further improving FG performance in games that use full RT (path tracing), and especially in UE5, as you would then lose the latency issues at 4K which it has even without any hardware Lumen or PT.

As it stands right now, playing a UE5 game with FG results in the base framerate dropping below 60fps, which brings the final fps to around 90fps for a 4K output. Without frame gen enabled, the base fps can easily be 60fps, which shows that merely enabling FG has a performance overhead that saps away frametime and then leads to added latency.

The vast majority of people don't like FG because of this latency. If the overhead is removed then the baseline fps stays at 60, and since FG effectively doubles the framerate at 1440p, 4K should follow suit as well, and lower-end cards could also see the same gains relative to their power.

Ultimately that would mean everyone could potentially use RT/PT without the problem of frametimes being affected due to the overhead induced by simply enabling FG.
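
To put rough numbers on that, here's a quick sketch using the figures above (a ~30% frametime overhead at 4K and FG doubling the presented framerate) rather than any measured data:

```python
# Illustrative arithmetic only, using the rough figures from the posts above:
# a ~30% frametime overhead at 4K when FG is enabled, and FG doubling the
# presented frame rate. Not benchmark data.

def fg_output_fps(base_fps_no_fg: float, overhead: float = 0.30) -> tuple[float, float]:
    """Return (base fps once FG is enabled, presented fps after frame generation)."""
    base_frametime_ms = 1000.0 / base_fps_no_fg              # 60 fps -> ~16.7 ms
    frametime_with_fg = base_frametime_ms * (1 + overhead)   # cost of enabling FG at 4K
    base_fps_with_fg = 1000.0 / frametime_with_fg            # rendered frames per second
    presented_fps = base_fps_with_fg * 2                     # one generated frame per rendered frame
    return base_fps_with_fg, presented_fps

base, presented = fg_output_fps(60.0)
print(f"base ~{base:.0f} fps, presented ~{presented:.0f} fps")
# base ~46 fps, presented ~92 fps - the latency you feel tracks the ~46 fps base,
# which is roughly the "around 90fps for a 4K output" case described above.
```

Take the overhead away and the base stays at 60fps, so the presented rate would be around 120fps with the latency of a 60fps game, which is exactly the improvement being asked for.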

The only real concern is whether that "Advanced DLSS" leaked recently is locked to RTX 50 only or not.
 
Ray-tracing looks amazing on an OLED screen. Still too resource-intensive at 4k. Let's see what kind of sorcery NVIDIA have up their sleeves with their new DLSS and 50-series associated hardware/software updates.
 
Again, if frame rate was king, people would not be playing at 30fps on consoles, or with downgraded looks for 60fps where possible. It would be HF2 graphics with at least 120fps, or they'd move entirely to PC gaming. But that doesn't happen. Besides, it's not only the GPU that's the limit, but the rest of the system too.

People on consoles usually don't have much choice, but the last stats I saw (recently) showed a big majority of gamers prefer to reduce visuals to get to 60fps - if they know there's an option, that is. My wife plays on consoles and is often unaware, but the moment I showed her 60fps she wouldn't go back to 30, regardless of visuals. Visuals just don't matter to her; only comfort and gameplay do. Then again, we have the Switch, where the visuals are really bad and the FPS is bad but gameplay is king - those games and the console itself sell very, very well, way better than any other console.

CP77 was running so poorly on PS4 that it was taken off the digital store :))

Which is exactly the point I'm making - CP2077 was initially horribly buggy and unoptimised, pretty much unplayable on PC too (because of horrible bugs, broken quests etc.). It was a very rushed game. But then the devs sat down and actually optimised a lot of things, and suddenly it's playable and has very few bugs left. If only that had been the state it was released in, instead of being rushed out the door. The whole optimisation issue isn't a problem of "can't be done" but a problem of "why bother, people will buy it anyway". But as CP2077 showed, people won't buy it until it's fixed and playable on their chosen platform. Eventually the game sold very well, but a lot of copies were sold later, after it was fixed.

The above was also called a PT game. I'm not sure that it is, but it's at least RT. Oh, and it's on UE5.

The majority of sales happened in China, and on average they have potatoes for machines :) Still, it's a point of national pride there to support their games like that (my wife is Chinese and bought it even though she will never play it). However, it has very good gameplay and it even scales down very well to bad hardware without losing much in visuals. Ergo, it works well on bad hardware and it's a very good game - no surprise it sold well. It's also a rarity amongst AAA games with such visuals.

Stalker has no online component, it's just SP.

Does it now? It's not a game I've played myself, so it seems my sources were talking out of their behind, and that's my bad for not double-checking. Good to know; disregard what I said about it then, you are most likely right here instead.

The development of graphics isn't the issue; some fail because they're seen too much as a business, run by business people.

Exactly. So instead of coming up with a fun game, they rehash old stuff in a new skin and then don't even bother to optimise or debug it well, release it and move on to the next one, whilst complaining that it costs so much to make games that they have to increase prices. It's a really bad road, but they are slowly learning with flop after flop. Gameplay matters; graphics really aren't anywhere near the front of the requirements for most gamers, by all the stats I see.

Outside of the ghosting issues, DLSS (to me) solves the muddiness in some form or another.

It's not DLSS itself (though it does add sharpening, which doesn't recover lost details), it's mostly supersampling, as we established earlier. :) Most people don't use it though (I don't - too fiddly and too many issues with it for me to bother). And it should not be a requirement to get proper image clarity.

Crysis is a good example, because even though hardware was relatively cheaper then, people would still buy as cheap as possible, and oftentimes I heard the "it's just a graphics demo, there's no real game there" mantra.
The Crysis engine was really badly optimised and still is - hence even on the newest hardware it doesn't work as well as one would expect. That's been confirmed by its own devs multiple times over the years, which is why the later games with that name were made quite a bit differently. Still, even the newer ones had stupid issues like huge tessellation on flat objects, or water that wasn't visible yet was calculated for the whole map etc. - for which people blamed Nvidia, but I see mostly incompetence or rushing to release.

That said, Crysis was very unique in its physics and gameplay, which helped it much more than the graphics did. The engine issues were mostly a meme, with "but can it run Crysis?" :) - not something I want to see in properly made games.
 
Errr... once again surprised seeing that Path Tracing resembles Screen Space Reflections Medium (but mostly worse quality) rather than RT Ultra. Getting 24 fps here :cry:

This is the thing most people aren't aware of - SSR is a very simplified RT algorithm (calculated only for what's on screen and with many shortcuts). It always was, hence it was such a performance killer back in the day. It can produce reflections with very good resolution and detail, but it has the big downside of only showing what's visible on the screen.

Raster has quite a few such algorithms that are based on simplified RT and very well optimised for the job - hence bashing raster and claiming RT is better is a bit silly, as raster in many ways already is RT and not just baked lights etc. :) PT is the proper step up, not standard RT, IMHO. But it costs an arm and a leg in terms of performance and image clarity (as it runs at low resolution and the noise eats details, to keep it usable).
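
To illustrate the screen-space limitation, here's a toy 2D sketch (not real rendering code; the positions and the "screen" extent are made up): a screen-space trace can only return hits for geometry that made it into the screen buffer, while a world-space trace can also hit off-screen objects.

```python
# Toy 2D illustration of why SSR misses off-screen geometry while a full
# (world-space) ray trace does not. Purely illustrative; scene positions and
# the screen extent are invented for the example.

SCENE = [(4.0, 2.0), (15.0, 3.0)]   # object positions (x, y); the second is off-screen
SCREEN_X = (0.0, 10.0)              # horizontal extent captured by the screen buffer

def ray_hits(origin, direction, obj, tolerance=0.5, steps=200, step_size=0.02):
    """March a ray from origin along direction; return True if it passes near obj."""
    x, y = origin
    dx, dy = direction
    for _ in range(steps):
        x, y = x + dx * step_size, y + dy * step_size
        if abs(x - obj[0]) < tolerance and abs(y - obj[1]) < tolerance:
            return True
    return False

def trace(origin, direction, screen_space: bool):
    for obj in SCENE:
        # Screen-space reflections can only consult what is in the screen buffer.
        if screen_space and not (SCREEN_X[0] <= obj[0] <= SCREEN_X[1]):
            continue
        if ray_hits(origin, direction, obj):
            return obj
    return None

reflection_ray = ((8.0, 0.0), (7.0, 3.0))  # heads toward the off-screen object
print("world-space hit:", trace(*reflection_ray, screen_space=False))  # (15.0, 3.0)
print("screen-space hit:", trace(*reflection_ray, screen_space=True))  # None
```

Same ray, same scene: the world-space trace finds the off-screen object, the screen-space one simply can't, which is why SSR reflections vanish at screen edges and behind the camera.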
 

I'm 10 seconds in and already he's made a mistake: Nvidia didn't introduce tessellation, ATI did in 2001 with the Radeon 8500.
Crysis 2 was sponsored by Nvidia, and the game did have assets with the tessellation cranked to about 100x more than necessary.

PhysX was created by NovodeX AG in 2004; NovodeX were acquired by Ageia, who developed the technology to run on CPUs, and Nvidia acquired Ageia in 2008. I remember Tom's Hardware tested PhysX on the newly released Phenom II X6 1090T vs the GTX 480 and the CPU ran it faster :D Soon after that, Nvidia launched a PhysX patch that gimped it to only a single thread on the CPU.

Hairworks was in response to AMD's TressFX (original Tomb Raider remake). Hairworks was first used in The Witcher 3, where Geralt's hair was tessellated to 64x per pixel; you can't see tessellation beyond 4x per pixel, they look identical.

As for ray tracing, it's more difficult to pin down, but there are games where AMD's RT performs quite well and looks good too - Spider-Man Remastered and Watch Dogs: Legion for example - and the RT performance on the 7800 XT is just as good as it is on the 4070.

The truth is that at the time ATI/AMD's tessellation was not as good as Nvidia's; Nvidia dedicated more die space to it. That's no longer true - AMD's tessellation is just as good now - but now we have RT. Again, Nvidia dedicate more die space to it and it's better, but with both of these things it's only better when it's saturated to a certain level, which Nvidia seem to like doing. It's not just that AMD's GPUs suffer at those levels; Nvidia's own GPUs do too. What that means is the lower-end GPUs suffer - if you can call an £800 4070 Ti S a lower-end GPU... it's running at 20 FPS with it all on in Cyberpunk. Yeah, it's 15 FPS on the 7900 XTX, but it's also not doable on the 4070 Ti S: you need a more expensive Nvidia GPU, certainly the latest one, and your 3090 Ti isn't going to cut it...

I do think it's deliberate on Nvidia's part, but not necessarily to suffocate AMD; Nvidia just want you to keep upgrading and spend more on higher-end GPUs.
 
Ultimately nothing really matters for what people may or may not think. RT is the future and nearly every developer now uses RT in most of their games

Nearly every developer has used RT in most of their games for over a generation now - raster is full of RT-derived algorithms for effects like SSR, shadows, lighting etc. They just didn't call it RT because people didn't care. Now RT is a marketing word, hence every big publisher has jumped on it. Soon it will be AI, and that's what you will see in marketing instead of RT. Whatever new buzzword sells.

with most devs now stating that it makes their lives easier with some saying they simply will not be going back and using raster any more (Machine Games most recently after Indiana Jones).

That's good for them, but completely irrelevant to the end user - whatever works for the devs, as long as there's an advantage for the gamer. But so far we haven't seen any price reductions (rather the opposite), even though it supposedly lowers production costs for the publishers.

If people want to be able to play the latest games with RT providing the better picture, then a baseline GPU spec is needed. Simple as that. Either upgrade or stay left behind.

And there I thought publishers, like any other producer, want to produce something that sells well and make money from it. But what you say sounds like they're doing gamers a great favour, and gamers need to keep up or... not give publishers money? What sort of business strategy is that? :D Maybe there's a reason Ubisoft is pretty much bankrupt and other publishers are closing studios left and right, meanwhile indie games sell better and better each year. :)

People can debate it all day long but it will mean zero because the industry doesn't care about old tech (raster) any more
And people don't care what the industry cares about - they will vote with their wallets. And so far all I see is flop after flop of AAA games based on graphics alone. Such a mystery! ;) Maybe if they start with good gameplay and then add graphics, instead of stopping at just graphics, these games will sell better. Just a thought. :)
 