What do gamers actually think about Ray-Tracing?

In games that use it well it's great... which is like 4-5 games? It's nice in Cyberpunk, Alan Wake, or Metro EE, but in others like the Resident Evil games, Dead Space or Elden Ring there's no point even turning it on.
I suspect most future RT implementations will be half-assed and not worth using (like in most games now) until the next generation of consoles is here.
The list of games that look better with RT is a lot higher than 4-5. We're up to 138 games and 72 applications that use RT, and that's not even counting mods that add RT to old games like Half-Life 1, Half-Life 2 *, F.E.A.R., Deus Ex, Serious Sam, Quake, Doom, Portal and all the others, which are great for replaying old games. That's 11 games right there between your post and mine, and there are many others.

* OK, this one isn't out yet, but it's due "soon" and is a good example of what I mean by replaying old games with RT. https://store.steampowered.com/app/2477290/HalfLife_2_RTX/
 
And here we go, this could basically be the end of the discussion/debate/crying/arguing on RT vs raster (and native vs upscaling, for that matter). As Linus says, we have now just lived through RT's infancy and seen the issues, but as of right now the real gains are being seen, and native rendering can essentially be forgotten because it's never going to be a "thing" again.

You should watch the video at half speed.

He basically said what the majority of people in here have been saying for the last few years. We're getting there, but we're just not there yet.

Paying a premium to beta test isn't really what gamers wanted.
 
You should watch the video at half speed.

He basically said what the majority of people in here have been saying for the last few years. We're getting there, but we're just not there yet.

Paying a premium to beta test isn't really what gamers wanted.
He also said developers don't really save time and money, as he was told by insiders - what they save with RT they lose on the optimisation and testing required to make these games run on the majority's hardware (xx60-class GPUs). In other words, it's not helping the majority of devs currently, and it's not helping the majority of players currently. Who is it helping, then? I know one company that created this "gold rush" and is now earning a lot selling "shovels" for it. ;)

It will get there, but not for many years yet. We're forced to pay for the early access - which I normally never do for games, but they want to leave us no choice. By the way, on the whole neural rendering thing - Microsoft is working on a new version of DirectX that will make it work on all GPUs. That means years still (it's Microsoft!) before it's in DX, and even longer before games implement it widely. Buying the 50 series, one again just gets early access at best; we'll likely see wider use well after the 6000 series releases, or later.
 
You should watch the video at half speed.

He basically said what the majority of people in here have been saying for the last few years. We're getting there, but we're just not there yet.

Paying a premium to beta test isn't really what gamers wanted.
Watched it at 1.5x speed to get it over with :cry:
It's actually one of the better videos from Linuscorp I've seen (it's been a while, to be fair).

RT is being forced on us by Nvidia. I would have been happy with 100% of the silicon on my GPU being used for raster, with maybe the higher-end options including tensor cores to help with local AI and other such non-gaming workloads. They could even label them differently (say, GTX and GTX-Pro).

In an ideal world, RT hardware and software would be developed in tandem and released when they're ready and not so costly (whether that cost is monetary or in terms of the hit on the PC running it). It's not like we've achieved photo-realism and need to explore new avenues. This comes across like trying to build a fifth floor on a building that still hasn't had its ground floor completed.
 
We will see, also judging by the 50 series sales from the pro posters. As the gains this gen are mostly from ray tracing and DLSS, it should be what people are happy with, because it makes the features they want 40% better, right?
 
The optimisation notes are slightly BS. Yes, he says the devs told him that it affects the mid-range cards and below, but that's the BS part, because the lack of optimisation affects EVERY card. It's not a case of RT affecting the optimisation in games using Unreal 5, for example; that's literally optimising your game around the engine, which Epic provides all the tools to do.

So no, that excuse doesn't fly, given the mounting evidence over the years.

Shader compilation stutter has nothing to do with optimisation because of RT, game-breaking bugs have nothing to do with RT, and so on. These are all cases where the exec branch of the devs said "you can patch it after release, we have deadlines to make", which has been pretty obvious for the most part.

Do you not find it amusing that the Nvidia-partnered releases tend to always launch well optimised, while the general releases are more often than not unoptimised?

Makes you think....
 
It seems to me that the path tracing option is an "immediately destroy my GPU" toggle.

It's quite reminiscent of HairWorks. The performance penalty becomes laughable when Path Tracing in UE5 more than halves your frame rate - you'd need over 100% more performance just to get back to where you started. And it certainly doesn't make the games look twice as good.

At what point do we realise we've been had again by NVIDIA? ;)
 
The optimisation notes are slightly BS. Yes, he says the devs told him that it affects the mid-range cards and below, but that's the BS part, because the lack of optimisation affects EVERY card. It's not a case of RT affecting the optimisation in games using Unreal 5, for example; that's literally optimising your game around the engine, which Epic provides all the tools to do.

I don't see how it's BS. Yes, obviously optimisation will (positively) affect high-end GPUs too, but these can get away with less optimisation using DLSS and other trickery, whereas lower-end GPUs can't rely on DLSS alone. That's the main reason devs even bother to do it; otherwise they'd just save the money and say "just buy a faster GPU" - but they can't, because people simply won't.

Shader compilation stutter has nothing to do with optimisation because of RT, game-breaking bugs have nothing to do with RT, and so on. These are all cases where the exec branch of the devs said "you can patch it after release, we have deadlines to make", which has been pretty obvious for the most part.

I don't think you've seen how some games work before release, before at least the basic stuff gets optimised - it's a horror story when you look at leaks or early access builds. I've had access to a bunch and I don't wish that on anyone. All the bugs and problems we see at release are already a much improved state. So yes, they do some optimisation before release, as they have no other choice - but only the bare minimum they're forced to do so that all the purchased copies don't get returned instantly afterwards (thanks to Steam's and other stores' return policies). :) Still, I hope publishers will feel bigger pressure to actually finish games before releasing them.

Do you not find it amusing that the Nvidia-partnered releases tend to always launch well optimised, while the general releases are more often than not unoptimised?

Makes you think....
That's what happens when someone comes along and pays for the optimisation - Nvidia, in this case. Things then happen, because it's not the publishers paying for it. When they do have to pay, they do only the bare minimum in most AAA games.
 
It seems to me that the path tracing option is an "immediately destroy my GPU" toggle.

It's quite reminiscent of HairWorks. The performance penalty becomes laughable when Path Tracing in UE5 more than halves your frame rate - you'd need over 100% more performance just to get back to where you started. And it certainly doesn't make the games look twice as good.

At what point do we realise we've been had again by NVIDIA? ;)
The thing you see in UE5 hardware Lumen isn't even the full PT that people see in some other games. Neither is the one in Indiana Jones - they all have settings where you can lower not just the number of rays but adjust many other things too. That's not full PT; that's their own implementation with hybrid PT, RT and other techniques. And even that isn't very performant yet in many cases.
 
But these can get away with less optimisation
Star Wars Outlaws has entered the chat, along with Jedi Survivor and many other games trailing behind. Stutter doesn't care about your GPU; you can't brute-force away stuttering, that's not how it works.

If people think that suddenly these new cards or MFG will negate the need to optimise, then they have a world of wisdom coming lol.
 
Star Wars Outlaws has entered the chat, along with Jedi Survivor and many other games trailing behind. Stutter doesn't care about your GPU; you can't brute-force away stuttering, that's not how it works.

If people think that suddenly these new cards or MFG will negate the need to optimise, then they have a world of wisdom coming lol.
Oh yes, of course, but that's not even GPU related; that's just bad programming and ignoring UE guidelines - we had this in UE3, UE4 and now UE5 too. New GPUs can't fix that; it's something devs finally have to learn and fix. Epic also won't fix it for them, as it's not an engine issue, it's a dev laziness issue.
 
Oh yes, of course, but that's not even GPU related; that's just bad programming and ignoring UE guidelines - we had this in UE3, UE4 and now UE5 too. New GPUs can't fix that; it's something devs finally have to learn and fix. Epic also won't fix it for them, as it's not an engine issue, it's a dev laziness issue.
The UE5 engine is the main issue, since it was built primarily for Fortnite, where almost everything in the game is coded to be destructible. Once you get a game with better graphics, performance tanks, because you still have that element of everything being destructible hard-coded in, even in games where only a few objects are destructible.
 
The UE5 engine is the main issue, since it was built primarily for Fortnite, where almost everything in the game is coded to be destructible. Once you get a game with better graphics, performance tanks, because you still have that element of everything being destructible hard-coded in, even in games where only a few objects are destructible.
As far as I'm aware, UE5's default settings are generally tuned to work best for games like Fortnite and similar, not for all games. That's exactly why devs have tools and guides available to change it all as needed for their specific games (a rough illustration of what that looks like is below) - but in the huge majority of cases they just leave everything at the defaults and then say it's not them, it's the engine. That's just so bad in so many ways - they have development budgets of dozens if not hundreds of millions of USD and can't do simple things like that... :/
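For what it's worth, much of that per-project tuning lives in plain config rather than engine surgery. A minimal sketch, assuming UE 5.x console-variable names (these are real settings as far as I know, but they change between engine versions, and the values below are purely illustrative, not recommendations):

```ini
; DefaultEngine.ini - illustrative sketch only
[/Script/Engine.RendererSettings]
; dynamic global illumination: 1 = Lumen, 0 = off for a cheaper look
r.DynamicGlobalIlluminationMethod=1
; reflections: 1 = Lumen, 2 = screen-space reflections (much cheaper)
r.ReflectionMethod=2
; 0 = software Lumen, which mid-range cards handle far better than hardware RT
r.Lumen.HardwareRayTracing=0
; texture streaming pool in MB, sized to the GPUs the game actually targets
r.Streaming.PoolSize=3000
```

The point isn't these particular values; it's that the defaults are just one profile, and shipping them untouched is a choice.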
 
I doubt that the majority of people with 8GB cards will be rocking an 8-core/16-thread CPU. Sounds like an unbalanced system.
8c/16t can be rather cheap.

I can't find a way (or perhaps it can't be done in my configuration) to disable some cores on my CPU, but even without Hyper-Threading enabled (so 8c/8t) it seems to be fine, even while hitting 100% load.
Instead, you can see it more as a nicely multi-threaded, optimised engine.

 

"We also took the idea of ray tracing, not only to use it for visuals but also gameplay," Director of Engine Technology at id Software, Billy Khan, explains. "We can leverage it for things we haven't been able to do in the past, which is giving accurate hit detection. [In DOOM: The Dark Ages], we have complex materials, shaders, and surfaces."

"So when you fire your weapon, the heat detection would be able to tell if you're hitting a pixel that is leather sitting next to a pixel that is metal," Billy continues. "Before ray tracing, we couldn't distinguish between two pixels very easily, and we would pick one or the other because the materials were too complex. Ray tracing can do this on a per-pixel basis and showcase if you're hitting metal or even something that's fur. It makes the game more immersive, and you get that direct feedback as the player."

Ray Traced hit detection :D
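Purely to illustrate the idea Khan describes - this is a hypothetical sketch, not id Tech code; every name is made up, and a toy ray/sphere test stands in for the real acceleration-structure query. One ray per shot, and whichever surface material the ray actually hits drives the feedback:

```cpp
#include <cmath>
#include <cstdio>
#include <optional>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

enum class Material { Metal, Leather, Fur };

struct Sphere { Vec3 center; float radius; Material material; };
struct Hit { Material material; float t; };

// Toy stand-in for a hardware ray cast: a real engine would query the RT
// acceleration structure and read the material of the exact texel hit.
std::optional<Hit> trace_ray(Vec3 o, Vec3 d, const std::vector<Sphere>& scene) {
    std::optional<Hit> best;
    for (const auto& s : scene) {
        Vec3 oc = sub(o, s.center);
        float b = dot(oc, d);                   // d is assumed normalised
        float disc = b * b - (dot(oc, oc) - s.radius * s.radius);
        if (disc < 0) continue;                 // ray misses this surface
        float t = -b - std::sqrt(disc);         // nearest intersection
        if (t > 0 && (!best || t < best->t)) best = Hit{s.material, t};
    }
    return best;
}

int main() {
    std::vector<Sphere> scene = {
        {{0, 0, 5}, 1.0f, Material::Metal},     // metal plate next to...
        {{2, 0, 5}, 1.0f, Material::Leather},   // ...a leather pad
    };
    // One ray per shot: the material at the hit decides the feedback.
    if (auto hit = trace_ray({0, 0, 0}, {0, 0, 1}, scene)) {
        switch (hit->material) {
            case Material::Metal:   std::puts("metal: spark + ping"); break;
            case Material::Leather: std::puts("leather: dull thud");  break;
            case Material::Fur:     std::puts("fur: soft puff");      break;
        }
    }
    return 0;
}
```

The real thing obviously runs against the full scene on RT hardware, but the shape of the logic - cast, identify the hit material per pixel, react - is the same.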
 
Wow. That actually sounds like a potentially more functional use of RT. Going to be interesting to see where this could go.
Star Citizen, at some point, was going to use RT to detect the obstacles ahead of the character's feet in order to help with proper animations for traversing the world.

Anyway, with ballistic weapons at least, I'm curious whether there is a parabolic trajectory as there would be in real life, and not just a straight line (like the rays usually cast for RT).
 
Star Citizen, at some point, was going to use RT to detect the obstacles ahead of the character's feet in order to help with proper animations for traversing the world.

Anyway, with ballistic weapons at least, I'm curious whether there is a parabolic trajectory as there would be in real life, and not just a straight line (like the rays usually cast for RT).

Usually in a game, a ballistic object is just straight-line collision detection along its current direction vector, covering the distance from the previous frame to the current one based on its velocity (see the sketch below). For most objects, I doubt the accuracy error at a decent server update rate is large enough to matter in a game.

If you were doing accurate physical modelling, like real-world tracking and prediction of an object in space, it would probably be handled completely differently mathematically.
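A minimal sketch of that per-frame approach, assuming a 60 Hz tick and a toy "ground plane" standing in for the engine's real swept collision query (every name and number here is invented for illustration, not taken from any particular engine). Each frame's test is a straight segment, yet applying gravity every tick is exactly what produces the parabolic arc asked about above:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

struct Projectile {
    Vec3 pos;
    Vec3 vel;  // metres per second
};

// Hypothetical stand-in for the engine's segment sweep: returns true if
// anything blocks the straight line from `from` to `to`. Here it is just
// a toy ground plane at z = 0.
static bool sweep_hit(const Vec3& from, const Vec3& to) {
    (void)from;
    return to.z <= 0.0f;
}

// One tick: gravity first (semi-implicit Euler), then a straight-line
// collision check covering only the distance travelled this frame.
static bool step(Projectile& p, float dt) {
    p.vel.z += -9.81f * dt;
    Vec3 next = { p.pos.x + p.vel.x * dt,
                  p.pos.y + p.vel.y * dt,
                  p.pos.z + p.vel.z * dt };
    if (sweep_hit(p.pos, next)) return true;  // impact within this segment
    p.pos = next;  // the chain of straight segments approximates a parabola
    return false;
}

int main() {
    Projectile bullet{{0.0f, 0.0f, 1.5f}, {400.0f, 0.0f, 0.0f}};  // fired level
    const float dt = 1.0f / 60.0f;  // 60 Hz server tick
    for (int tick = 1; tick <= 600; ++tick) {
        if (step(bullet, dt)) {
            std::printf("ground hit after %.2f s, ~%.0f m downrange\n",
                        tick * dt, bullet.pos.x);
            return 0;
        }
    }
    std::puts("no impact within 10 s");
    return 0;
}
```

At 60 Hz the error from treating each segment as straight is tiny for game purposes, which is the point being made above; real-world tracking of an object in space would instead integrate or solve the trajectory over the whole arc.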
 