
What do gamers actually think about Ray-Tracing?

Hmm, shame. The account isn't perma'd, so he must have just got yeeted from the GPU section. Arguing with other posters aside, he kept the RT threads interesting; now it's pretty much all whinging and complaining about it.

Indeed. I do think the RT sections in particular have suffered for it. But it is done now, and my understanding is there's no undoing it.

Personally I would like to see Nexus18 and LtMatt return. Sure, they may have made the mods' lives hard at times, but they also added a lot to the forum.

Maybe @mods we could put a poll up and vote for it?

Poll:

Should Nexus18 and LtMatt have their GPU privileges reinstated?

Yes
No


Probably won't happen though.
 

I don't like seeing permabans; everyone should have the opportunity for redemption, especially if they are long-term regulars. I agree both of them also added a lot to the forum. Look... the history between me and Nexus18 is well documented, and I vote bring him back, 100% bring him back, bring them both back. Give them another chance.
 

Exactly this. I have had arguments with you too and can find some of the things you say bizarre at times. Would I find it funny if you got banned? Yes, for a second, but would I want you perma banned? Nah. This place would get boring without the regulars imo.

Just look at Kaapstad. His leaving has been a loss. I enjoyed his threads.

Permas should be reserved for people posting scams or other weird material imo. Others should just get longer and longer bans if they break the rules; that would be enough.

My opinion anyway.
 
+1 to the return of LtMatt and Nexus18 (the latter I didn't know about: his ban has probably cost this forum hundreds of posts already!). Yes, I understand why they were frustrating for the mods, but a sin-bin for x amount of time would surely be enough rather than an outright ban?

I agree that they both brought a lot to the forum.
 
The last game I've seen with full RT and good enough performance to justify it, along with no accompanying issues (lagging GI, noise all over the place, a very blurry image caused by TAA on most effects, etc.), was Metro LL - that IS a benchmark for me of what's possible. More modern games just turn on RT/PT and that's it. Are they technically better? No, in most cases it's a regression in my eyes - there's no better physics, no better AI, no more realistic RT effects, there are just much higher system requirements. Did RT suddenly become more computationally expensive, or is it a case of lazy devs using an off-the-shelf solution without actually doing any work themselves?

From NVIDIA's article about said Metro: "Ordinarily, this level of detail would require gigabytes of system memory and GPU VRAM, but thanks to a highly efficient streaming system Last Light’s world uses less than 4GB of memory, and less than 2GB of VRAM, even at 2560x1440 with every setting enabled and maxed out." I don't think I need to add more here.

We're not moving forward with graphical fidelity, we're moving forward on the laziness of devs, with people like DF excusing it instead of pushing for better development and blaming gamers for having hardware that's too weak. The audacity of that, when we already know current hardware is fast enough to handle it, just horribly utilised. Then the 5k series arrives and what will we get? Even lower FPS in the next games, with the same overall fidelity, I can bet. Because 30FPS is more cinematic? :)
Can you imagine running a modern game looking like Metro LL on 2GB of VRAM? :)
Metro LL is the 2nd installment; only Metro Exodus has RT, and the Enhanced Edition is the one with the best implementation.
At the time Metro LL launched, almost all cards were 1GB and 2GB cards (I think only the 7950 and 7970 had 3GB) and the OG Titan had 6GB, but that didn't matter anyway. In today's terms it would be around 12-16GB of vRAM usage if you compare it to what pool size is available. 2GB now is irrelevant, so the question would be what today's games can do in the 10-16GB area at 1440p, and the answer is... plenty.
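A rough back-of-envelope of that scaling, assuming VRAM budgets grow roughly in line with the typical enthusiast pool of the day (the pool sizes below are illustrative picks, not the poster's exact figures):

```python
# Metro's <2GB of VRAM against a ~3GB enthusiast pool of its day, scaled to today's pools.
metro_usage_gb = 2.0
typical_pool_then_gb = 3.0           # HD 7970 / GTX 780-era enthusiast cards
typical_pools_now_gb = [16.0, 24.0]  # e.g. 16GB and 24GB cards today

share_of_pool = metro_usage_gb / typical_pool_then_gb   # ~0.67

for pool in typical_pools_now_gb:
    print(f"~{share_of_pool:.0%} of a {pool:.0f}GB card is about {share_of_pool * pool:.0f}GB")
# -> ~67% of a 16GB card is about 11GB; ~67% of a 24GB card is about 16GB,
#    which lands roughly in the 12-16GB ballpark mentioned above.
```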
 
Metro LL is the 2nd installment; only Metro Exodus has RT, and the Enhanced Edition is the one with the best implementation.

You're absolutely right, I was thinking of Exodus the whole time while writing LL - totally my bad here. My point about the RT implementation stands, though - it's still my benchmark of what is possible with sensible performance and quality.
 
Unreal Engine and other baked deferred rendering game engines strike again with their Vaseline slop
That's one of the main complaints I have, and it stems directly from the fact that current GPUs are nowhere near fast enough for full PT - you have to take a lot of shortcuts, which results in horrible noise and blurriness, plus lag, ghosting etc. in movement, all caused by temporal accumulation. This is one of the reasons, I think, that Jensen H. himself said GPUs aren't and can't be fast enough unless they also use various AI on top to correct everything. We've already seen a bit of that, and I expect even more to be shown with the 5k series. The problem is that hardly any games use what we already have, so many more years will pass before we see it more widespread.
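A minimal sketch of why that temporal accumulation produces the lag and ghosting described above, using a simple exponential history blend (an illustration only, not any particular engine's code; all constants here are assumptions):

```python
import numpy as np

# Each pixel is an exponential moving average of very noisy per-frame samples.
rng = np.random.default_rng(0)
true_signal = np.concatenate([np.zeros(30), np.ones(30)])  # lighting suddenly changes at frame 30
alpha = 0.05                                               # low blend weight: strong denoise, slow response

history = 0.0
accumulated = []
for target in true_signal:
    noisy_sample = target + rng.normal(0.0, 0.5)           # 1-2 rays per pixel -> huge per-frame noise
    history = (1.0 - alpha) * history + alpha * noisy_sample
    accumulated.append(history)

# It takes on the order of 1/alpha (~20) frames to converge after the change: that delay
# is the "lagging GI"/ghosting, and raising alpha just brings the raw noise back.
print(f"value 10 frames after the change: {accumulated[40]:.2f} (target is 1.0)")
```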
 
Poor quality characters? That's nonsense.
I'm talking about it in motion; who gives a **** about screenshots. And that's the issue: the uncanny valley effect is more pronounced even though "on paper" they're higher quality. The reality though is that they're nowhere near the quality of The Last of Us 2 during cutscenes, and that was made years ago on far worse hardware. That IJTGC holds quality constant between gameplay and cutscenes isn't a plus to me, because I'd rather they juice it up for cutscenes so it looks more convincing instead of uncanny (and the closer you get to realistic, the quicker that effect comes into play). Hell, Xbox's very own Halo already proved this choice to be better with what they did for Halo 5 cutscenes. In fact, even if we go for open-world first-person games, we already have Cyberpunk doing a better job with its main characters (in gameplay too!), and given the vastly greater scope of an open world it could have been excused for not being top tier. For cutscenes alone there are also the Spider-Man games doing great in terms of animation (though worse on character quality). The list can go on and on.

This lesson was learned many years ago: if you can't deliver really convincing realistic characters (because you can't afford to mo-cap all of them), then go for a stylised approach instead and let the brain fill in the gaps.

The worst part for IJTGC is the side-by-side with the original, which they made the mistake of aping one-for-one, where you can see just how stilted the animations are and how they can't express emotions like the human actors obviously can:


If we're talking about the best, then unfortunately Machine Games + Bethesda simply isn't making the cut.

Unfortunately I don't have my own captures but these gifs give you an idea.

 
Indeed. I do think the RT sections in particular have suffered for it.
Makes sense now after seeing RT spammed in the pc games section.
 
None of what you said about it in motion is observed here. Watch any of the videos posted by myself or others and see for yourself. You are imagining an issue that isn't there.

RT off for me
You cannot turn RT off in any UE5 game, nor in some more recent games not using UE.
 
I have to say, I accidentally turned on Ray Reconstruction in CP2077 once and I thought my GPU was broken. The boiling, grain and ghosting were awful.
Yep, it seems to be one of the AI algorithms that's actually a must for any sensible visual quality with full PT. Still nearly absent from other games, sadly. That said, there are many modern games not using full RT which have similar issues with noise, because the effects they implemented are just of really bad quality - like Black Ops 6. That's the main reason I never finished it: so much noise and sizzling everywhere, and the game is often in dark areas, so it's very visible. An awful visual thing to me!

That said, HUB pretty much started the video by summarising what I tend to repeat - GPUs need to become MUCH faster for RT/PT implementations to actually look good, as even NVIDIA's best AI solution to date (RR) is far from perfect, and it's not even present in most games yet.
 
My graphics settings reset and it turned RR on with just regular RT, not PT. I'm on a 3080 so PT isn't viable in the slightest. Turned it off and everything looked ok again. Took a good bit of googling though!

Agree with the last statement. Nvidia need to stop trying to nickel-and-dime gamers and have almost a 'reset' generation where mid-range GPUs can catch up in terms of hardware RT performance, because at the moment the few games that have PT are barely playable even on the flagship, and this is unlikely to change.

Unless the 8800xt turns out to be a PT monster... heh.
 
Unless the 8800xt turns out to be a PT monster... heh.

It won't, and it's simple - current GPUs manage only a few rays per pixel, and that's the best that's currently possible. To not have any of the noise issues you need over 1000 rays per pixel at minimum. That's not going to happen with GPUs anytime soon (it would need a few hundred times faster PT processing). AI will have to step in instead if they want to make it viable. Somehow. Currently RR seems to be the best thing there is, but even that is struggling a lot. It's like trying to create a proper 4K image whilst upscaling from 240p - it just can't happen until the input resolution is considerably higher.
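A toy Monte Carlo example of why the ray count matters so much: per-pixel noise only falls with the square root of the sample count, so a couple of rays per pixel is nowhere near a 1000-ray reference (illustrative numbers, not from any real renderer):

```python
import numpy as np

rng = np.random.default_rng(0)

def pixel_noise(samples_per_pixel, trials=2000):
    # Each ray returns a random radiance contribution; the pixel is their mean.
    estimates = rng.random((trials, samples_per_pixel)).mean(axis=1)
    return estimates.std()   # frame-to-frame noise at this sample count

for spp in (1, 4, 64, 1000):
    print(f"{spp:>4} rays/pixel -> noise ~{pixel_noise(spp):.4f}")
# Noise at 1 ray/pixel is roughly 30x that at 1000 rays/pixel (sqrt(1000) ~ 32),
# which is why real-time renderers lean so hard on denoisers and temporal reuse.
```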
 
It won't, and it's simple - current GPUs manage only a few rays per pixel, and that's the best that's currently possible.
Right, makes sense.

Hardware RT performance still needs to catch up though, and it is possible to close that gap given the leaks regarding the 5090. There's a massive artificial gap between that and the 5080. Old Jensen could be charitable and make the 5080 the 5070, and there would still be plenty of space between a proper 5080 and the 5090 for Tis and Supers and whatever mid-cycle refresh.
 
To not have any of the noise issues you need over 1000 rays per pixel at minimum.


I don't even think engines scale that high; I've seen mention of being able to set 4 rays and bounces per pixel - can any game engine do 1000?

And AMD's recommended setting for its GPUs is still: 0.5 or 1 rays and bounces per pixel and a 0.25 resolution scale factor.

To do 1000 rays and bounces per pixel at a 1.0 resolution scale factor you would need a GPU probably 2000 to 4000 times faster than a 7900XTX - this will never happen in our lifetime unless we are able to game on quantum computers, or a cloud-based aggregation solution, or you can fake it with AI.
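A naive back-of-envelope behind a figure in that region, assuming cost scales linearly with rays per pixel and with pixel count; how the 0.25 resolution scale factor should be read (fraction of pixels vs per-axis scale) is my guess:

```python
target_rays_per_pixel = 1000.0
current_rays_per_pixel = (0.5, 1.0)   # AMD's recommended range quoted above
current_resolution_scale = 0.25       # recommended resolution scale factor

for rays in current_rays_per_pixel:
    ray_factor = target_rays_per_pixel / rays
    with_pixels = ray_factor / current_resolution_scale  # if 0.25 means a quarter of the pixels
    print(f"{rays} rays/px today: {ray_factor:.0f}x from rays alone, "
          f"~{with_pixels:.0f}x if the resolution term also counts")
# Either reading gives a multiplier in the thousands, the same order of magnitude
# as the 2000-4000x estimate above.
```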
 
I don't even think engines scale that high; I've seen mention of being able to set 4 rays and bounces per pixel - can any game engine do 1000?

And AMD's recommended setting for its GPUs is still: 0.5 or 1 rays and bounces per pixel and a 0.25 resolution scale factor.
Recommended for performance reasons, yes. Obviously we'll never get 1k per pixel in real time; that's not even feasible with any imaginary tech coming in the next 10-20 years, considering the limits of physics. But 10+ rays per pixel at a proper resolution could be enough for the AI to push it forward with sensible results, without so much temporal accumulation needed (and all the issues that come with it). Offline rendering engines like Blender advise that 64-128 samples per pixel are good enough for a quick preview (still very noisy) and 500-1000 for the final image; for actually good production quality it should be even over 2000. It amazes me that some around here imagine we can get anywhere near comparable quality with just a few rays per pixel, even with all the fancy denoising and AI included. Something has to give; one can't cheat physics (but one can self-delude).

I still don't see how the 5090 could be a few times faster in RT calculations than the 4090, for example. And even if it were, that would likely not carry over to low-to-mid range GPUs (xx60 class) either. Unless you go for generative AI that just makes up stuff roughly resembling what the engine spits out - but do we really want that to happen?

To do 1000 rays and bounces per pixel at a 1.0 resolution scale factor you would need a GPU probably 2000 to 4000 times faster than a 7900XTX - this will never happen in our lifetime unless we are able to game on quantum computers, or a cloud-based aggregation solution, or you can fake it with AI.
Exactly, which is most likely one of the reasons NVIDIA's CEO underlines that it's not going to happen by brute force; it will have to be something AI-based instead. But the main point is, we're not there yet - far from it. There are a great many visual issues, and one can argue (as I do) that we likely took a wrong turn, pushed by marketing down this blind alley instead of thinking it through properly first. Not baked-in lighting (it really isn't black and white like that!) but likely something voxel-based - like NVIDIA promoted before RTX times - which already passes lighting accuracy tests (just as RT does) but doesn't have such huge performance and image quality hits. In other words, a smarter approach, not trying to brute-force it like this.
 