
What do gamers actually think about Ray-Tracing?

I turn TAA off in most games as it usually looks awful.
Problem is, you think you did but you didn't. In most modern games (RT or not) it's always on for many effects - if you mod such a game to turn it off, you get horrible artefacts like in the video I linked a bit higher. People might think they use DLAA or DLSS or no AA, but TAA is almost always there, on, eating detail and causing a blurry image, etc. That can only be fixed by optimising games better so they don't have to use much lower resolution effects and mask that with blurry TAA. Oddly, it seems to be really hard to explain that to some people who don't see it as an issue. :) RT/PT doesn't fix such problems, as it's not a problem with lighting.
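To illustrate why TAA tends to smear detail: at its core it's an exponential moving average of the current frame into an accumulated history buffer. A minimal sketch (function name and blend factor are illustrative, not from any particular engine):

```python
def taa_resolve(history, current, alpha=0.1):
    """Blend the current frame into the history buffer, per pixel.

    Exponential moving average: each new frame contributes only
    `alpha` of the final pixel, so high-frequency detail present in
    any single frame gets averaged away over roughly 1/alpha frames -
    which is the blur/ghosting people notice when TAA is forced on.
    """
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]
```

With a typical low alpha, a pixel that suddenly changes (fine detail, a moving edge) only drifts slowly toward its new value, which is exactly the softness being complained about above.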
 
(... 8GB)
Ok, first you lump everything into an 8GB problem - that's not how I put it. There are 8GB issues - very well documented and known by now, with multiple videos by Daniel O., HUB and other known channels, hence no need for me to give specific examples, as said channels have shown those in detail. By the way, there are also a few showing how easily a 3060 12GB beats a 4060 8GB in I.J. as well. That aside, there are also pure performance issues with PT on mainstream cards being mostly just unusable (unless you cut it down, but then it's not actually PT anymore, is it?). Not the same thing, but both issues affect mainstream GPUs.
TAA doesn't really matter, gamers don't care about graphics that much or else they'll buy a (sufficiently) faster hardware. :P
Ok, there's a big difference between cutting down details to improve performance on lower-end hardware and needlessly muddling details even on top hardware - neither is RT reliant. Fun fact, TI in his newest video dismantled frames of another game and commented on optimisations in some areas allowing an RT implementation without increasing frame time - there goes your (and others') theory of him not liking RT, when he clearly states he loves RT and wants to use it (where and when it makes sense). :)

To summarise, you seem to live in an odd state of denial, where you claim you don't want optimisations in games because... you want them to look muddy? And you seem to prefer them to require, for the same level of visuals, as high-end a GPU as possible. A very weird form of masochism, that, but you do you - I give up on talking sense to you. :)
 
Says someone who hasn't been paying any attention to anything for the last couple of years :p
 
IMO it is developers who aren't ready for it, not the hardware or software - pure path tracing with the most recent advances is a lot better than most people think, as pretty much any game that uses it shows off.
 
And here we go, this could basically be the end of the discussion debate crying arguing on RT vs Raster (and native vs upscaling for that matter) - As Linus says, we have now just lived through RT's infancy and seen the issues, but as of right now, the real gains are being seen and native rendering can essentially be forgotten because it's never going to be a "thing" again.

 
RT is the worst thing to happen to gaming in a long time. Hardware simply isn't ready for it and the "improvement" largely seems to be every surface covered in mercury puddles.

If your hardware can't handle it then disable RT. Then when you replay your old games in 5 or more years, you can enable RT and PT. I love going back to old games and enabling the settings I could only dream of when the game was new. It makes games future-proof instead of needing a remaster later on.
 
In what level are the issues with Indiana Jones? So far the Vatican is the most vRAM-hungry of the 3 I've experienced (Jungle, Marshall College and Vatican). Turning down settings and restarting the game will start it at lower vRAM usage (than just turning settings down/off), stay under 8GB (1080, DLSS Performance, FOV 105 and the rest set manually to low/off - some a bit higher) and run smooth. Now, it will have bursts of up to 300-400MB/s disk reads and regular 100-200MB/s, so you'd better have it on a fast drive. The 5800X3D is fully 100% loaded on all 8 cores and 16 threads in Task Manager (around 80+ with Riva Tuner), but the only moment it stutters is on auto save and perhaps a moment or two with the odd loading/shader/traversal/whatever. It does not stutter all the time. Worth mentioning that even with everything on low, the quality of some textures (in Indiana, of course) still stays rather high visually, perhaps akin to around high/ultra in older games.

Hogwarts is just Hogwarts; that one stutters mostly in the main castle no matter the hardware. Drop settings to stay under 8GB vRAM and you'll be mostly fine.

CB77 can also be kept within the 8GB and will be fine.

TAA doesn't really matter, gamers don't care about graphics that much or else they'll buy a (sufficiently) faster hardware. :P
I doubt that the majority of people with 8GB cards will be rocking an 8-core, 16-thread CPU. Sounds like an unbalanced system.
 
And here we go, this could basically be the end of the discussion debate crying arguing on RT vs Raster (and native vs upscaling for that matter) - As Linus says, we have now just lived through RT's infancy and seen the issues, but as of right now, the real gains are being seen and native rendering can essentially be forgotten because it's never going to be a "thing" again.


So you are getting a 5090 soon. As he concludes, it is not happening without AI :p
 
In games that use it well it's great... which is like 4-5 games? It's nice in Cyberpunk, Alan Wake or Metro EE, but in others like the Resident Evil games, Dead Space or Elden Ring there is no point even turning it on.
I suspect most future RT implementations will be half-assed and not worth using (like in most games now) until the next generation of consoles is here.
 
In games that use it well it's great... which is like 4-5 games? It's nice in Cyberpunk, Alan Wake or Metro EE, but in others like the Resident Evil games, Dead Space or Elden Ring there is no point even turning it on.
I suspect most future RT implementations will be half-assed and not worth using (like in most games now) until the next generation of consoles is here.
I've turned RT on in the RE4 remake and it looks great to me. Still hitting 80fps at 1080p.
 
Yes, RT is reflections-only in Res Evil, and it makes virtually no difference to framerate as a result.
 
And here we go, this could basically be the end of the discussion debate crying arguing on RT vs Raster (and native vs upscaling for that matter) - As Linus says, we have now just lived through RT's infancy and seen the issues, but as of right now, the real gains are being seen and native rendering can essentially be forgotten because it's never going to be a "thing" again.

I mean he’s not wrong; it’s the new way of the GPU and gaming world - the new norm, and one of the many methods now used to ‘advance’ visuals and graphical fidelity.
 