
What do gamers actually think about Ray-Tracing?

[attached image]
You've spelled Remix wrong :))
 
I've only tried it in a couple of games as I only have a 2070 (had a 3070). Doom Eternal looked good I guess? I turned it on in a few games to see it, but wasn't really "wowed" by it tbh. I'd much prefer the FPS over shiny, but I also tend to play games pretty competitively.
 
RT does look nice. It probably improves the graphics by at least 10%. However, that extra 10% means the difference between spending £200 on a graphics card and spending £600 on a graphics card. Around 3x the amount. I wouldn't even spend 3x the amount decorating my house to get it looking 10% better. I wouldn't spend 3x the amount on a car that looks 10% better either. So why are people happy to spend it on a GPU? I wish more money could be spent on gameplay, storytelling, character development, testing/bug-fixing. I get the impression ray-tracing is a higher priority.

Most gamers are saying that graphics cards are too expensive - Nvidia and AMD are milking us etc. The transition from 1080Ti to 2080Ti was when most of the moaning about prices started. The transition from 1080Ti to 2080Ti is also when raytracing in games started. That is the price you pay for ray-tracing.

So when were you truly happier? When the GTX 10 series was released or now?
 
RT does look nice. It probably improves the graphics by at least 10%. However, that extra 10% means the difference between spending £200 on a graphics card and spending £600 on a graphics card. Around 3x the amount. I wouldn't even spend 3x the amount decorating my house to get it looking 10% better. I wouldn't spend 3x the amount on a car that looks 10% better either. So why are people happy to spend it on a GPU? I wish more money could be spent on gameplay, storytelling, character development, testing/bug-fixing. I get the impression ray-tracing is a higher priority.

Most gamers are saying that graphics cards are too expensive - Nvidia and AMD are milking us etc. The transition from 1080Ti to 2080Ti was when most of the moaning about prices started. The transition from 1080Ti to 2080Ti is also when raytracing in games started. That is the price you pay for ray-tracing.

So when were you truly happier? When the GTX 10 series was released or now?

Now.
 
The problem is for RT to really shine even the 4090 is not enough and they are still lol pounds. Are we really trying to say a 12GB 4070 Ti at almost £600 is a great deal? Crowing over how good those barely even mid range GPUs are priced now? It is barely faster than a 3080 that was £640 over two years earlier. Let that sink in before patting yourself on the back for such a “great deal”. Apologies TNA don’t mean to seem like a dick here, just pointing out some reality as I see it in hindsight.

I have a 4080 and turning on all the RT in CP 2077 to see what I’m missing has not once made me think “wow”. My 4080 cost me £900 used and I still think it was overpriced and that RT is nice but not even close to a game changer.

I also got a 7900 XT at £640 last summer and at that price and time I can just about think it was decent. It certainly puts the used price of my 4080 to shame because I get the exact same experience at 4K in the vast majority of games for a lot less money.

So all in all I don’t think this is a good time for PC gaming because the entry price for the best experience is an absolute joke. The fact that we as a PC gaming community are at the point where we are lauding mid-range GPUs at historically top-end prices is indicative of the problem.
 
The problem is for RT to really shine even the 4090 is not enough and they are still lol pounds. Are we really trying to say a 12GB 4070 Ti at almost £600 is a great deal? Crowing over how good those barely even mid range GPUs are priced now?

I have a 4080 and turning on all the RT in CP 2077 to see what I’m missing has not once made me think “wow”. My 4080 cost me £900 used and I still think it was overpriced.

So all in all I don’t think this is a good time for PC gaming because the entry price for the best experience is an absolute joke. The fact anyone states with a straight face that a 4070-level GPU is a great deal at almost £600 is indicative of the problem.

Didn't you also have a 7900xtx? If so, why did you stick with the 4080?
 
The problem is for RT to really shine even the 4090 is not enough and they are still lol pounds. Are we really trying to say a 12GB 4070 Ti at almost £600 is a great deal? Crowing over how good those barely even mid range GPUs are priced now? It is barely faster than a 3080 that was £640 over two years earlier. Let that sink in before patting yourself on the back for such a “great deal”. Apologies TNA don’t mean to seem like a dick here, just pointing out some reality as I see it in hindsight.

Haha. No problem.

To be clear I am saying at £575 it is a great deal in the current market. You can find many posts of mine slamming current pricing from both Nvidia and AMD.

The reality is, it is what it is.
 
The 4090 is enough for ray tracing at up to 4K. This is proven by... every game (that isn't a poor PC release) with RT on a 4090. Frame Gen is not needed here either.

It is not enough for path tracing at 4K, as the baseline pre-frame-gen fps is too low for a higher-than-60fps experience to not feel like there is mouse latency at this res. At 1440p or ultrawide 3440x1440, though, a 4090 is more than enough for path tracing with all the bells and whistles turned on, getting a low-latency 100fps+ with Frame Gen enabled.
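To put the latency point in rough numbers, here's a quick back-of-the-envelope sketch (my own illustration with made-up fps figures, using the simplification that perceived input latency tracks the rendered base frame rate rather than the generated one):

```python
# Rough sketch: frame generation roughly doubles the displayed frame rate,
# but input is still sampled at the rendered (base) frame rate, so perceived
# mouse latency tracks the base frame time. All numbers are illustrative.

def frame_gen_summary(base_fps: float, gen_factor: float = 2.0) -> str:
    displayed_fps = base_fps * gen_factor      # what the fps counter shows
    base_frame_time_ms = 1000.0 / base_fps     # latency roughly follows this
    return (f"base {base_fps:.0f} fps -> ~{displayed_fps:.0f} fps displayed, "
            f"input still on ~{base_frame_time_ms:.1f} ms frames")

print(frame_gen_summary(35))   # e.g. a 4K path tracing scenario: sluggish base
print(frame_gen_summary(60))   # e.g. a 1440p path tracing scenario: snappy base
```

So a Frame Gen "100fps+" built on a ~60fps base feels very different from one built on a ~35fps base, which is the distinction being made above.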
 
Sometimes it's just straight awful unfortunately. This can't help how (poorly) it's perceived.


People really need to get their eyes tested. The difference is noticeable and way better with RT. On my 3080 at 3440x1440, fps goes from 138 to 80/90 with RT maxed in Diablo 4. Also, in that video he is using DLAA, not DLSS; DLSS provides a better-than-native image in Diablo 4.

The fact that people aren't noticing all the raster artifacts/issues and calling Horizon Forbidden West one of the best visual experiences ever shows how bad people's eyesight is, or rather that they simply don't know what to look for and are too accustomed to raster.

e.g. [embedded examples]

And that's not even the worst bits :cry: It's like seeing all the glue/strings that hold a scene together when raster methods are used.

Avatar poos all over Horizon Forbidden West.
 
Aside from adding nothing notable to the visuals, the 4080 is barely cracking 40fps with RT enabled at times. I can't imagine anyone in their right mind choosing to play a game like Diablo that way.

With DLSS instead of DLAA, it would be more than playable and also a better experience than native:

In Diablo IV, both DLSS and FSR make a really good showing, because both techniques sidestep the usual problems while delivering the expected advantages in full. As a result, in the "Quality" setting DLSS and FSR can resolve the details of one object or another somewhat more finely than native resolution, and apart from that show identical image sharpness at higher FPS. Only with the Performance preset does quality drop minimally to slightly, but it remains at a high level.

The image stability is also better with DLSS and FSR than with native resolution. No matter whether Ultra HD or Full HD is the target resolution, or whether the quality or performance mode is used: the temporal upsampling from AMD and Nvidia delivers a calmer image than the game's own TAA. In UHD, the result can be described as virtually flicker-free in both cases.

DLSS and FSR already perform very well compared to native resolution; compared at the same render resolution (native resolution vs. target resolution), both technologies are far superior: Ultra HD with FSR/DLSS set to "Performance" (i.e. rendered in Full HD before upsampling) looks much better than native Full HD, which is blurred and flickers on top of that. Even against native WQHD, upsampling comes out ahead.
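For reference, here's what those presets mean in internal render resolution terms (a quick sketch assuming the standard per-axis scale factors for DLSS 2/3 and FSR 2; individual games can override these defaults):

```python
# Standard per-axis scale factors for DLSS 2/3 and FSR 2 presets
# (assumed defaults; games can deviate from them).
PRESETS = {
    "Quality":           2 / 3,   # ~66.7% per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

def render_resolution(target_w, target_h, preset):
    scale = PRESETS[preset]
    return round(target_w * scale), round(target_h * scale)

# 4K (Ultra HD) target: "Performance" renders at 1920x1080, i.e. Full HD,
# before upsampling -- the exact comparison the quoted review is making.
for name in PRESETS:
    w, h = render_resolution(3840, 2160, name)
    print(f"{name:>17}: {w}x{h}")
```

DLAA, by contrast, is the same pipeline with the scale pinned at 1.0 (native resolution), which is why it carries the image quality benefit but none of the fps benefit.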

 
The fps is because of DLAA being used with 4K/poor RT; if DLSS were used (assuming the latest dll is swapped in) then the fps would be much higher with virtually no change in image quality, especially in a game like that. The game also implements RT reflections and shadows poorly, since these are two of the easiest effects to render without much of a performance hit. For a proper look at a game that uses all of these RT effects to good effect and is similarly themed, check out Witcher 3 Next Gen. It's only RTGI that has the big fps impact.

If the game had RTGI then I could understand that kind of fps hit, but the only logical answer is that it was just splashed in as an afterthought rather than implemented with optimisation in mind.
 
The fps is because of DLAA being used with RT; if DLSS were used (assuming the latest dll is swapped in) then the fps would be much higher with virtually no change in image quality, especially in a game like that. The game also implements RT reflections and shadows poorly, since these are two of the easiest effects to render without much of a performance hit. For a proper look at a game that uses all of these RT effects to good effect and is similarly themed, check out Witcher 3 Next Gen. It's only RTGI that has the big fps impact.

If the game had RTGI then I could understand that kind of fps hit, but the only logical answer is that it was just splashed in as an afterthought rather than implemented with optimisation in mind.
As above, IQ is better with DLSS and even with FSR.

I've played a bit and the game is noticeably better with RT on; the game's raster implementation was already very good though, which isn't surprising given the style of game it is. It reminds me of Riftbreaker somewhat (which is one of my fav games for RT), but yes, generally RT isn't going to be as noticeable in such games when compared to games like CP 2077 and so on, for a number of reasons.

Also, just a note, RT GI doesn't always kill performance, e.g. see The Finals:

 
The Finals is one exception, as the RTXGI it uses is probe based, so you still get some of the quirks of screen-space effects, like light leakage. DF's video on the tech behind The Finals from months ago showed that really well; there's also a long latency between cause and effect, so it's saving a big fps cost by cutting those corners to keep the fps high even with the GI enabled^^
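For anyone curious what "probe based" means in practice, here's a toy sketch of the general idea (my own simplified illustration, not the actual RTXGI or The Finals code): a sparse grid of probes each trace only a handful of rays per frame and blend the result into a stored value, and shading interpolates between probes. The temporal blend is what keeps it cheap, and it's also why lighting reacts with a delay and can leak.

```python
import random

# Toy illustration of probe-based GI (not the real RTXGI implementation).
HYSTERESIS = 0.97        # fraction of last frame's stored value that is kept
RAYS_PER_PROBE = 4       # tiny per-frame ray budget per probe

def trace_ray(probe_pos):
    # Stand-in for a real ray trace returning incoming radiance.
    return random.uniform(0.0, 1.0)

def update_probe(probe_pos, stored_irradiance):
    sample = sum(trace_ray(probe_pos) for _ in range(RAYS_PER_PROBE)) / RAYS_PER_PROBE
    # Blend slowly toward the new sample: very cheap per frame, but a light
    # switching on takes many frames to show up (the latency mentioned above).
    return HYSTERESIS * stored_irradiance + (1.0 - HYSTERESIS) * sample

def shade(point, probes):
    # Real systems interpolate over a 3D probe grid with visibility weighting;
    # a naive nearest-probe lookup like this is also why light can "leak"
    # through thin geometry between a bright probe and a dark interior.
    nearest = min(probes, key=lambda p: abs(p["pos"] - point))
    return nearest["irradiance"]

probes = [{"pos": float(i), "irradiance": 0.0} for i in range(8)]
for frame in range(120):                       # simulate ~2 seconds at 60 fps
    for p in probes:
        p["irradiance"] = update_probe(p["pos"], p["irradiance"])
print("shaded value near probe 3:", round(shade(3.2, probes), 3))
```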
 
People really need to get their eyes tested. The difference is noticeable and way better with RT. On my 3080 at 3440x1440, fps goes from 138 to 80/90 with RT maxed in Diablo 4. Also, in that video he is using DLAA, not DLSS; DLSS provides a better-than-native image in Diablo 4.

The fact that people aren't noticing all the raster artifacts/issues and calling Horizon Forbidden West one of the best visual experiences ever shows how bad people's eyesight is, or rather that they simply don't know what to look for and are too accustomed to raster.

e.g. [embedded examples]

And that's not even the worst bits :cry: It's like seeing all the glue/strings that hold a scene together when raster methods are used.

Avatar poos all over Horizon Forbidden West.
I clicked around the 11 to 12 second mark of the Diablo 4 video repeatedly, and in some ways I need to concede that RT has some visual benefits with the water, but some things are not as good.
 