
The RT Related Games, Benchmarks, Software, Etc Thread.

Caporegime · Joined 17 Mar 2012 · Posts 48,262 · Location ARC-L1, Stanton System
You don't care about it though, else you'd have got an Nvidia card :p


:D

You're still trying? :D I came to AMD from three prior generations of Nvidia; they don't impress me.

Nvidia wasn't going to serve me better for RT; in my price range they are 'at best' no better, so what I might as well have is the one with the highest raster performance, and with that it blows the Nvidia GPU into the weeds.
 

mrk

Man of Honour · Joined 18 Oct 2002 · Posts 100,888 · Location South Coast
You asked the question, you got an answer. Also, raster perf is not really the winning metric you think it is in 2024; case in point, the entire RT vs screen space effects discussion above. Also, I see nothing being blown out of the water unless you only look at a couple of AMD-optimised titles like CoD, and even then it's not being blown into any weeds.

You don't care about RT, yet here we are, with you constantly trying to convince others why you don't care about it by going on about RT in an RT thread :D :D
 
Caporegime · Joined 17 Mar 2012 · Posts 48,262 · Location ARC-L1, Stanton System
You asked the question, you got an answer. Also, raster perf is not really the winning metric you think it is in 2024; case in point, the entire RT vs screen space effects discussion above. Also, I see nothing being blown out of the water unless you only look at a couple of AMD-optimised titles like CoD, and even then it's not being blown into any weeds.

You're not listening: the RT performance on the Nvidia GPU is 'at best' just as useless as it is on my AMD GPU, and be that as it may, the raster performance is two tiers above it, easily.

So even if I change my thinking to make raster performance completely irrelevant and think only about RT performance, Nvidia offers me nothing over what I have. At best.
 
Associate · Joined 28 Sep 2018 · Posts 2,296
Right, as is true still at this point for most cards with RT stickers on them, so why should any of us with RT-stickered cards care about RT?

Because, as with any tech, the top of the spending tree gets to experience it first. This develops experience and matures the ecosystem, and economies of scale eventually bring it down to the mass market.

A 55-inch LG OLED at launch was over 5k. https://uk.pcmag.com/migrated-54079-tvs/70262/lg-65eg9600

You can now get one sub-1k and it performs better. Tech always gets better; it has to start somewhere, and people have to be willing to pay for it. When it comes to RT, many of us are.

This isn't a new concept...

And you need competition. LG stagnated with WOLED innovation until Samsung delivered QD-OLED. We need the same from AMD and Intel on the RT front. Let's hope they can step up.
 
Caporegime · Joined 17 Mar 2012 · Posts 48,262 · Location ARC-L1, Stanton System
Because, as with any tech, the top of the spending tree gets to experience it first. This develops experience and matures the ecosystem, and economies of scale eventually bring it down to the mass market.

A 55-inch LG OLED at launch was over 5k. https://uk.pcmag.com/migrated-54079-tvs/70262/lg-65eg9600

You can now get one sub-1k and it performs better. Tech always gets better; it has to start somewhere, and people have to be willing to pay for it. When it comes to RT, many of us are.

This isn't a new concept...

And you need competition. LG stagnated with WOLED innovation until Samsung delivered QD-OLED. We need the same from AMD and Intel on the RT front. Let's hope they can step up.

It's a new concept for GPUs; one used to be able to use all the highest graphical features with decent frame rates in the mid-range.
 
Caporegime · Joined 17 Mar 2012 · Posts 48,262 · Location ARC-L1, Stanton System
The idea is to climb the greasy ladder and give us more of your money; some people champion that if they can get some perceived tribal point-scoring over the other tribe.

I don't know which part of that is worse, but either way it's a distraction from what is really going on.
 
Associate · Joined 28 Sep 2018 · Posts 2,296
It's a new concept for GPUs; one used to be able to use all the highest graphical features with decent frame rates in the mid-range.

It's not a new concept. AA and AF were similar for many years; AA still brings mid-tier and lower GPUs down today.

What you're struggling to understand is that GPU tech has been heavily stagnant for decades and relied on selling you high frame rates for the same tech.

Now we have a fundamental new tech leap with RT.
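For a sense of scale on the AA point above, here's a minimal back-of-envelope sketch in Python. It assumes uncompressed 4-byte colour samples; real GPUs use colour compression, so treat the numbers as worst-case upper bounds rather than measurements.

[code]
# Back-of-envelope MSAA framebuffer cost. Assumes uncompressed
# 4-byte-per-sample colour buffers; real GPUs compress, so these
# are worst-case upper bounds, not measurements.

def msaa_colour_buffer_mib(width, height, samples, bytes_per_sample=4):
    """Uncompressed colour-buffer size in MiB at a given MSAA level."""
    return width * height * samples * bytes_per_sample / 2**20

for samples in (1, 2, 4, 8):
    mib = msaa_colour_buffer_mib(2560, 1440, samples)
    print(f"{samples}x MSAA @ 1440p: ~{mib:.0f} MiB colour buffer")
# 1x ~14 MiB, 2x ~28 MiB, 4x ~56 MiB, 8x ~112 MiB, before the
# depth buffer is even counted
[/code]

The buffer size (and the bandwidth to resolve it) grows linearly with the sample count, which is roughly why 4x/8x MSAA still hurts on cards with narrower memory buses.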
 
Caporegime · Joined 17 Mar 2012 · Posts 48,262 · Location ARC-L1, Stanton System
It's not a new concept. AA and AF were similar for many years; AA still brings mid-tier and lower GPUs down today.

What you're struggling to understand is that GPU tech has been heavily stagnant for decades and relied on selling you high frame rates for the same tech.

Now we have a fundamental new tech leap with RT.

No it isn't. Have you only ever bought high-end GPUs? My GTX 1070 could run everything maxed out at the time, and so could the 970 before it.
 

G J

Associate · Joined 3 Oct 2008 · Posts 1,426
[image]
 
Caporegime · Joined 4 Jun 2009 · Posts 31,288
You're not listening: the RT performance on the Nvidia GPU is 'at best' just as useless as it is on my AMD GPU, and be that as it may, the raster performance is two tiers above it, easily.

So even if I change my thinking to make raster performance completely irrelevant and think only about RT performance, Nvidia offers me nothing over what I have. At best.

This is true to an extent, I'll give you that, when you look at CP 2077 maxed out with no upscaling etc., where fps is maybe 29 on Nvidia and 19 on AMD @ 4K, but there is a big difference:

- Just looking at a single fps metric is not really a great insight into actual perf. In frame latency etc., AMD tanks more when multiple RT effects are in use, and this is because Nvidia have dedicated hardware for it. Remember the video DF did on Dying Light 2, which first highlighted how AMD can actually do rather well as long as the resolution of the RT effects isn't too high (or you reduce the raster-based settings to free up more resources for RT), as long as there aren't shadows being cast from every light source, as long as reflections only happen on certain surfaces, and so on. Hence why AMD-sponsored RT titles to date have cut corners, and why AMD are probably changing their approach with RDNA 4; I wouldn't be surprised if they go a more dedicated-hardware route too.
- The upscaling on Nvidia is considerably better. Remember, you can literally use lower presets of DLSS and get significantly more perf AND better IQ than even FSR Quality in a number of games (a rough sketch of the preset maths follows after this list).
- AMD can't play Remix titles at all (which isn't surprising given it's made by Nvidia), but this is important if you ever do want to play any titles where Remix has been used, and not just from a perf POV but to avoid the graphical artifacts that happen with AMD GPUs.
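To make the preset point concrete, here's a minimal sketch using the commonly published per-axis scale factors for DLSS/FSR 2 modes (Quality ≈ 0.667, Balanced ≈ 0.58, Performance ≈ 0.5, Ultra Performance ≈ 0.333). Individual games can override these, so it's an approximation rather than a spec.

[code]
# Internal render resolution per upscaler preset at 4K output.
# Per-axis scale factors are the commonly published DLSS/FSR 2
# defaults; games can override them, so this is approximate.

NATIVE = (3840, 2160)

PRESETS = {
    "Native":            1.000,
    "Quality":           0.667,
    "Balanced":          0.580,
    "Performance":       0.500,
    "Ultra Performance": 0.333,
}

for name, scale in PRESETS.items():
    w, h = round(NATIVE[0] * scale), round(NATIVE[1] * scale)
    # Shaded pixels fall with the square of the axis scale, which is
    # where most of the performance headroom comes from.
    print(f"{name:>17}: {w}x{h} internal ({scale ** 2:.0%} of native pixels)")
[/code]

For what it's worth, the '480p render resolution' jab that comes up later in the thread only corresponds to Ultra Performance at a 1440p output; DLSS Quality at 4K still renders at roughly 1440p internally.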

It's not a new concept. AA and AF were similar for many years; AA still brings mid-tier and lower GPUs down today.

What you're struggling to understand is that GPU tech has been heavily stagnant for decades and relied on selling you high frame rates for the same tech.

Now we have a fundamental new tech leap with RT.

This has actually had me thinking about my previous GPUs, before RT and upscaling came along. I have usually owned mid to high end, more often mid-range, and for a long time I gamed at 60 Hz. I ended up sacrificing graphical settings far more often, and across a variety of areas, in order to achieve a locked 60, and then when I went to a 144 Hz display with that era of hardware and games, I ended up sacrificing even more settings to get closer to 100+ fps. Things like AA, shadows, post-processing, effects and lighting were always the first to be dropped, usually to medium, with the rest of the settings a mix of ultra but more often high. Now, with the 3080, and even gaming at a higher res than back in those days, I haven't really had to sacrifice any graphical settings (ignoring the absolute **** optimised games that came out, more so last year), including RT settings. Obviously upscaling has made this possible, but I thought this was quite interesting to reflect back on; in some ways, the standard of gaming now is actually rather good.
 
Caporegime · Joined 17 Mar 2012 · Posts 48,262 · Location ARC-L1, Stanton System
This is true to an extent, I'll give you that, when you look at CP 2077 maxed out with no upscaling etc., where fps is maybe 29 on Nvidia and 19 on AMD @ 4K, but there is a big difference:

- Just looking at a single fps metric is not really a great insight into actual perf. In frame latency etc., AMD tanks more when multiple RT effects are in use, and this is because Nvidia have dedicated hardware for it. Remember the video DF did on Dying Light 2, which first highlighted how AMD can actually do rather well as long as the resolution of the RT effects isn't too high (or you reduce the raster-based settings to free up more resources for RT), as long as there aren't shadows being cast from every light source, as long as reflections only happen on certain surfaces, and so on. Hence why AMD-sponsored RT titles to date have cut corners, and why AMD are probably changing their approach with RDNA 4; I wouldn't be surprised if they go a more dedicated-hardware route too.
- The upscaling on Nvidia is considerably better. Remember, you can literally use lower presets of DLSS and get significantly more perf AND better IQ than even FSR Quality in a number of games.
- AMD can't play Remix titles at all (which isn't surprising given it's made by Nvidia), but this is important if you ever do want to play any titles where Remix has been used, and not just from a perf POV but to avoid the graphical artifacts that happen with AMD GPUs.



This has actually had me thinking about my previous GPUs, before RT and upscaling came along. I have usually owned mid to high end, more often mid-range, and for a long time I gamed at 60 Hz. I ended up sacrificing graphical settings far more often, and across a variety of areas, in order to achieve a locked 60, and then when I went to a 144 Hz display with that era of hardware and games, I ended up sacrificing even more settings to get closer to 100+ fps. Things like AA, shadows, post-processing, effects and lighting were always the first to be dropped, usually to medium, with the rest of the settings a mix of ultra but more often high. Now, with the 3080, and even gaming at a higher res than back in those days, I haven't really had to sacrifice any graphical settings (ignoring the absolute **** optimised games that came out, more so last year), including RT settings. Obviously upscaling has made this possible, but I thought this was quite interesting to reflect back on; in some ways, the standard of gaming now is actually rather good.

You're right to say AMD lose more performance relative to raster the harder the RT gets. However, you have to push the RT to unplayable frame rates on Nvidia to produce numbers where Nvidia are significantly ahead, and often this is with GPUs that are more expensive. Pay attention to that <<<< shocker incoming.

This is not true for all Nvidia GPUs; once you get high up the food chain, AMD just can't keep up. A 4080S, for example, is much faster in RT than a 7900 XTX at perfectly playable frame rates. It is £100 more expensive, but for the performance difference you're getting in RT it's perfectly reasonable. You know.... in the £1,000 GPU range.
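To put that £100 argument into numbers, here's a toy perf-per-pound sketch. The prices are illustrative figures consistent with the '£100 more' claim above, and the RT fps values are invented placeholders purely to demonstrate the arithmetic; none of these are quotes or benchmark results.

[code]
# Toy perf-per-pound comparison. Prices are illustrative figures
# consistent with the "£100 more" claim above; the RT fps values
# are invented placeholders, NOT measurements.

cards = {
    #              (price_gbp, rt_fps placeholder)
    "RX 7900 XTX": (900, 60),
    "RTX 4080S":   (1000, 80),
}

for name, (price, fps) in cards.items():
    print(f"{name}: {fps / price * 1000:.0f} fps per £1,000")

# With these placeholders the 4080S costs ~11% more but delivers
# ~33% more RT fps, i.e. better RT perf-per-pound despite the premium.
[/code]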

OK, so let's look lower down the stack and use your example, because anything other than Cyberpunk is kinda fake RT, right? Only Cyberpunk is true RT. So, what do you get for around £450? RTX 4060 Ti vs RX 7800 XT: how do they compare at 1440p?

Oh dear.... so I'm guessing no RT for us? Then why should we care about RT?

[Image: Cyberpunk 2077 RT benchmark chart, RTX 4060 Ti vs RX 7800 XT at 1440p]


Right, fine, let's widen that out to something that isn't Cyberpunk, you know, all the kinda fake RT games...

Oh no, the narrative, it's gone. So I'm guessing this slide is kinda fake too?

[Image: multi-game RT benchmark average chart]
 
Caporegime · Joined 4 Jun 2009 · Posts 31,288
You're right to say AMD lose more performance relative to raster the harder the RT gets. However, you have to push the RT to unplayable frame rates on Nvidia to produce numbers where Nvidia are significantly ahead, and often this is with GPUs that are more expensive.

This is not true for all Nvidia GPUs; once you get high up the food chain, AMD just can't keep up. A 4080S, for example, is much faster in RT than a 7900 XTX at perfectly playable frame rates. It is £100 more expensive, but for the performance difference you're getting in RT it's perfectly reasonable. You know.... in the £1,000 GPU range.

OK, so let's look lower down the stack and use your example, because anything other than Cyberpunk is kinda fake RT, right? Only Cyberpunk is true RT. So, what do you get for around £450? RTX 4060 Ti vs RX 7800 XT: how do they compare at 1440p?

Oh dear.... so I'm guessing no RT for us? Then why should we care about RT?

Right, fine, let's widen that out to something that isn't Cyberpunk, you know, all the kinda fake RT games...

Oh no, the narrative, it's gone. So I'm guessing this slide is kinda fake too?

You ignored key points I made there, the main one being this:

- The upscaling on Nvidia is considerably better. Remember, you can literally use lower presets of DLSS and get significantly more perf AND better IQ than even FSR Quality in a number of games.

I'm not sure why you insist that people on either side are going to try and run RT maxed out with no upscaling in games such as CP 2077.
 
Caporegime · Joined 17 Mar 2012 · Posts 48,262 · Location ARC-L1, Stanton System
I don't think it's really a valid argument, 'but it works if you run the game at 480p render resolution', and the same applies to AMD as well as Nvidia, so it's not a counter-argument at all...

Why does every argument involving Nvidia distil down to DLSS?
 

mrk

Man of Honour · Joined 18 Oct 2002 · Posts 100,888 · Location South Coast
It's a completely valid argument, because DLSS upscaling is now often better quality than native, as all of the evidence shows, unless you choose to ignore objective evidence and only go with your own view.

The only modern PC release where this currently isn't the case is Ghost of Tsushima, where there is some depth-of-field flickering with DLSS that doesn't exist with FSR, but FSR is still more fizzled, whilst XeSS is blurry vs the clean and sharp DLSS. Obviously that's a porting bug, as the issue exists on PS5 as well.

At 1440p and above, DLSS upscaling produces better-than-native texture detail and output image; that is the fact of the matter today. The fact that you never mention this, and choose to ignore it and focus only on raster, paints a picture of your own narrative to try and dirty tech like RT, because it runs poorly at native res on all platforms when the GPU is below a certain threshold.
 
Caporegime · Joined 17 Mar 2012 · Posts 48,262 · Location ARC-L1, Stanton System
It's a completely valid argument, because DLSS upscaling is now often better quality than native, as all of the evidence shows, unless you choose to ignore objective evidence and only go with your own view.

The only modern PC release where this currently isn't the case is Ghost of Tsushima, where there is some depth-of-field flickering with DLSS that doesn't exist with FSR, but FSR is still more fizzled, whilst XeSS is blurry vs the clean and sharp DLSS. Obviously that's a porting bug, as the issue exists on PS5 as well.

At 1440p and above, DLSS upscaling produces better-than-native texture detail and output image; that is the fact of the matter today. The fact that you never mention this, and choose to ignore it and focus only on raster, paints a picture of your own narrative to try and dirty tech like RT, because it runs poorly at native res on all platforms when the GPU is below a certain threshold.

And there it is, because DLSS is better than native.

-Jensen Huang.

Well that's it then, a new religious text.
 

mrk

mrk

Man of Honour · Joined 18 Oct 2002 · Posts 100,888 · Location South Coast
Well yes, because it has been continually shown to be the case, or are you deciding to just ignore evidence again and again? Not in all games, but the vast majority, and now more so with the latest DLSS 3.7 DLL with preset E.

There is no marketing spin on this, it is simply fact, but you will choose to ignore facts just like you have done this entire thread lol.
 
Caporegime · Joined 4 Jun 2009 · Posts 31,288
I don't think it's really a valid argument, 'but it works if you run the game at 480p render resolution', and the same applies to AMD as well as Nvidia, so it's not a counter-argument at all...

Why does every argument involving Nvidia distil down to DLSS?

You even stated yourself that upscaling does matter...

I don't enjoy posting this, but if we want things to be better, as we all should, I feel like I have to.

I have an RX 7800 XT, and I have nothing but good things to say about the GPU; the hardware is great, and AMD's drivers are also great.

With that said, AMD's weak point has always been FSR compared to DLSS.

Before I start: CIG (Cloud Imperium Games) have developed their own upscaling tech for their game, TSR.
OK, so I don't know which version of FSR CIG are using, 1, 2 or 3; I hope it's not 1! 1 is bad, so I don't know how fair on AMD this actually is, but this is bad.... you can clearly see CIG's TSR is better. Look at the signs, the red text above the Cubby Blast door: see how unstable that is in FSR compared to CIG's TSR?

CIG have better upscaling tech for your GPUs, AMD; are you even trying?

CIG are working with AMD at the moment, on bugs with Vulkan in the driver, but also on future performance optimisations with Vulkan, and CIG are using AMD's white paper on ray tracing, not Nvidia's... thank _____! They have said this was a very deliberate decision.
I hope AMD can do a lot of work with CIG; this game has the potential to be absolutely massive, and I would like AMD to be involved with it.

Upscaling does matter, and you can do so much better, AMD; no more half-arsing it. Be as good at this as you know you can be.
:)



mrk has already covered the question.

And there it is, because DLSS is better than native.

-Jensen Huang.

Well that's it then, a new religious text.

ComputerBase, PCGamesHardware, TPU, Hardware Unboxed, OC3D, DF, Gamers Nexus and Daniel Owen have all stated this numerous times and pointed out where and when DLSS is better than native (as well as showing, at times, where native can be better than upscaling tech). Of course, I now suspect that all of these tech press are bought by Nvidia, right? Even though they have side-by-side comparisons showing evidence to back up their statements....

Well yes, because it has been continually shown to be the case, or are you deciding to just ignore evidence again and again? Not in all games, but the vast majority, and now more so with the latest DLSS 3.7 DLL with preset E.

There is no marketing spin on this, it is simply fact, but you will choose to ignore facts just like you have done this entire thread lol.

Just on this bit: it's not always necessary to switch presets and/or turn off post-processing effects such as DOF and motion blur, but regardless, I always turn these things off. That's usually the problem when some people point out issues with DLSS; it's usually because they are referring to an old version of DLSS and/or using post-processing effects such as DOF, CA, lens flare etc., which can impact the output quality.
 