
What do gamers actually think about Ray-Tracing?

The 7900 XTX is similar in RT to a 3090. Yes, I know the Ada cards are ahead of it, but is the 3090 seen as a BAD RT card now?

No, of course not. Though the thing to remember is they need to be seen to be improving, so getting to current Ada (4070 Ti/4080) level in RT is a necessity if they're to be seen as staying relevant.
 
No, of course not. Though the thing to remember is they need to be seen to be improving, so getting to current Ada (4070 Ti/4080) level in RT is a necessity if they're to be seen as staying relevant.
I understand that, but the way people talk it's as if AMD can't do RT at all and it's pointless to consider AMD for RT in any capacity.

It just seems massively overblown to me. If you're a huge advocate of RT and need it, then sure, buy a 4080 or 4090. If you're looking at it as a nice extra, then a 7900 XT/X will do fine too.
 
I understand that, but the way people talk it's as if AMD can't do RT at all and it's pointless to consider AMD for RT in any capacity.

It just seems massively overblown to me. If you're a huge advocate of RT and need it, then sure, buy a 4080 or 4090. If you're looking at it as a nice extra, then a 7900 XT/X will do fine too.

Personally I'm fine with no RT and everything done via raster, apart from RT reflections; they're just smexy!

But AMD need their RT performance to be a lot better than it currently is. Right now it's decent, but it needs to be better if they're to continue to compete.
 
The issue is not that they make good products; I don't think anyone can say otherwise. The issue is how they got to the point of being able to spend billions on R&D every year: paying computer companies like Dell, IBM, Lenovo, Asus etc. not to put AMD GPUs in their systems and laptops, which is now well documented. Various anti-competitive, shady practices are in part how they grew their market share, with various professionals in the industry now speaking up against them; it's why the US courts just issued a subpoena against Nvidia over a laundry list of shady practices.

AMD have a LOT of shortfalls that, unless they do a 180, will lead to them dropping out of the GPU business. I don't defend any company, as that's the sign of someone not worth listening to, but many on this forum, and on many others, like to brush Nvidia's very shady road to success under the carpet and then base their entire online persona around a company that couldn't give a rat's rear end if they live or die.

I used to care about that and hence preferred AMD. But as you get older you just get to a point where it's like, who gives ****. AMD don't give a ****, Nvidia certainly doesn't, so why should I?

I just buy what I fancy and say it how I see it.
 
Personally I'm fine with no RT and everything done via raster, apart from RT reflections; they're just smexy!

But AMD need their RT performance to be a lot better than it currently is. Right now it's decent, but it needs to be better if they're to continue to compete.
I agree they need to improve, and Nvidia will almost certainly improve too, but it's not impossible to use RT on AMD cards, as a lot of people seem to believe.
 
I understand that, but the way people talk it's as if AMD can't do RT at all and it's pointless to consider AMD for RT in any capacity.

It just seems massively overblown to me. If you're a huge advocate of RT and need it, then sure, buy a 4080 or 4090. If you're looking at it as a nice extra, then a 7900 XT/X will do fine too.

Yes, I agree, though if AMD stagnate while Nvidia keep improving RT performance then AMD become irrelevant. The level of RT that is acceptable will keep moving, and AMD need to keep moving with it.
 
Yes, I agree, though if AMD stagnate while Nvidia keep improving RT performance then AMD become irrelevant. The level of RT that is acceptable will keep moving.
100%. I don't think they will stagnate though. RDNA3 was a significant improvement over RDNA2, so there's no reason to think RDNA4 won't be another improvement.

I just fully expect that even if AMD somehow match the 4090 for RT with RDNA4, it'll be irrelevant again because the 5090 is faster.
 
Wow, what a load of self-righteous drivel.

Pot, meet kettle. You, Gpueurilla and ICDP are just as bad. You guys are front and centre in the GPU tribalism crap that Dicehunter was talking about.

Laying the smack down :D

[GIF: The Rock delivering the Rock Bottom]
 
100%. I don't think they will stagnate though. RDNA3 was a significant improvement over RDNA2, so there's no reason to think RDNA4 won't be another improvement.

I just fully expect that even if AMD somehow match the 4090 for RT with RDNA4, it'll be irrelevant again because the 5090 is faster.

Apologies, I thought you were saying the 3090 level of RT is good enough to stay at.
 
Apologies, I thought you were saying the 3090 level of RT is good enough to stay at.
No worries mate. I'm under no illusion that they are fine where they are; I just don't like that AMD cards are seen as pointless for RT because they're not the fastest.

Any progress is good progress and progress is made most when there's competition.
 
Problem is, it's not just RT alone now like it used to be; we now have add-ons/extensions that improve RT: Ray Reconstruction, ReSTIR GI, RTXDI, SER, etc. The first one is especially important, as it takes ray-traced denoising from noisy to clean, which is most obvious in Unreal Engine 5 now due to the way Lumen works. On top of that, Nvidia helps any dev and provides all the tools free to add this stuff into their game engines. On the flip side, AMD have been said to just go "here is FSR3, it's open source, you can make it work", or words to that effect, based on past dev chatter online and interviews.

This currently works on any RTX card, not just the 40 series. So yes, whilst a 7900 XTX can match a 3080 Ti for base RT, it can't match it for clean RT due to the above. AMD need to pull their finger out and come up with a similar method to do the same thing, perhaps even using the same approach with their own RT hardware.
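For anyone wondering why ray tracing is "noisy" in the first place: real-time RT can only afford a handful of rays per pixel, and the lighting value you get from so few rays is a noisy Monte Carlo estimate that a denoiser (or Nvidia's Ray Reconstruction, which swaps the hand-tuned denoiser for a learned model) has to clean up. A rough illustrative sketch in plain Python, not how any engine actually implements it, of why fewer rays mean more noise:

```python
import random
import statistics

def pixel_estimate(true_radiance, n_samples, rng):
    """Monte Carlo estimate of one pixel's lighting from n_samples noisy ray samples."""
    # Each "ray" returns the true radiance plus zero-mean noise
    samples = [true_radiance + rng.uniform(-1.0, 1.0) for _ in range(n_samples)]
    return sum(samples) / n_samples

def noise_level(n_samples, trials=2000, seed=0):
    """Standard deviation of the pixel estimate across many independent trials."""
    rng = random.Random(seed)
    estimates = [pixel_estimate(0.5, n_samples, rng) for _ in range(trials)]
    return statistics.stdev(estimates)

# Error falls roughly as 1/sqrt(n): 4x the rays only halves the noise.
for n in (1, 4, 16, 64):
    print(n, round(noise_level(n), 3))
```

That 1/sqrt(n) fall-off is the whole problem: brute-forcing clean RT with more rays gets expensive very quickly, which is why good denoising matters as much as raw RT throughput.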
 
Problem is, it's not just RT alone now like it used to be; we now have add-ons/extensions that improve RT: Ray Reconstruction, ReSTIR GI, RTXDI, SER, etc. The first one is especially important, as it takes ray-traced denoising from noisy to clean, which is most obvious in Unreal Engine 5 now due to the way Lumen works.

This currently works on any RTX card, not just the 40 series. So yes, whilst a 7900 XTX can match a 3080 Ti for base RT, it can't match it for clean RT due to the above. AMD need to pull their finger out and come up with a similar method to do the same thing, perhaps even using the same approach with their own RT hardware.

Issue with that is, I highly doubt all the developers of Nvidia-sponsored, RT-heavy titles will go back and patch their games to work with an AMD equivalent if AMD got their own out, or even with some type of translation layer so they could run it.
 
No worries mate. I'm under no illusion that they are fine where they are; I just don't like that AMD cards are seen as pointless for RT because they're not the fastest.

Any progress is good progress and progress is made most when there's competition.

Yes, I know what you mean. It's like the people who go Nvidia because they are better at RT and end up with a 4060 Ti 8GB. This is despite it being worse than a 7700 XT in almost every way, apart from access to DLSS and being marginally faster in heavy RT games. Though considering both are practically worthless in RT, that's a low bar.

Though most reviewers will acknowledge that AMD are the best bang for buck at all tiers below a 7900 GRE.

Mind share is a big reason for this disparity.
 
The issue is not that they make good products; I don't think anyone can say otherwise. The issue is how they got to the point of being able to spend billions on R&D every year: paying computer companies like Dell, IBM, Lenovo, Asus etc. not to put AMD GPUs in their systems and laptops, which is now well documented. Various anti-competitive, shady practices are in part how they grew their market share, with various professionals in the industry now speaking up against them; it's why the US courts just issued a subpoena against Nvidia over a laundry list of shady practices.

AMD have a LOT of shortfalls that, unless they do a 180, will lead to them dropping out of the GPU business. I don't defend any company, as that's the sign of someone not worth listening to, but many on this forum, and on many others, like to brush Nvidia's very shady road to success under the carpet and then base their entire online persona around a company that couldn't give a rat's rear end if they live or die.
I get this, but I'd be surprised if any company the size of Nvidia, Intel or AMD were completely clean of stuff like this, and of throwing their weight around when they can to gain an advantage. You can't really avoid these companies if you want to buy modern things, so it seems pointless to start drawing arbitrary lines in the sand. I mean, if there was human trafficking or some such, that would be different.

Personally I'm fine with no RT and everything done via raster, apart from RT reflections; they're just smexy!

But AMD need their RT performance to be a lot better than it currently is. Right now it's decent, but it needs to be better if they're to continue to compete.
To be fair, if either side's RT was decent, surely people wouldn't be saying "it's not worth the performance hit", as there wouldn't be a sizeable performance hit. I think there's still a long way to go for all of them; some might just be a little closer than others. But then, where's the incentive to improve if the internet is filled with people saying they're not bothered about RT and will just turn it off?

Issue with that is, I highly doubt all the developers of Nvidia-sponsored, RT-heavy titles will go back and patch their games to work with an AMD equivalent if AMD got their own out, or even with some type of translation layer so they could run it.
Maybe if AMD got stuff out first, rather than following behind Nvidia by a year or so, they wouldn't need to convert things over to AMD? So many things lately that Nvidia do first and then, a while later, AMD release something similar.
AMD released Mantle, and Nvidia didn't have anything like it. Admittedly AMD gave up on Mantle, but even in those early stages some companies were implementing it. Would there have been the interest in Mantle that there was if Nvidia had released a superior low-level API 18 months earlier?
 
Problem is, it's not just RT alone now like it used to be; we now have add-ons/extensions that improve RT: Ray Reconstruction, ReSTIR GI, RTXDI, SER, etc. The first one is especially important, as it takes ray-traced denoising from noisy to clean, which is most obvious in Unreal Engine 5 now due to the way Lumen works. On top of that, Nvidia helps any dev and provides all the tools free to add this stuff into their game engines. On the flip side, AMD have been said to just go "here is FSR3, it's open source, you can make it work", or words to that effect, based on past dev chatter online and interviews.

This currently works on any RTX card, not just the 40 series. So yes, whilst a 7900 XTX can match a 3080 Ti for base RT, it can't match it for clean RT due to the above. AMD need to pull their finger out and come up with a similar method to do the same thing, perhaps even using the same approach with their own RT hardware.
I can't argue with this, but honestly the visuals in RT (without upscaling or anything like that taken into account) seem very similar in any reviews or comparisons I have seen.

I won't ever dispute that Nvidia is the RT king, but I still believe AMD cards are worth considering if RT isn't the single most important thing to you.
 
Yes, I know what you mean. It's like the people who go Nvidia because they are better at RT and end up with a 4060 Ti 8GB. This is despite it being worse than a 7700 XT in almost every way, apart from access to DLSS and being marginally faster in heavy RT games. Though considering both are practically worthless in RT, that's a low bar.

Though most reviewers will acknowledge that AMD are the best bang for buck at all tiers below a 7900 GRE.

Mind share is a big reason for this disparity.
ANYTHING Nvidia does is better. Look at what mrk just said: tools, software, etc. Have you ever heard of those? From your POV, or should I say the "delusion" you are living in, the answer is clearly no!
Sad to say it, but AMD just sucks. "Our open source yada yada... here, please dev, take it and make it work." You just have to do ALL the work yourself, as we ourselves don't know how it works!

You AMD fans... oh my!!
 
The 7900 XTX is similar in RT to a 3090. Yes, I know the Ada cards are ahead of it, but is the 3090 seen as a BAD RT card now?

Again, no one has ever said that. The problem is that it's competing with a four-year-old GPU, and sometimes even losing to a four-year-old GPU in RT games, and yet this current-gen AMD GPU cost £1,100+ at release? That is the problem. We then have cases where certain titles don't get RT support on AMD at launch, for whatever reason, which later gets fixed by the devs and/or AMD in a patch. Also, if you want to play any RTX Remix games, AMD is a no-go due to graphical artifacts and the lack of any performance optimisation, and then of course you have what mrk pointed out around Ray Reconstruction.

i.e. essentially AMD need to compete in the here and now, not with four-year-old tech.
 