AMD RDNA 4 thread

The 6950XT can be tamed very easily to around 260-280 watts with next to no performance loss, or to 200 watts with around an 8-10% performance loss. I would only recommend the 6950XT when it is around 25-30% cheaper than the 7900XT, which was the case when I bought mine. The price has to make sense of course, and pricing seems to differ a lot by region. In my country the 7900XT was still a lot more expensive last I looked.
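For anyone wondering what that power limiting buys in efficiency terms, here's a rough perf-per-watt comparison using the figures from the post above. The relative-performance numbers are that post's claims, not measurements, and 335 W is assumed as the reference-board power limit:

```python
# Rough perf-per-watt sketch for the 6950 XT power limits quoted above.
# Relative-performance figures are the post's claims, not measurements;
# 335 W is an assumed stock board power.
def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance (1.0 = stock) per watt of board power."""
    return relative_perf / watts

stock = perf_per_watt(1.00, 335)  # stock 6950 XT
tuned = perf_per_watt(0.99, 270)  # ~270 W, "next to no performance loss"
eco = perf_per_watt(0.91, 200)    # 200 W, ~9% performance loss

print(f"tuned vs stock: +{tuned / stock - 1:.0%} perf/W")
print(f"eco vs stock:   +{eco / stock - 1:.0%} perf/W")
```

Even on the post's own numbers, the 270 W limit is roughly a 20%+ efficiency gain, and the 200 W limit over 50%, which is why the card is so easy to recommend at a discount.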

Yes, the release price on 7900 XT made the 6950 XT EOL prices very good value.
 
Lots of BS rumours flying around lately, like AMD retreating from the high-end market and only going for midrange stuff. No solid information on it, just rumours bouncing from one person to another.

I'll believe it when AMD actually announce it; otherwise it's all hearsay.

I tend to believe it is true, but to be fair the same is true right now. They are not competing at the high end (the 4090); it's only from the upper mid range down that they compete with (and IMHO beat) Nvidia.

So if we end up with RDNA 4 simply being a 7900 XTX with better RT, better power efficiency and a lower price, then I would call that a win. Each time AMD compete at the high end, they still lose market share, so AMD are just asking "why bother". I mean, did you read Poneros' mind-numbingly uninformed post a few posts back? The one about Nvidia still having GPUs that are faster, more efficient AND with more and better features. It took all of 30 seconds to debunk that crap fest, yet this is the level of cult Nvidia that AMD have to try and get through to.
 
And that makes the 4080 better how?
It doesn’t make it better due to the daft pricing but shows just how far Nvidia are ahead in both RT and raster and that they could crush AMD if they wanted to. AMD are now back to where they were with RDNA1 where the 5700/XT were competing with Nvidia’s 70, the difference now is that names have changed and prices have doubled.
 
It doesn’t make it better due to the daft pricing but shows just how far Nvidia are ahead in both RT and raster and that they could crush AMD if they wanted to. AMD are now back to where they were with RDNA1 where the 5700/XT were competing with Nvidia’s 70, the difference now is that names have changed and prices have doubled.
The 7900XT is a 7800XT with 30-40% more uplift than last gen and you can get it for £699 now- I think that’s a good deal. You couldn’t actually get a 6800XT in the UK for less than £800 2.5 years ago.

What do you want AMD to do? Release a 7900xt for £450 and XTX for £600 to entice you away from Nvidia?

You literally want them to give it away with no intention of purchasing their product to make Nvidia drop their prices lol

You are in cloud land mate.
 
It doesn’t make it better due to the daft pricing but shows just how far Nvidia are ahead in both RT and raster and that they could crush AMD if they wanted to. AMD are now back to where they were with RDNA1 where the 5700/XT were competing with Nvidia’s 70, the difference now is that names have changed and prices have doubled.

You keep posting this but only ever look at actual specs rather than the price/perf. It doesn't matter how technically good Nvidia is compared to AMD if price vs price the AMD GPU is better value. If Nvidia are just going to keep pricing the same way they are, then you need to suck it up. We all know the only reason YOU want AMD to drop prices is to force Nvidia to sell you a 4080 at £700. Well good luck with that because Nvidia aren't budging on price. From now on your £600 will get you a xx60 series GPU and you will like it :D

AMD moved up one tier (6800 > 7900) on price and perf
Nvidia moved two tiers. (3080 > 4080) on price and perf
 
The 7900XT is a 7800XT with 30-40% more uplift than last gen and you can get it for £699 now- I think that’s a good deal. You couldn’t actually get a 6800XT in the UK for less than £800 2.5 years ago.

What do you want AMD to do? Release a 7900xt for £450 and XTX for £600 to entice you away from Nvidia?

You literally want them to give it away with no intention of purchasing their product to make Nvidia drop their prices lol

Something like that is what it would take to flip the market, I'd wager.

The logic here is that the 7900xtx is an 80-series competitor, and the 80 series (like the RTX 2080 and 3080) was sold at $699. Ignore the current 4080 pricing; it's too high and generates far fewer sales than the 3080 because of it.

So now we say the 7900xtx can be $699, like the 80 series was and should be. But then Nvidia fans still won't buy it, because they will say Nvidia has more features and better RT. So you need to cut the price a bit more, to $599 or $649. At $599 the 7900xtx would start to make Nvidia owners think very hard about switching.
 
It doesn’t make it better due to the daft pricing but shows just how far Nvidia are ahead in both RT and raster and that they could crush AMD if they wanted to. AMD are now back to where they were with RDNA1 where the 5700/XT were competing with Nvidia’s 70, the difference now is that names have changed and prices have doubled.

When you have 90% marketshare and you can charge 20% more for a similar product you have a lot more money for R&D.

If the rumours of AMD dropping out of the high end (again) are true, then that's why: they don't have the money to keep up.
 
When you have 90% marketshare and you can charge 20% more for a similar product you have a lot more money for R&D.

If the rumours of AMD dropping out of the high end (again) are true, then that's why: they don't have the money to keep up.
Sometimes it’s best to accept that you can’t keep up and stick to what you’re good at: the DIY CPU, enterprise CPU, console/APU and mid-range DIY GPU markets.

Nvidia has none of those except DIY GPU and enterprise machine learning/AI.
 
When you have 90% marketshare and you can charge 20% more for a similar product you have a lot more money for R&D.

If the rumours of AMD dropping out of the high end (again) are true, then that's why: they don't have the money to keep up.


I don't think money is nearly as big a factor as people think. Intel has far more cash it can put into R&D, yet its graphics cards can't come close to AMD's, so the correlation between how much money you have for R&D and the performance of your GPU isn't 1:1.

And according to MLID, AMD isn't dropping out of the high end; it's just that there won't be any GPUs using RDNA 4 that compete with Nvidia's 90-series card next gen, possibly the 80 series too. That's not because AMD is giving up, but because they haven't been able to resolve the issues with scaling chiplets up to more powerful GPUs, and that's a temporary issue until the chiplet problems are sorted out. It essentially relates to things AMD has previously disclosed, such as that interview with PCWorld a while back where the AMD guy said one of the issues they have is that the bandwidth of the interconnect between the chiplet dies needs to become 1000 times faster than it is on RDNA 3 in order to keep making faster gaming GPUs with chiplets.

It may be that by RDNA 5 AMD has come up with new solutions for chiplet GPUs and they come back with a beast, or maybe they switch back to monolithic GPUs, but either way AMD isn't choosing to stop making high-end GPUs. That's what the leakers say, anyway.
 
Sometimes it’s best to accept that you can’t keep up and stick to what you’re good at- DIY CPU and Enterprise CPU, console/APU and mid range DIY GPU markets.

Nvidia has none of those except diy GPU and Enterprise machine learning/AI

The RX 580 was one of AMD's most successful GPUs: it didn't cost the earth to R&D, it was inexpensive to make, and it provided the performance most people wanted at a low cost.

That is what AMD are good at. They struggle to make ##90-class competitors; I have no doubt they could do it if they threw all the money in the world at the problem, but it's money they would never recover because they can't sell enough of them, even at a 50% price reduction vs those ##90-class cards.

I don't think money is nearly as big a factor as people think. Intel has far more cash it can put into R&D, yet its graphics cards can't come close to AMD's, so the correlation between how much money you have for R&D and the performance of your GPU isn't 1:1.

And according to MLID, AMD isn't dropping out of the high end; it's just that there won't be any GPUs using RDNA 4 that compete with Nvidia's 90-series card next gen, possibly the 80 series too. That's not because AMD is giving up, but because they haven't been able to resolve the issues with scaling chiplets up to more powerful GPUs, and that's a temporary issue until the chiplet problems are sorted out. It essentially relates to things AMD has previously disclosed, such as that interview with PCWorld a while back where the AMD guy said one of the issues they have is that the bandwidth of the interconnect between the chiplet dies needs to become 1000 times faster than it is on RDNA 3 in order to keep making faster gaming GPUs with chiplets.

AMD has more cash than Intel, they just don't have anything like as much cash as Nvidia.

Edit: Intel borrow money every quarter to stay afloat; AMD don't borrow at all, and they don't need to.
 
Imagine the coil whine with a 3.5ghz clock...
**** the whine, imagine the heat!!!!!!!!


 
Are you comparing the same tier and/or price points? For example 4070Ti vs 7900 XT.
Yes
Price here in UK right now. 7900 XT is 10% cheaper.
4070Ti £770
7900 XT £700
Cool, 10%. Now let's see what 10% gives us:
- Efficiency (you say 10%, let's go with that, though in fact thanks to DLSS 2 & 3 and a frame cap, Lovelace has a HUGE advantage over RDNA3, which actually regressed majorly on this front; RDNA2 actually won over Ampere). Being generous, let's say that adds up to only a 50W advantage for 4hrs a day of gaming for the 4070 Ti (between better idle/video and in-gameplay numbers), so that comes to around £44 over a 2-year cycle. Of course in reality people keep such GPUs more like 4 years (two gens) and end up playing more hours, or at least accumulate more consumption from an idle PC or light tasks, where it also has a further advantage.

- DLSS & DLAA, so let's put the DLSS advantage at 10% performance boost (relative to image quality; in reality, particularly below 4K Quality, it would trounce FSR even harder), then DLSS exists in many more games, and DLAA is a super quality option that simply has no rival from AMD.

- DLSS 3, which has more limited use now but offers you an option simply unavailable to AMD AND goes past bottlenecks which are sometimes undefeatable by more raw grunt (so even future AMD GPUs will not catch up here, f.ex. in CPU limited scenarios, particularly as a result of software, like what we've seen in Jedi Survivor)

- Actually usable raytracing performance (esp. for pathtracing) seeing up to TWICE the performance.

- Some games/projects only have RT/PT on Nvidia and not on AMD/Intel (think of all the pathtraced mods for oldies like Serious Sam etc.), and hell even big AAA projects saw RT completely absent for AMD at launch (it took months for me to finally try RT in CP2077 on my 6800, and now we saw even Ratchet & Clank skip it for AMD at launch).

- Niche older features, and I add this because I personally care: Ansel (a great photo mode that can do some insane things like super hi-res screenshots, and the only one available in some games, like The Surge), PhysX, HFTS in Division 1, VXAO in RotR, etc.
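As a side note, the £44 running-cost figure in the efficiency point above checks out with basic arithmetic if you assume a UK tariff of roughly 30p/kWh (my assumption; no tariff was stated in the post):

```python
# Sanity check of the £44 running-cost estimate above: a 50 W power
# advantage, 4 hours of gaming a day, over a 2-year ownership cycle.
# The 30p/kWh tariff is an assumption; the post doesn't state one.
watts_saved = 50
hours_per_day = 4
days = 2 * 365
price_per_kwh = 0.30  # GBP per kWh, assumed UK tariff

kwh_saved = watts_saved * hours_per_day * days / 1000  # energy in kWh
cost_saved = kwh_saved * price_per_kwh
print(f"{kwh_saved:.0f} kWh over two years -> £{cost_saved:.2f}")
```

That lands on about £44, and the figure scales linearly, so 4 years of ownership or longer sessions roughly doubles it.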

The power efficiency difference is maybe 10% at best in real terms gaming at 4K.
Hardly a massive difference considering the 7900 XT has 8GB more VRAM to power
And that's its only selling point other than the discount, but then ofc you sell a worse product for less - you have to. Clearly they thought better of it at the start but then got smacked back to reality and had to do serious discounts.

and is a faster GPU.
Nope.
Features are… well fake frames.
Fake frames, but real differences. More importantly, with Nvidia you can choose if you care, with AMD you have to pretend not to.

Because AMD can do RT
Sometimes. But never when you need it the most.

and when you compare tier vs tier the “wins” for Nvidia AMD are a joke.
FTFY. A 7900 XT has its best wins in stuff like COD, where it's 25% faster than the 4070 Ti, but the 4070 Ti is already >120 fps (1440p). The kind of win the 4070 Ti has, on the other hand, is in stuff like CP2077 Overdrive, where it can do 60 fps with DLSS P + FG, while the 7900 XT is struggling to crawl above 14 fps with FSR P (at 4K). Such pathetic results can't even be discussed.

Ooooh Nvidia is 40% faster in extreme RT games like CP2077.
142% (not including DLSS & FG advantage, then 300%+). Also, more importantly, playable on Nvidia compared to unplayable on AMD. Which really makes the gain closer to infinite. :p

4070Ti at 4K ultra - 23 FPS
7900 XT ultra - 17 FPS
That's a 35% lead. Better than the 7900 XT's best win.

Both need up-scaling
Ok? Nvidia has the clearly better upscaling. One more win for the 4070 Ti!

and reduced settings to become playable in extreme RT
That's an oxymoron.
and when you add upscaling that lead for the 4070Ti drops from 40% to about 12% or 15%.
Wrong, it increases, because the upscaling is better, and the lower the mode or resolution, the bigger the lead.

But in normal RT games you would need to enable the FPS counter to tell the difference.
Oh great, "normal RT games" now is it? I love semantics.

In raster games the 7900 XT turns the tables and is in general ~15% faster and comes with a lot more VRAM.
Actually, it's more like <10% sadly, and that's not accounting for the DLSS advantage (even in non-FG).
So it’s not just the 6700XT that makes sense, because a 7900 XT is cheaper than the 4070Ti, is faster in the majority of games, has a similar power efficiency per frame, and has much more VRAM. Then you can also add the fact it has a far more modern and feature-packed control panel software suite.

I think I typed a lot of words to say, you are talking crap.
 
Part 2, because apparently the forum won't let me post otherwise.

So it’s not just the 6700XT that makes sense because a 7900 XT is cheaper than the 4070Ti,
Cheaper, but barely, and not in the long run.
is faster in the majority of games,
False.

has a similar power efficiency per frame
Worse.

and has much more VRAM.
True, but not relevant atm.

Then you can also add the fact it has a far more modern and feature packed control panel software suite.
Slicker, for sure, but functionally it doesn't have an advantage and certainly not feature-wise.

I think I typed a lot of words to say, you are talking crap.
That's ok, I wish it were crap too. Nonetheless, arguing on the internet is fun so it's worth it in the end.

 