Any news on the 7800 XT?

Imagine if AMD launched 1-2 months before Nvidia, with a 7900XT being the 7900XTX for £850 and a 7800XT being the 7900XT for $650. They would have sold like hot cakes and been OOS for months; who cares what Nvidia was going to do or how they would price their product? A company that always waits for their competitor to move first has zero confidence in their own product, so why should consumers have confidence in them?...

From memory I don't think they always launched second (including the ATi era)...? But I get your point.
 

They launch after Nvidia now to maximise pricing.

The pricing of either company's products has stopped me upgrading, though. I just cannot justify spending around £700 for a midrange card, and the current details on the 7800XT don't look much better than the 6800XT. Worst case I was hoping for at least 6800XT performance with significantly better efficiency, but the specs are not suggesting that is the case if 800W is the PSU requirement :(

It's a joke - it looks rubbish because AMD has pushed Navi 32 up to the RX7800XT. It should be the RX7700XT. It's what Nvidia did with AD104, i.e. the RTX4070.

The only way the RX7800XT looks even a bit OK is if the core clocks close to 3GHz, so it can clear the RX6800XT easily. But what I think they will do is sell it as RX6800XT level performance at an RX6800 level price. The RX7800XT is really a sub-£500 card tarted up to look more expensive.

It makes me wonder whether both have an informal agreement:

The RTX4070 should have been the RTX4060TI at the very minimum and the RX7800XT should have been the RX7700XT. Now look at the performance increase:

The RTX4060TI would have been 45% faster than an RTX3060TI and had a 50% jump in VRAM. The RX7700XT (at RX6800XT level performance) would be 43% faster.

The generational uplift is there, but they want you to pay more for less. It's shrinkflation.
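To put a rough number on the shrinkflation point, here's a back-of-the-envelope sketch (a toy calculation, not a benchmark: the 45% uplift is the figure quoted above, and the $399/$599 prices are the US launch MSRPs of the RTX3060TI and RTX4070; UK street prices were further apart):

```python
# Back-of-the-envelope "shrinkflation" check: generational performance
# uplift vs price uplift. Illustrative numbers only.

def value_change(perf_uplift, old_price, new_price):
    """Change in performance-per-dollar, as a fraction."""
    old_value = 1.0 / old_price                  # baseline perf = 1.0
    new_value = (1.0 + perf_uplift) / new_price  # faster card, higher price
    return new_value / old_value - 1.0

print(f"{value_change(0.45, 399, 599):+.1%}")  # -3.4%: faster, but no better value
```

On those inputs, performance per dollar actually drops slightly: the generation got faster but no cheaper per frame, which is the complaint in a nutshell.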
 
If DLSS looks better because it's using more transistors as you say, then why does XeSS running on AMD also look better than FSR?
XeSS running on non-Arc cards has a much smaller performance uplift though, compared both to XeSS running on Arc and to FSR/DLSS. It does often offer better image quality than FSR, but it also struggles to do its job of increasing the frame rate. Even Balanced XeSS performs worse than Quality FSR/DLSS on AMD/Nvidia cards. It's fine if you only need a little bump to stabilise at a locked 60fps or something, but it's not much use for transforming your experience the way FSR/DLSS often can. It's a trade-off at best and useful to a smaller selection of users than FSR, which can really help people on older cards run new games at any sort of acceptable level. XeSS is also useless for RDNA 1 and older on the AMD side, since those cards don't support DP4a instructions and it performs worse than native in its INT24 fallback mode. It wouldn't look good for AMD if their own upscaling technology was a disaster on their own hardware from 2019.
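For anyone wondering what the DP4a point means in practice: DP4a is a single GPU instruction that computes a dot product of four 8-bit integers and accumulates the result into a 32-bit integer, the kind of operation XeSS's non-XMX path leans on for its network inference. A minimal Python illustration of what one such instruction does (purely illustrative, not Intel's implementation):

```python
def dp4a(a, b, acc):
    """Model of the DP4a instruction: dot product of four signed 8-bit
    integers, accumulated into a 32-bit integer, in a single hardware op."""
    assert len(a) == len(b) == 4
    assert all(-128 <= x <= 127 for x in list(a) + list(b))  # int8 range
    return acc + sum(x * y for x, y in zip(a, b))

# One instruction replaces four multiplies and four adds. Cards without
# DP4a (e.g. RDNA 1) have to emulate this, which is why the fallback
# path ends up slower than just rendering at native resolution.
print(dp4a([1, 2, 3, 4], [5, 6, 7, 8], acc=0))  # 70
```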
 
Never mind all the renaming of tiers.
Look at these leaked die shots of all RDNA3:
[image: leaked die shots of the RDNA3 lineup]

Note how the 7900 GRE's die looks a fair bit different to the full Navi 31.
Ignoring the monolith, we were wondering where the cost savings from going multi-die were going.

We knew it wasn't going to us consumers.

We assumed it was going to AMD's coffers.

However, if the GRE shows us how they can take a 384-bit chip and sell it as pin-compatible with a previous 256-bit chip (one of the GRE rumours is that it fits in Navi 21 PCBs), then the multi-die approach really seems to be about something else.

I think this is AMD finding a way to continue to sell negligible volumes but being able to afford to because they can re-use the design.

Reduce and re-use is fine, but ultimately AMD's strategy seems to be to not bother but still turn out some parts at high (per-unit) margins. Overall profits, volume, market share? Who cares.

Yes, in the grand scheme of things, even for Ryzen a monolith has advantages when the dies are not crazily large. That is when going chiplet is about design costs and re-usability and less about die yields.

And if your intention is to almost purposefully go for low volumes, then going all-out chiplet makes sense.
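A toy yield model shows why the small shared dies help. All inputs here are placeholder assumptions (the defect density especially; real foundry numbers aren't public), with the die areas taken from figures quoted elsewhere in the thread:

```python
import math

def die_yield(area_mm2, defects_per_mm2):
    """Simple Poisson yield model: probability a die is defect-free."""
    return math.exp(-area_mm2 * defects_per_mm2)

D0 = 0.001            # defects/mm^2 -- assumed, for illustration only
mono = 530            # hypothetical monolithic Navi 31 equivalent, mm^2
gcd, mcd = 304, 37.5  # Navi 31-style GCD/MCD split, mm^2

print(f"monolithic: {die_yield(mono, D0):.0%}")  # ~59% good dies
print(f"GCD:        {die_yield(gcd, D0):.0%}")   # ~74%
print(f"MCD:        {die_yield(mcd, D0):.0%}")   # ~96%
# The tiny MCD almost never comes out defective, and the same MCD design
# is shared across Navi 31 and Navi 32 -- the re-use argument above.
```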
 
One has to ask why AMD is using Navi 32 as the RX7800XT, which has fewer shaders than the previous generation Navi 21? Why not use a cut-down Navi 31?

Who is "forcing" AMD to follow Nvidia in jacking up midrange chips to high level pricing? Navi 22 used 335MM2 of 7NM silicon. Navi 32 only uses 200MM2 of 5NM silicon and each MCD is 37.5MM2. AMD could easily make something as fast as an RTX4070 for well under £500.People keep forgetting the RTX4070 was 40~45% faster than an RTX3060TI for 60% more money.It should have been a card well under £450.

Instead they price way too near to Nvidia, so Nvidia's "value added features" become a consideration. But they might give you some more VRAM. So another technical victory.

It's like saying one energy provider being 2% cheaper than the other is a deal when energy prices have been jacked up massively already.

I'm going to assume the rumours are true that AMD thought RDNA3 was faster per core than it actually is, and the reason it's taken so long to release is that they tried to get the performance back through drivers.

Or they just thought, #### it, let's make a crap 7800XT, because reasons.
 
XeSS running on non-Arc cards has a much smaller performance uplift though, compared both to XeSS running on Arc and to FSR/DLSS. It does often offer better image quality than FSR, but it also struggles to do its job of increasing the frame rate. Even Balanced XeSS performs worse than Quality FSR/DLSS on AMD/Nvidia cards. It's fine if you only need a little bump to stabilise at a locked 60fps or something, but it's not much use for transforming your experience the way FSR/DLSS often can. It's a trade-off at best and useful to a smaller selection of users than FSR, which can really help people on older cards run new games at any sort of acceptable level. XeSS is also useless for RDNA 1 and older on the AMD side, since those cards don't support DP4a instructions and it performs worse than native in its INT24 fallback mode. It wouldn't look good for AMD if their own upscaling technology was a disaster on their own hardware from 2019.

In that case it's probably running at a higher base resolution.
 

The RX7800XT will use Navi 32, which is the replacement for Navi 22 used in the RX6700XT. The RX7800XT will have only 60CUs against the RX6800XT's 72CUs, because the latter is made from the larger Navi 21 chip. It has the same number of CUs as the RX6800 non-XT. The RX7800XT appears to have a boost clock speed around 400MHz~450MHz higher than an RX6800, which is around 20% higher. So unless the RX7800XT can boost to around 3GHz, it looks like RX6800XT level performance!
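A rough sanity check on that maths, assuming performance scales linearly with CUs × boost clock (real games scale worse than this) and using the reference boost clocks of the RDNA2 cards:

```python
# Naive throughput score: CUs x boost clock, normalised to the RX6800.
cards = {
    "RX6800":            (60, 2.105),  # reference boost clock, GHz
    "RX6800XT":          (72, 2.250),
    "RX7800XT (rumour)": (60, 2.550),  # ~450MHz above the RX6800
}

base = cards["RX6800"][0] * cards["RX6800"][1]
for name, (cus, ghz) in cards.items():
    print(f"{name:20s} {cus * ghz / base - 1:+.0%} vs RX6800")
# On this naive model the RX7800XT lands ~+21%, short of the RX6800XT's
# ~+28%; with 60CUs it needs ~2.7GHz (or real RDNA3 IPC gains) just to
# tie, hence the talk of ~3GHz boosts above.
```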

I really hope it is faster than an RX6800XT by a decent amount. All I can see them doing is pricing it lower than the RX6800XT and close to an RTX4070, and having a technical victory! They can spin it as 20% faster than an RX6800 and cheaper than the RX6800XT! :(

Basically AMD have done an Nvidia and pushed the RX6700XT replacement one tier upwards.
Latest leaks suggest faster than a plain 6800 and a bit slower than the 6800 XT. No wonder it was MIA so long.

Rumour is also 200mm² for the GCD and 4 MCDs, for around 38 billion transistors for Navi 32. Navi 21 was 27 billion. 11 billion (+40%) for worse performance? That seems poor.

While they have saved some money going chiplet, I suspect a lot of that is at the design stage.

Which, if your #1 priority is to have margins despite tiny volume, makes some sense.
 
I'm going to assume the rumours are true that AMD thought RDNA3 was faster per core than it actually is, and the reason it's taken so long to release is that they tried to get the performance back through drivers.

Or they just thought, #### it, let's make a crap 7800XT, because reasons.

If this was ATI I would trust that they were onto something like this. If the core clocks very high and it's a more balanced core, maybe it will justify what they are doing.

But I don't trust them at all now. It wouldn't surprise me one bit if they price it £30 less, add 4GB more VRAM and say job done. A technical victory, but against a very easy target.

I would love to be proved wrong, but my expectations are very low now.

AMD giving up at the high end again? Would not surprise me, I've explained many times why that's not a bad idea.

With 10% market share, and having to be 10 to 20% cheaper just to maintain that, you're never going to earn enough to fund R&D for high-end cards.

I really don't get Twitter at times. AMD is going chiplets, and the whole point of chiplets is to make smaller chips and use them as an MCM.

If AMD used dual Navi 32 GCDs they would have more shaders than Navi 31 (2 × 60CUs = 120 against Navi 31's 96). It would also mean lower costs from designing fewer different chips.


Latest leaks suggest faster than a plain 6800 and a bit slower than the 6800 XT. No wonder it was MIA so long.

Rumour is also 200mm² for the GCD and 4 MCDs, for around 38 billion transistors for Navi 32. Navi 21 was 27 billion. 11 billion (+40%) for worse performance? That seems poor.

While they have saved some money going chiplet, I suspect a lot of that is at the design stage.

Which, if your #1 priority is to have margins despite tiny volume, makes some sense.

Well, if they are putting off people who want to get an AMD card, how are they going to get people who care less to buy AMD?

On top of this, if I don't upgrade my dGPU then I will not bother upgrading my CPU platform, and other people will do the same. So that means fewer CPU sales for AMD.

If in a few years I can't be bothered, then I might end up with a console. So even if AMD has another technical win there, it's lower margin than their desktop products.
 
In that case it's probably running at a higher base resolution.
No, its tiers are exactly the same as FSR/DLSS, with Quality being 1.5x scaling, Balanced 1.7x and Performance 2x. XeSS also has an Ultra Quality mode like FSR 1 did, which is 1.3x scaling; AMD and Nvidia elect not to offer that for whatever reason. XeSS just performs worse on non-Intel hardware because it's optimised for the XMX cores on Arc cards. It reverts to a less performant DP4a path on RDNA 2+/Pascal+ cards and to a fallback INT24 mode on anything older (which, as noted, is useless and worse than native).
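To make those factors concrete, here's what each mode renders internally at a 4K output (the factor applies per axis, so 2x Performance mode renders a quarter of the output pixels):

```python
# Internal render resolution per upscaler mode at a 3840x2160 output.
modes = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

for name, factor in modes.items():
    w, h = round(3840 / factor), round(2160 / factor)
    print(f"{name:13s} ({factor}x per axis): {w}x{h}")
# Ultra Quality: 2954x1662, Quality: 2560x1440,
# Balanced: 2259x1271, Performance: 1920x1080
```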
 
I really don't get Twitter at times. AMD is going chiplets, and the whole point of chiplets is to make smaller chips and use them as an MCM.

If AMD used dual Navi 32 GCDs they would have more shaders than Navi 31 (2 × 60CUs = 120 against Navi 31's 96). It would also mean lower costs from designing fewer different chips.

I don't think it's quite that simple; it may not be possible without a lot more R&D. It's taken AMD decades to get this far and others have yet to catch up.
 
One has to ask why AMD is using Navi 32 as the RX7800XT, which has fewer shaders than the previous generation Navi 21? Why not use a cut-down Navi 31?
We've already seen how Nvidia's cards using fewer CUDA cores than the previous generation have worked out, so I'm expecting this to get a similar panning in reviews unless it's less than $500, which is the current price of the 6800XT.
 
I don't think it's quite that simple; it may not be possible without a lot more R&D. It's taken AMD decades to get this far and others have yet to catch up.

Yes, but people saying AMD is only making a midrange core for RDNA4 could argue a Zen chiplet is also midrange too. I expect AMD splitting the GCD and MCD is similar to what happened with Zen 1, where they did MCMs for the Threadripper CPUs. Then next generation they start splitting the GCD into dual chiplets, like Zen 3 did.

So RDNA4 being midrange - well yes, the GCD is midrange. After all, from an R&D standpoint, if they end up only having to make two designs now (a low-end monolithic design and one chiplet design), it will be more cost effective.

After all, CDNA has chiplet dGPU designs already.

We've already seen how Nvidia's cards using fewer CUDA cores than the previous generation have worked out, so I'm expecting this to get a similar panning in reviews unless it's less than $500, which is the current price of the 6800XT.

Unless it is some performance monster, it's probably RX6800XT level performance at an RX6800 level price, with 16GB of VRAM. Under $500 wouldn't be so bad, but I have zero expectations now.
 
We've already seen how Nvidia's cards using fewer CUDA cores than the previous generation have worked out, so I'm expecting this to get a similar panning in reviews unless it's less than $500, which is the current price of the 6800XT.
$50 less and free watts ltd too is about as "good" as we can hope for now IMO. And hope is a very strong word here, since I certainly barely care. If it was more like $350 then maybe I could care.
 
Yes, but people saying AMD is only making a midrange core for RDNA4 could argue a Zen chiplet is also midrange too. I expect AMD splitting the GCD and MCD is similar to what happened with Zen 1, where they did MCMs for the Threadripper CPUs. Then next generation they start splitting the GCD into dual chiplets, like Zen 3 did.

So RDNA4 being midrange - well yes, the GCD is midrange. After all, from an R&D standpoint, if they end up only having to make two designs now (a low-end monolithic design and one chiplet design), it will be more cost effective.

After all, CDNA has chiplet dGPU designs already.



Unless it is some performance monster, it's probably RX6800XT level performance at an RX6800 level price, with 16GB of VRAM. Under $500 wouldn't be so bad, but I have zero expectations now.

True, who knows what's going through these people's heads.

One could argue Navi 31 is midrange: the logic die is only 304mm², smaller than an RX 6700XT, and yet the overall package is around 550mm² - there are six 37.5mm² IMC / cache chips on it. You would hope these people would understand that before tweeting, or is it X'ing now? What is it with X, why is everyone obsessed with X? RTX, GTX, XFX RX 7900XTX.
 
Disappointed but to be expected; AMD was never out to beat Nvidia, as they have said they only want to be on par or be the better value. Can't see AMD doing anything great for the foreseeable future; they've missed so many open goals and opportunities, it's clear they don't want to win... just compete enough to get by.
 
$50 less and free watts ltd too is about as "good" as we can hope for now IMO. And hope is a very strong word here, since I certainly barely care. If it was more like $350 then maybe I could care.

If it needs to be nearly half the price then frankly I don't think AMD should care what you think; no one can run a profitable business like that, and if I can get something a bit better than what Nvidia are offering for 10% less then I think AMD should keep going.
 