AMD RDNA3 unveiling event

What’s the difference with a Tier 1 card? Is it just cherry-picked chips or something?

For the 3080 series there were three different chip bins shipped to the AIBs; this was the first time Nvidia shipped pre-binned chips to AIBs. The AIBs then tiered the binned chips onto boards with increasing numbers of power phases and better componentry. Some of the AIBs also used binned DRAM.

The teardowns of the Asus Reference, TUF and Strix (EVGA, Galax and Colorful were similar) all showed a marked difference in the boards, clock speeds and power limits attached to each bin.

For the Radeon 69xx series, AMD did something similar and shipped at least three different bins of the same chip (K-XTXH), with the reference boards (Tier 3) running at stock, Tier 2 boards around 5% over stock, and Tier 1 boards clocked at around 12% over stock.

These Tier 1 boards with the top chip bins were the '6950XT Toxic Extreme Limited Edition', the 'ASRock 6950XT OC Formula' and the 'ROG Strix LC Radeon OC Edition'.
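To put those tier offsets in concrete clock terms, here's a minimal sketch. The 2310 MHz figure is an assumption (the commonly quoted reference 6950 XT boost clock); actual clocks vary by card and power limit.

```python
# Illustrative only: boost clocks implied by the tier offsets above,
# assuming a ~2310 MHz reference (Tier 3) 6950 XT boost clock.
stock_boost_mhz = 2310

tiers = {
    "Tier 3 (reference)": 0.00,
    "Tier 2": 0.05,
    "Tier 1": 0.12,
}

for tier, offset in tiers.items():
    print(f"{tier}: ~{stock_boost_mhz * (1 + offset):.0f} MHz")

# Tier 3 (reference): ~2310 MHz
# Tier 2: ~2426 MHz
# Tier 1: ~2587 MHz
```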
 
And yet prices also went down, including fuel and gas.

Yes, it’s easy to cherry-pick some prices that went down, but that doesn’t change the fact that many more went up across a variety of sectors, which doesn’t need any research to confirm. Just look at overall inflation rates: the highest they’ve been in 25 years, and in many regions up by 500-1000%.
 
Lol, I gave you examples of other computer components that went down in price, including energy prices (basically what we're talking about here), and you go on about cars and other sectors which have nothing to do with it.
BTW, curious how that inflation increased the price of the 4090 over the 3090 by just shy of 7%, while it skyrocketed the price of the 4080, for instance :cry:

Never mind that prices for laptops also went down and, like I've said previously, those surely have more components in them than a GPU.
 
Why is it assumed that variants of the Navi32 die won't be used for both the RX 7800 XT and RX 7700 XT?

Surely, the RX 7700 XT isn't going to have fewer Compute Units than the 6700 XT (40)?

Or we could easily get an RX 7700 XT, RX 7800 (non-XT) and RX 7800 XT all based on Navi32.

The Navi32 GCD die is only around 200 mm² on TSMC N5. Not exactly massive. Smaller than the PS5 and Series X 7nm GPU dies.

This is the only bit of verifiable info that I've actually found on RDNA3:

This suggests Navi33 will mostly be used for mobile GPUs (at first): "Navi33 is the mobile-first push for AMD. They expect robust sales of AMD Advantage laptops with it, as the design is drop-in compatible with Navi23 PCBs, minimizing OEM board re-spin headaches."
 
Someone correct me if I am wrong, but I'm pretty sure the reason AMD's PR slides had very different numbers was that for their RDNA 3 system they were using a 7950X with stupidly fast DDR5 RAM, whereas the 6950 XT system was using a 5950X with DDR4? If true... then AMD were 100% intentionally misleading their fanbase/customers...
[Image: AMD RDNA 3 Tech Day press deck, slide 71]

So for RT it was indeed only a 5900X, while for other scenarios it was a 7950X and whatnot.
You can dig through the endnotes if you're curious about each test. :)

 
Surely, the RX 7700 XT isn't going to have fewer Compute Units than the 6700 XT (40)?
Why not? It looks like the 7800 XT will only have a max of 60 CUs, and that's only if they use the full N32. There could also be a 7800 non-XT, though I guess AMD would have the option of ditching this and beefing up the 7700 XT to an N32 if Nvidia is better than expected, but I doubt that, since it looks like a race to the bottom right now between the pair.

Going off the performance of an 84 CU 7900 XT barely beating an 80 CU 6950 XT doesn't bode well for anyone on a 6800 XT or below who's looking for an upgrade at a similar cost to what they paid previously.

Thankfully for AMD, Nvidia's cards look just as bad, with the 4070 rumoured to use only the same number of CUDA cores as a 3070; with Nvidia's new pricing in place it will probably cost as much as, if not more than, a 3080 did, while being a bit slower.
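For what it's worth, the naive CU maths behind that comparison looks like this (a sketch only; real scaling also depends on clocks, memory bandwidth and architectural changes):

```python
# Naive linear scaling: the extra CUs alone would only buy ~5%.
cu_7900xt, cu_6950xt = 84, 80
print(f"{(cu_7900xt / cu_6950xt - 1) * 100:.0f}% more CUs")  # 5% more CUs
```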
 
[Image: AMD RDNA 3 Tech Day press deck, slide 71]

So for RT it was indeed only a 5900X, while for other scenarios it was a 7950X and whatnot.
You can dig through the endnotes if you're curious about each test. :)


Somehow providing the documentation for how they got the numbers is "misleading", maybe to people that don't know the craic and can't read... They also include these slides at the end of videos that have benchmark/performance numbers (and have done for years), so the info was always out there.
 
Lol, I gave you examples of other computer components that went down in price, including energy prices (basically what we're talking about here), and you go on about cars and other sectors which have nothing to do with it.
BTW, curious how that inflation increased the price of the 4090 over the 3090 by just shy of 7%, while it skyrocketed the price of the 4080, for instance :cry:

Never mind that prices for laptops also went down and, like I've said previously, those surely have more components in them than a GPU.
Cars are being massively impacted by many of precisely the same issues: IC and other electronic component availability and costs. I run a business that is affected by these components too, and I am regularly competing with car manufacturers for the very same components. And they almost always win out, as they have greater buying power and deeper pockets, further pushing prices up.

What we’re discussing here is the high price of GPUs, and there are many contributing factors that most people seem either oblivious to, or are just playing ostrich about.
 
Somehow providing the documentation is "misleading", maybe to people that don't know the craic and can't read... They also include these slides at the end of videos that have benchmark/performance numbers, so the info was always out there.

This ^^

How often it is the case that people don’t read what is in front of them, but rather see/hear what they want to ;-) And then it’s the author’s fault for not making it clear enough.
 
Why not? It looks like the 7800 XT will only have a max of 60 CUs, and that's only if they use the full N32. There could also be a 7800 non-XT, though I guess AMD would have the option of ditching this and beefing up the 7700 XT to an N32 if Nvidia is better than expected, but I doubt that, since it looks like a race to the bottom right now between the pair.
Navi33 is the trash-tier die (128-bit GDDR6, 32 MB Infinity Cache). These products would get trashed by reviewers if used in mid-to-high-tier products. The Navi22-based 6700 XT (and the 6700) was largely considered to be good value, and was produced in decent quantities. It includes 96 MB of Infinity Cache.

Where'd you get your info from? Or is it just opinion?
 
Cars are being massively impacted by many of precisely the same issues: IC and other electronic component availability and costs. I run a business that is affected by these components too, and I am regularly competing with car manufacturers for the very same components. And they almost always win out, as they have greater buying power and deeper pockets, further pushing prices up.

What we’re discussing here is the high price of GPUs, and there are many contributing factors that most people seem either oblivious to, or are just playing ostrich about.
I work in the automotive industry and I'm aware of its problems. The chip shortage is just one of them.

You've answered yourself why your other points don't hold:
- You're a small player importing components to the UK/EU. AMD and nVIDIA are big fish that get first dibs when needed, and they only move the components locally, since everything gets made in that part of the world. Energy costs there are still low, transportation is cheap, etc. In Europe prices will be much higher, since we don't make this stuff locally anymore, and on top of that you add the energy costs. So yeah, you're not competitive anymore (or at least not to the same degree).
- SSDs/NVMe drives, RAM, PSUs, CPUs, etc. are also competing for the same components like you do. Yet the prices for those are the same or even going lower.
- A laptop needs all of the above, with the added cost of each maker putting their profit on top, and it still follows the same trajectory as those components.
- Phones... plenty of them, and very cheap for what they do.

A small price increase is expected; the 4090 got about a 6.67% increase over the 3090. What they did with the 4080, and what AMD is doing, is just greed.
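For reference, that figure checks out against the commonly cited launch MSRPs; a quick sanity check, assuming $1,499 for the 3090, $1,599 for the 4090, $699 for the 3080 and $1,199 for the 4080 16GB:

```python
def increase_pct(old: float, new: float) -> float:
    """Percentage increase from the old launch price to the new one."""
    return (new / old - 1) * 100

print(f"3090 -> 4090: {increase_pct(1499, 1599):.2f}%")  # 6.67%
print(f"3080 -> 4080: {increase_pct(699, 1199):.1f}%")   # 71.5%
```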
Navi33 is the trash-tier die (128-bit GDDR6, 32 MB Infinity Cache). These products would get trashed by reviewers if used in mid-to-high-tier products. The Navi22-based 6700 XT (and the 6700) was largely considered to be good value, and was produced in decent quantities. It includes 96 MB of Infinity Cache.

Where'd you get your info from? Or is it just opinion?
Does it matter anymore which die it will have? The 4080 has a slightly worse SM ratio relative to the 4090 than the 3070 had to the 3090, and yet people were saying the 4080 12GB was the 4070, when in fact that was this gen's 4060 Ti, and the 4080 16GB is actually this gen's 4070.

So it won't matter. AMD dropped the ball on performance, and it appears it doesn't want to sweeten the deal with better pricing.
 
Somehow providing the documentation for how they got the numbers is "misleading", maybe to people that don't know the craic and can't read... They also include these slides at the end of videos that have benchmark/performance numbers (and have done for years), so the info was always out there.
Even if you could read all that, how would you go about restating those numbers? Should we reduce them by 1%, 5%, 10%, 20%?

The data shared, even with well-publicized fine print, is useless and lacking in transparency.

There's some basic sanctity of method to be maintained while conducting these tests; otherwise, I would rather not see these obviously misleading (and meaningless) comparisons. I hope they didn't use the keyword "benchmark" anywhere in their presentation, because no amount of fine print can be used to redefine the term. It's definitely looking worse for AMD.
 
Does it matter anymore which die it will have?
Yes, it matters. No one in their right mind will buy an RX 7700 XT if it is worse in almost every way, spec-wise, than the RX 6700 XT. Especially if it is priced higher, and it almost certainly will be, based on the prices of the Navi31 cards. AMD's marketing just isn't good enough that people will buy regardless.

Navi33 is trash-tier: the 1080p performance will be acceptable, but not ideal for 1440p. It is likely to be used for the RX 7600 or 7600 XT, since the Navi23 cards had a 128-bit memory bus and 32 MB of Infinity Cache, just the same as Navi33. I doubt there will be a Navi33 GPU clocked much higher than the RX 6650 XT, which is already clocked at 2635 MHz. I'd guess that the mobile Navi33 chips will be released first.
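As a quick illustration of why that 128-bit bus is limiting, here's the standard bandwidth arithmetic (the GDDR6 data rates are example figures, not confirmed Navi33 specs):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(bandwidth_gb_s(128, 18.0))  # 288.0 GB/s -- a Navi23/Navi33-class bus
print(bandwidth_gb_s(192, 16.0))  # 384.0 GB/s -- a Navi22-class bus (6700 XT)
```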

The speculation about cards not using Navi32 is based on a lack of actual facts or news from AMD regarding this die, which is probably quite deliberate.

The logic that, because the existing Navi31 cards weren't as fast as some hoped (even though the 7900 XTX is slightly ahead of the RTX 4080 at 1440p and 4K), the rest of the series will somehow be slower than the current RDNA2 equivalents, is based on an excess of pessimism. There's still space for another Navi31 card, aka a 7950 XT/XTX, to be released later on.
 
Even if you could read all that, how would you go about restating those numbers? Should we reduce them by 1%, 5%, 10%, 20%?

The data shared, even with well-publicized fine print, is useless and lacking in transparency.

There's some basic sanctity of method to be maintained while conducting these tests; otherwise, I would rather not see these obviously misleading (and meaningless) comparisons. I hope they didn't use the keyword "benchmark" anywhere in their presentation, because no amount of fine print can be used to redefine the term. It's definitely looking worse for AMD.
Take a look at the NVIDIA presentation and the huge fibs they told. Then there's the power socket melting and, wait for it, the 4090 coil whine thread. Add that up, and even your rose-tinted NVIDIA glasses cannot ignore that little lot.
 
Yes, it matters. No one in their right mind will buy an RX 7700 XT if it is worse in almost every way, spec-wise, than the RX 6700 XT. Especially if it is priced higher, and it almost certainly will be, based on the prices of the Navi31 cards. AMD's marketing just isn't good enough that people will buy regardless.

Navi33 is trash-tier: the 1080p performance will be acceptable, but not ideal for 1440p. It is likely to be used for the RX 7600 or 7600 XT, since the Navi23 cards had a 128-bit memory bus and 32 MB of Infinity Cache, just the same as Navi33. I doubt there will be a Navi33 GPU clocked much higher than the RX 6650 XT, which is already clocked at 2635 MHz. I'd guess that the mobile Navi33 chips will be released first.

The speculation about cards not using Navi32 is based on a lack of actual facts or news from AMD regarding this die, which is probably quite deliberate.

The logic that, because the existing Navi31 cards weren't as fast as some hoped (even though the 7900 XTX is slightly ahead of the RTX 4080 at 1440p and 4K), the rest of the series will somehow be slower than the current RDNA2 equivalents, is based on an excess of pessimism. There's still space for another Navi31 card, aka a 7950 XT/XTX, to be released later on.

At the end of the day, people will look at performance and ignore the rest. If they can get the performance out of a 64-bit bus with an under-100 mm² die, AMD will sell it like that at $500, and the crowd will say "but inflation, but R&D, but etc., the price is fine". Options will be zero, since nVIDIA is playing the same game.

Also, about specs... the 4080 has a 256-bit bus, 16 GB of VRAM, a 379 mm² die and 45.9 billion transistors, and it is doing just fine against the 384-bit, 24 GB VRAM, 520 mm², 57.7-billion-transistor 7900 XTX, or against the 320-bit, 20 GB VRAM, 520 mm², 57.7-billion-transistor, lol-worthy 7900 XT.
 
At the end of the day, people will look at performance and ignore the rest. If they can get the performance out of a 64-bit bus with an under-100 mm² die, AMD will sell it like that at $500, and the crowd will say "but inflation, but R&D, but etc., the price is fine". Options will be zero, since nVIDIA is playing the same game.

Also, about specs... the 4080 has a 256-bit bus, 16 GB of VRAM, a 379 mm² die and 45.9 billion transistors, and it is doing just fine against the 384-bit, 24 GB VRAM, 520 mm², 57.7-billion-transistor 7900 XTX, or against the 320-bit, 20 GB VRAM, 520 mm², 57.7-billion-transistor, lol-worthy 7900 XT.
520 mm² is the size of the interposer; the GCD itself is only 300 mm².
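A rough sanity check on those two numbers, assuming the commonly reported figures of a ~300 mm² GCD plus six MCDs of roughly 37 mm² each:

```python
# Approximate Navi31 package area: GCD (N5) plus six MCDs (N6).
gcd_mm2, mcd_mm2, num_mcds = 300, 37, 6
total_mm2 = gcd_mm2 + num_mcds * mcd_mm2
print(total_mm2)  # 522 -- close to the ~520 mm2 quoted for the full package
```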
 