Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I doubt they do; they may have the cards in hand, but it's unlikely they have drivers for them.
The 9070XT is a large GPU no matter what AMD calls it.
It has nearly 10 billion more transistors than the RTX 5080!
AMD Navi 48 RDNA4 GPU has 53.9 billion transistors, more than NVIDIA GB203 - VideoCardz.com
So AMD's GPU is smaller than GB203, 379mm² vs 350mm², on the same 4nm node, but AMD's GPU has more transistors? How did they do that????
It's not just ROPs that Nvidia cards were missing.
AMD probably has denser caches than Nvidia? Last generation…
AMD has a lot of experience from their CPUs in optimising for density. They have been using large caches for both their CPUs and GPUs for a while now, but each generation they are doing more with less.
The RX 7700 XT has half the L3 cache of the RX 6700 XT and only 12.5% more memory bandwidth, yet it is still almost 30% faster overall.
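For the record, the spec-sheet numbers behind that comparison appear to check out. A quick sanity check using the commonly listed figures for both cards (worth verifying against a proper spec sheet):

```python
# Sanity check on the cache/bandwidth claim above, using the commonly
# listed specs for both cards (verify against a spec sheet).

cards = {
    "RX 6700 XT": {"infinity_cache_mb": 96, "bandwidth_gbs": 384},  # 192-bit, 16 Gbps GDDR6
    "RX 7700 XT": {"infinity_cache_mb": 48, "bandwidth_gbs": 432},  # 192-bit, 18 Gbps GDDR6
}

old, new = cards["RX 6700 XT"], cards["RX 7700 XT"]
print(f"L3/Infinity Cache: {new['infinity_cache_mb'] / old['infinity_cache_mb']:.0%} of the old card's")  # 50%
print(f"Bandwidth: {new['bandwidth_gbs'] / old['bandwidth_gbs'] - 1:.1%} higher")                         # +12.5%
```

If those figures are right, the extra performance clearly isn't coming from brute-force cache or bandwidth.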
They dropped the 4070 to $550 the moment the 7800 XT was launched for $500.
Fair point mate. Though as you can tell, I just don't think many folks noticed or cared for the 4070 dropping to that price, it was still a measly $50.
It would have been more impactful to drop $100 down to $500, but Nvidia were still reluctant to drop price by any meaningful amount. So I still stand by my point.
Either folks were going to buy Nvidia anyway or they weren't. And all it might have done is make AMD look 'only $50 cheaper'.
Even with the drop, the 7800XT was still a better card with more VRAM and raster performance. And not many folks would be turning on RT on a $500 GPU.
I guess we'll agree to disagree on that tangent.
Oh and performance-wise, I expect AMD will still be behind on FSR-vs-DLSS and RT, though hopefully not too far behind on RT. DLSS4 got a boost and even if a handful of old-timers like me can't stand DLSS/FSR, fans will defend those software technologies as 'adding value' somehow. Personally, even if there was a card that performed 1% better on price/performance, but had no DLSS/FSR, I'd pick that over the software-performance-crutch-nonsense. I consume old-fashioned real resolution at real frames only, thank you very much.
None of this changes the plain fact that AMD need to price the 9070 series to sell. They call them 70 class/midrange cards. Last gen the 7700xt/7800xt were mid range.
Hence the 9070 series should follow on from those and be priced similarly. Too far above $500 is no longer midrange; it's pushing into high-end, even though Nvidia is trying to sell that as midrange. Below $600 is where the masses can afford to buy. If they want sales, they should price them where the majority of folks can afford them.
That said, I expect AMD to disappoint/mess it up and price the whole 9070 series at $500+.
And even if AMD don't somehow mess up pricing (something that came up over dinner today): well-priced, desirable AMD GPUs will face stock issues if they become popular, resulting in the same situation as the 5000 series, driving prices beyond MSRP anyway. The market is starved for a decent GPU release that doesn't go up in smoke right now.
It's like the system is rigged so that consumers can't score any wins.
Different types of silicon have different transistor density; that's how you can have two chips on the same node but with different density. It could be, for example, that the 9070XT has more memory cache than the 5080, and maybe memory has higher density than GPU cores, and perhaps that's why it has a higher overall density.
Or maybe AMD's engineers are just very good at space efficiency and density compared to Nvidia.
Analogue vs logic: analogue, being memory/cache, is less dense than logic. GB203 has about 80MB of cache, Navi 32 has about the same 80MB of cache; I don't know how much cache Navi 48 has, but it's about the same size at 350mm².
Differences in packaging design can also increase transistor density. AMD are very good at packaging; it's one of the things they specialise in, and something they do far better than Intel, which is one of Intel's problems vs AMD. BUT... the difference in transistor density here is 28%, that's flipping huge, so it can't be that either.
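To put numbers on that 28%: a back-of-the-envelope density check, using the Navi 48 figures from the VideoCardz piece above and the commonly reported GB203 figures (~45.6B transistors, ~379mm²). All of these should be treated as approximate:

```python
# Back-of-the-envelope transistor density check.
# Navi 48 figures are from the VideoCardz article quoted above;
# GB203 figures are the commonly reported ones. All approximate.

chips = {
    "Navi 48": {"transistors_bn": 53.9, "die_mm2": 350},
    "GB203":   {"transistors_bn": 45.6, "die_mm2": 379},
}

# Density in millions of transistors per mm^2.
density = {name: s["transistors_bn"] * 1000 / s["die_mm2"] for name, s in chips.items()}

for name, d in density.items():
    print(f"{name}: {d:.0f} MTr/mm^2")   # Navi 48 ~154, GB203 ~120

gap = density["Navi 48"] / density["GB203"] - 1
print(f"Navi 48 is ~{gap:.0%} denser")   # ~28%, the figure quoted above
```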
Yea I hate the state of the GPU market as much as anyone, but saying stuff like "I'd take a 1% performance boost over DLSS" in 2025 is like saying "I have a massive bizarre bias and can't be trusted to communicate reliably on this subject". I can play ACC for the first time now, thanks to the new DLSS. With that said, Nvidia selling a software patch and a mid-gen refresh as a full-blown, stupidly expensive new generation is just, IDK, neglect or greed or something.
I really like DLSS. If you watch the new HUB video, DLSS 4 has come a long way. To me it is indeed better than native if you have a 4K monitor.
By using it you get a lot more fps, which means you can now enable RT. The trouble with AMD, at least until now, has been that FSR has been relatively poor, to the point that users prefer native. That means struggling with RT and running at much lower fps. That's why, given the choice, I would take a 4070 over a 7800 XT.
9000 series will hopefully change all that.
I really hope FSR4 is at least as good as DLSS 3. That would be a huge step up from FSR 3. What AMD should be doing is assigning a small team to get in touch with as many FSR-enabled games as possible, especially sponsored ones, to update them to FSR 4. That way, going forward, like DLSS you can just update the .dll file to get the latest version (see the sketch just after this post).
With a nice boost in RT plus FSR4, all the 9070XT needs is a nice price and it will sell very well. I don't buy the rubbish that price does not matter. It always matters, especially at the lower end of budgets, which is where most people are.
If the 9070XT is around 5070 Ti performance and it indeed hits $550, I can see it getting glowing reviews almost everywhere.
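On the .dll point above: that per-game swap is already how people keep DLSS current, since each game ships its own copy of nvngx_dlss.dll and dropping a newer one in its place upgrades the upscaler. A minimal sketch of the idea; the paths are hypothetical placeholders, and FSR 4 shipping as a swappable DLL the same way is an assumption, not something AMD have confirmed:

```python
# Minimal sketch of the per-game DLL-swap idea described above.
# nvngx_dlss.dll is the real filename DLSS games ship with; both paths
# below are hypothetical placeholders for your own system.
import shutil
from pathlib import Path

new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer runtime you downloaded (hypothetical path)
game_dir = Path(r"C:\Games\SomeGame")           # game install folder (hypothetical path)

for old_dll in game_dir.rglob("nvngx_dlss.dll"):
    backup = old_dll.with_suffix(".bak")
    if not backup.exists():
        shutil.copy2(old_dll, backup)  # keep the shipped version in case of regressions
    shutil.copy2(new_dll, old_dll)     # drop in the newer runtime
    print(f"updated {old_dll}")
```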
I would say greed, good business. But the majority are willing to pay the asking price (for the most part), so this is the new normal.
Do we know if AIBs for AMD actually improve the build quality on the cards over and above what AMD deem as the base spec?
AMD's reference boards have always been overbuilt, and the reference coolers for the 6000 series were fine/alright. You would have to go back to Vega and before if you wanted an example of poor reference coolers, though those were pretty horrible. As long as AMD don't go backwards in their design or choose to do "an Nvidia" and try to pull 450+ watts through a single connector, I would have no problem with a reference board and cooler. In some cases I might even prefer it to some of the overpriced hippy designs out there. Regarding coil whine, I've heard it on even top-end models, so paying more is not a guarantee you won't get it, sadly. There are ways to lessen/remove coil whine which are free regardless of model, so I'm not too worried about that personally.
For the AMD 6000 series, absolutely. For this gen we will need to wait, because there are some things that are different for this launch.
Typically we have to wait a few days after the full launch before it becomes clear who has engineered what in terms of power phases, VRAM phases, and general build quality.
For the 6000 series there were also higher quality vram chips on some of the higher end cards.
Need to remember that power control and power delivery are much more important on a 600W card than they are on a 330W card.
Typically, even a cheap 330W card will provide less hassle, especially with regard to coil whine and its ability to set itself on fire.
For this reason, the 9070XT teardowns could look very barebones but that might actually be a good thing in terms of coil whine and reliability.
Lastly, let's not forget that EVGA pulled out of the market for a reason, that main reason being that they didn't feel they could continue to provide what they wanted under the margins Nvidia wanted to pursue. That says a lot about what Asus have had to work with for this gen.
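To put rough numbers on the 600W vs 330W point: assuming a 12VHPWR-style connector with six current-carrying 12V pins, per-pin current nearly doubles between those two cards:

```python
# Rough per-pin current at 12 V, to show why a 600 W card is a much
# harder power-delivery problem than a 330 W one. Assumes six
# current-carrying 12 V pins, as on a 12VHPWR-style connector.

PINS = 6
VOLTS = 12.0

for watts in (330, 450, 600):
    amps = watts / VOLTS
    print(f"{watts} W -> {amps:.1f} A total, {amps / PINS:.1f} A per pin")
```

At 600W that is roughly 8.3A per pin, which leaves far less margin for a slightly unseated connector or an uneven load split than the ~4.6A per pin of a 330W card.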
They've said that FSR4 will be upgradeable in games that support FSR 3.1. It'll probably be a driver toggle.
Not quite: on-chip SRAM isn't analogue, it's digital. It's also the densest way of storing data on-chip as long as your data is the right shape, i.e. a deep but narrow memory, e.g. 2048 deep x 8 wide. It's less efficient if you have a super-wide but shallow array, e.g. 8 deep x 2048 wide, and there are overheads per RAM macro which mean each process node has some tipping point where a RAM macro is better than just an array of flip-flops.
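To illustrate that tipping point, here is a toy model with entirely made-up area constants (real values are process- and library-specific). It just shows why a deep, narrow memory favours a RAM macro while a wide, shallow one favours plain flip-flops:

```python
# Toy model of the RAM-macro vs flip-flop tipping point described above.
# Every constant here is invented purely for illustration; real values
# depend entirely on the process node and cell library.

FF_BIT_AREA = 1.0      # area per flip-flop bit (arbitrary units)
SRAM_BIT_AREA = 0.15   # area per SRAM bit inside a macro (much denser)
BASE_OVERHEAD = 500.0  # fixed per-macro cost: control logic, decoders...
COL_OVERHEAD = 20.0    # per-column cost: sense amps, write drivers...

def macro_area(depth: int, width: int) -> float:
    return BASE_OVERHEAD + width * COL_OVERHEAD + depth * width * SRAM_BIT_AREA

def flop_area(depth: int, width: int) -> float:
    return depth * width * FF_BIT_AREA

# Same bit count, very different shapes: deep-and-narrow amortises the
# per-column overhead, wide-and-shallow does not.
for depth, width in [(2048, 8), (8, 2048)]:
    m, f = macro_area(depth, width), flop_area(depth, width)
    winner = "macro" if m < f else "flip-flops"
    print(f"{depth}d x {width}w: macro={m:,.0f}  flops={f:,.0f}  -> {winner}")
```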