
** The AMD Navi Thread **

Associate
Joined
24 Nov 2010
Posts
2,314
DP was saying the chiplets idea is stupid and only noobs suggest it. He never did elaborate why though when I asked :)

I'm sure that's why AMD have been pursuing it for the better part of a decade, intensified work on it under Raja, and began implementing Infinity Fabric into Vega, and why NVIDIA formally acknowledged a couple of years ago that they'd begun research into it and planned to do it in future. If they can leverage the kind of low cost (small dies), high yields, low design costs (modularity) and scaling that Zen has, then that's a huge asset.

I don't think the software / firmware / API interface with the driver is the issue. I think scaling IF (or a similar interconnect) to the kind of bandwidth that modern GPUs need, within certain package size / TDP / cost is the holdup. If they can't find a solution for it in commercial products in the next 2 years, I could see them going optical.
 
Soldato
Joined
26 Sep 2010
Posts
7,157
Location
Stoke-on-Trent
DP was saying the chiplets idea is stupid and only noobs suggest it. He never did elaborate why though when I asked :)
He suggested that Crossfire on a PCB is stupid because multi-GPU support is flaky at best these days, which is true. He didn't elaborate on strapping processing chiplets together like Zen 2, though.

I guess if there's a central management chip which handles distributing workloads to the chiplets then all the graphics card has to expose to the rest of the system is that central manager, so there's not "multiple GPUs" for software to try and wrangle. But I'm no chip engineer so whether this is even doable right now I have no idea.
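That single-front-end idea can be sketched in a few lines. This is purely illustrative Python, every name in it is hypothetical, and real GPU work distribution happens in hardware rather than code like this, but it shows the shape of the idea: software only ever talks to one object, and the fan-out to chiplets is hidden behind it.

```python
from dataclasses import dataclass, field

@dataclass
class Chiplet:
    """A hypothetical processing chiplet with its own work queue."""
    cid: int
    queue: list = field(default_factory=list)

    def submit(self, work):
        self.queue.append(work)

class FrontEnd:
    """Looks like one GPU to the system; internally round-robins work across chiplets."""
    def __init__(self, n_chiplets):
        self.chiplets = [Chiplet(i) for i in range(n_chiplets)]
        self._next = 0

    def submit(self, work):
        # Software only ever calls this, so it never sees "multiple GPUs" --
        # the distribution across chiplets is the front end's problem.
        self.chiplets[self._next].submit(work)
        self._next = (self._next + 1) % len(self.chiplets)

gpu = FrontEnd(4)
for draw_call in range(10):
    gpu.submit(draw_call)

print([len(c.queue) for c in gpu.chiplets])  # -> [3, 3, 2, 2]
```

Of course, the hard part the thread is discussing is doing exactly this in silicon at GPU bandwidths and latencies, not the scheduling logic itself.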
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
Radeon RX 680 (Navi 10 XT)
7nm TSMC
2560 shaders
160 TMUs
64 ROPs
256-bit memory interface with 8 GB GDDR6
PCIe 4.0

Performance between RTX 2070 and Radeon VII.
 
Associate
Joined
17 Sep 2018
Posts
1,432
https://wccftech.com/amd-radeon-nav...shader-engines-2560-sps-8gb-vram-256-bit-bus/

Interesting if true

According to this leak, as well as our own sources, the Navi chip features 40 compute units, each housing 64 stream processors. The new previously unknown detail that this leak brings to the table is that Navi allegedly features 8 shader engines. Each of these engines will house 5 compute units and a raster back-end. Normally, in existing GCN GPUs, a raster back-end would include 16 Render Output Units, or ROPs for short. So it’s unclear if Navi 10 will feature 128 ROPs or if the company is cutting down the number of ROPs per render back-end with Navi.

Now, the number of shader engines in Navi is actually a significant detail, because one of the major architectural limitations of AMD’s previous GCN implementations is the inability to scale the microarchitecture beyond 4 shader engines per die. This has led to underutilization of stream processors in the company’s bigger GPUs such as Fiji and Vega and has long been a known limitation of GCN that the company was working to develop solutions to.
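For what it's worth, the leaked figures are easy to sanity-check. A quick illustrative calculation (the per-back-end ROP count is the open question the article itself raises):

```python
compute_units = 40
sps_per_cu = 64
shader_engines = 8

stream_processors = compute_units * sps_per_cu    # 40 * 64 = 2560, matching the leak
cus_per_engine = compute_units // shader_engines  # 40 / 8 = 5 CUs per shader engine

# If each raster back-end kept GCN's usual 16 ROPs:
rops_if_16 = shader_engines * 16                  # 128 ROPs
# If AMD instead cut it to 8 ROPs per back-end:
rops_if_8 = shader_engines * 8                    # 64 ROPs, the figure in the earlier spec post

print(stream_processors, cus_per_engine, rops_if_16, rops_if_8)
```

Notably, the 8-ROPs-per-back-end scenario is the one that lines up with the 64 ROPs quoted earlier in the thread.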
 

bru
Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
I guess if there's a central management chip which handles distributing workloads to the chiplets then all the graphics card has to expose to the rest of the system is that central manager, so there's not "multiple GPUs" for software to try and wrangle. But I'm no chip engineer so whether this is even doable right now I have no idea.


That is the crux of it, isn't it: getting a chiplet design to work as one large monolithic core.

Look at a monolithic core:

[Vega die shot]

This Vega die shot will do. Isn't it really just 8 chiplets hooked up to some IO logic to bind it all together?

Yes, we know it's not easy, else someone would have done it by now.

I assume the timings and latencies involved are crucial to getting it to work well.

Hopefully someone, if not AMD, will bring us this breakthrough soon.
 
Soldato
Joined
18 May 2010
Posts
12,758
It'll do well to match that value proposition imo; Vega 56 is incredible value right now. I mean, even if Navi matches Vega 56 at £250, Navi 10 Pro will have to be 2070 level at £400 for it to make any sense.

I think I'm going to set my expectations at Navi being a 590 replacement and below. I just can't see it bettering Vega cards at price/performance for people who already have them, and especially if there is still Vega stock around at current prices when Navi launches, it makes no sense to me. I learnt my lesson on hype when Polaris came out and I ended up keeping my 290 for another year or so, so I'm fully GPU-release pessimistic these days.
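To put rough numbers on that reasoning (illustrative only: the performance indices are my assumptions, not benchmarks, with Vega 56 normalised to 100 and a 2070-class card guessed at roughly 130):

```python
# Hypothetical price/performance comparison using the prices discussed above.
cards = {
    "Vega 56 (current price)":      {"price": 250, "perf": 100},
    "Navi 10 (if it matches V56)":  {"price": 250, "perf": 100},
    "Navi 10 Pro (if ~2070 level)": {"price": 400, "perf": 130},
}

for name, c in cards.items():
    # Higher is better: performance points per pound.
    print(f"{name}: {c['perf'] / c['price']:.3f} perf/£")
```

On those assumptions the £400 Pro still delivers fewer performance points per pound than a £250 Vega 56, which is exactly why the value proposition looks hard to beat.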
 

TNA
Caporegime
Joined
13 Mar 2008
Posts
27,582
Location
Greater London
He made suggestions of Crossfire on a PCB is stupid because multi GPU support is flaky at best these days, which is true. He didn't elaborate on strapping processing chiplets together like Zen 2.

I guess if there's a central management chip which handles distributing workloads to the chiplets then all the graphics card has to expose to the rest of the system is that central manager, so there's not "multiple GPUs" for software to try and wrangle. But I'm no chip engineer so whether this is even doable right now I have no idea.
Oh, I thought it was the chiplets he took issue with. CrossFire is dead, and anything that would need it is obviously silly. We need something that would act like a single GPU. No developer will put in the work to make multi-GPU work just for PC, unless it is a PC exclusive game like, say, Star Citizen.
 
Caporegime
Joined
1 Nov 2003
Posts
35,691
Location
Lisbon, Portugal
Sorry if this is a repeated and potentially dumb question. But let's assume the leaks for the 3080, or rather Navi XT, are correct. Does this mean it would perform better or worse than the Vega 56? I am just thinking ahead right now for an upcoming build.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
Sorry if this is a repeated and potentially dumb question. But let's assume the leaks for the 3080, or rather Navi XT, are correct. Does this mean it would perform better or worse than the Vega 56? I am just thinking ahead right now for an upcoming build.

Better, but more expensive.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,164
I guess if there's a central management chip which handles distributing workloads to the chiplets then all the graphics card has to expose to the rest of the system is that central manager, so there's not "multiple GPUs" for software to try and wrangle. But I'm no chip engineer so whether this is even doable right now I have no idea.

It isn't just a software problem - to either just spread out a monolithic design or a more advanced form using a mixture of command and processing packages needs a significant architecture redesign or you still run into the same issues with under-utilisation.
 
Associate
Joined
21 Apr 2007
Posts
2,487
Oh, I thought it was the chiplets he took issue with. CrossFire is dead, and anything that would need it is obviously silly. We need something that would act like a single GPU. No developer will put in the work to make multi-GPU work just for PC, unless it is a PC exclusive game like, say, Star Citizen.

Tbh I was hoping that would be Navi; that would be a game changer. Or even GPU-per-screen segmentation, so you could boost VR systems.
 
Soldato
Joined
26 Sep 2010
Posts
7,157
Location
Stoke-on-Trent
They have been trying to leave the budget brand behind for years, but haven't had the performance. I still think they need to get market share first though.
But if AMD really have managed to double the geometry engines then the performance will be coming back. I enjoyed Moore's Law Is Dead's analysis of how a $499 Navi could and would still work at shaking up the marketplace, and I concur with his logic, but I still think the price points are a little too high to get that initial head turn from the average consumer. Enthusiasts will be more than capable of assessing Navi's price/perf and comparing to Nvidia, but with the mind share so heavily in Nvidia's favour a 2070 competitor for about 2070 money is just not going to get their attention.
 
Associate
Joined
17 Sep 2018
Posts
1,432
But if AMD really have managed to double the geometry engines then the performance will be coming back. I enjoyed Moore's Law Is Dead's analysis of how a $499 Navi could and would still work at shaking up the marketplace, and I concur with his logic, but I still think the price points are a little too high to get that initial head turn from the average consumer. Enthusiasts will be more than capable of assessing Navi's price/perf and comparing to Nvidia, but with the mind share so heavily in Nvidia's favour a 2070 competitor for about 2070 money is just not going to get their attention.

It also makes them more competitive if they can pull it off. Right now they really aren't competitive from £350-550, numbers wise. I mean, Vega 64 maybe, but nothing over £400 for sure until you hit the Radeon VII.

From £100-£300 they are super competitive in price per performance, so from a market share perspective it's above this segment that they are losing the most. If they can beat Nvidia at price per performance there, they gain some market share.

Bringing in a Navi at Vega 56 performance @ £250 doesn't do a whole lot when they're offering that already.

Also, people will hear 'woah, Radeon Navi is the hottest high-performance card in value terms', then think 'oh well, I'll buy a lower-tier one because I can't afford that'. That wins them some mind share, but also market share in a segment where they can't compete right now.
 