AMD to launch 300W GPU with High-Bandwidth-Memory

LinkedIn profiles of AMD employees revealed two important facts.
AMD Fiji: 300W 2.5D GPU with HBM

The first profile, of Linglan Zhang, System Architect Manager (PMTS) at AMD, revealed that the development of the first GPU with HBM has been completed.

Developed the world’s first 300W 2.5D discrete GPU SOC using stacked die High Bandwidth Memory and silicon interposer – Linglan Zhang

We’ve been speculating about Fiji silicon purportedly using High-Bandwidth Memory (HBM) for almost a year. As it turns out, it could be true after all. The most disappointing part of the quote is definitely the power consumption, as 300W is more than we expected. The fact that AMD was forced to manufacture a GPU with such a high TDP could only mean that it is still made on the 28nm process. NVIDIA’s Maxwell GM200 will likely feature lower power consumption (I will be surprised if it’s higher than 250W).

http://videocardz.com/54265/amd-to-launch-300w-gpu-with-high-bandwidth-memory
 
I know a lot of peeps will say a 300W TDP is too much, but the 290X is rated at 290W and in fact uses less at stock. AMD seem to set their TDPs a bit over the odds, while Nvidia seem to put theirs lower than the actual draw.

If this is the 380X to take on the 970 / 980, with a die shrink in the summer for the 390 / 390X, I think it will be fine. Just release this already plz !!
 
This is an enthusiast's card, not one for people worried about leaving the standby light on their telly on.

If it's the quickest single-GPU card, enthusiasts will buy it, irrespective of the power draw.
 
It also needs to be said that while AMD generally give a TDP that's higher than their cards' actual use, Nvidia bend the truth somewhat about theirs.

Let's look at the GTX 980's TDP ('165W') vs the 290X's TDP ('290W') vs reality.

Claimed difference by Nvidia: 125W. Actual difference: 29W.

http://i.imgur.com/8ex80fD.png

http://i.imgur.com/iOaRRKR.png


The reference 980 can actually use up to 280W.
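
To make the comparison concrete, here's a minimal sketch of the arithmetic being argued. The 'measured' draws are hypothetical placeholders chosen only to reproduce the ~29W gap claimed above, not readings taken from the linked charts.

```python
# Spec-sheet TDPs vs hypothetical measured peak draws (W). The measured
# values are placeholders consistent with the ~29 W gap claimed above,
# not figures read off the linked charts.
rated_tdp = {"GTX 980": 165, "R9 290X": 290}
measured  = {"GTX 980": 250, "R9 290X": 279}

claimed_gap = rated_tdp["R9 290X"] - rated_tdp["GTX 980"]
actual_gap  = measured["R9 290X"] - measured["GTX 980"]

print(f"Gap implied by the spec sheets: {claimed_gap} W")    # 125 W
print(f"Gap in the (hypothetical) readings: {actual_gap} W") # 29 W
```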
 
I don't care about power if it has the performance to match. If it is only 10% faster than the 980, that would be a disappointment.
 
Because Furmark is a totally reliable representation of in-game usage...

I think you need to read up on what TDP means if you want to have a go at Nvidia about it.

The top graph also shows total system power draw, not just the GPU's. The second then doesn't make sense, as there is no way the GPU alone is using 280W when a total system running one of the worst stress tests uses 294W.
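
For context, wall-socket graphs like those can only bound the GPU's share of the draw. A rough sketch of the usual subtraction, where the 294W total comes from the post above but the rest-of-system draw and PSU efficiency are invented for illustration:

```python
# Rough sketch: bounding GPU-only draw from an at-the-wall reading.
# The 294 W total is from the post above; the rest-of-system draw
# and PSU efficiency are invented for illustration.
total_at_wall_w = 294    # whole system under a GPU stress test (W)
rest_of_system_w = 80    # CPU, board, drives, fans under the same load (W)
psu_efficiency = 0.90    # wall readings include PSU conversion losses

delivered_w = total_at_wall_w * psu_efficiency   # power reaching components
gpu_estimate_w = delivered_w - rest_of_system_w

print(f"Estimated GPU-only draw: {gpu_estimate_w:.0f} W")  # ~185 W, not 280 W
```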
 

Don't AMD implement throttling in Furmark? Therefore it's meaningless, as the 290X's power usage will be heavily capped by its Power Limit. Both AMD and Nvidia quote 'general usage' TDP figures. It's unfair to compare them based on software that many people class as a "power virus", where one card (the 290X) throttles much more than the other.

Kaaps has already shown that there is a massive difference in power draw for general usage between the GTX 980 and the 290X (1100W quad-SLI vs 1700W quad-CrossFire).
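
Spelling out the arithmetic implied by those quad-card figures (the quoted system draws are this poster's numbers, taken at face value):

```python
# Per-card difference implied by the quoted quad-GPU system draws.
quad_sli_w = 1100     # quoted GTX 980 quad-SLI system draw (W)
quad_xfire_w = 1700   # quoted R9 290X quad-CrossFire system draw (W)
cards = 4

per_card_gap_w = (quad_xfire_w - quad_sli_w) / cards
print(f"Implied per-card gap: {per_card_gap_w:.0f} W")   # 150 W

# Close to the 125 W gap between the official TDPs (290 W vs 165 W),
# which is the point being made against the Furmark comparison.
```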
 
Yer, 28nm has been around now for far too long.


Also in the news but dead boring -

20nm GPUs not happening

Yields bad beyond repair

We want to make sure that you realize that 20nm GPUs won’t be coming at all. Despite the fact that Nvidia, Qualcomm, Samsung and Apple are doing 20nm SoCs, there won’t be any 20nm GPUs.

From what we know, AMD and Nvidia won’t ever be releasing 20nm GPUs, as the yields are so bad that it would not make any sense to manufacture them. It is not economically viable to replace 28nm production with 20nm.

This means the real next big thing will come with 16nm/14nm FinFET from TSMC and GlobalFoundries/Samsung respectively, but we know that AMD is working on Caribbean Islands and Fiji as well, while Nvidia has been working on its new chip too.

This doesn’t mean you cannot pull off a small miracle on 28nm; Nvidia did just that back in September 2014 with Maxwell, proving that you can make a big difference with optimization on the same manufacturing process when a new node is not an option.

Despite the lack of 20nm chips, we still think the next-gen Nvidia and AMD chips will bring some innovations and make you want to upgrade to play the latest games on FreeSync or G-Sync monitors, or at 4K/UHD resolutions.

http://www.fudzilla.com/news/graphics/36721-20nm-gpus-not-happening
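
As a side note on why bad yields make a node uneconomic: a toy Poisson yield model shows how quickly the cost per good die blows up for a big GPU. All wafer costs and defect densities below are invented for illustration.

```python
# Toy Poisson yield model: die yield = exp(-D0 * A), so cost per *good*
# die explodes as defect density rises. All numbers here are invented.
import math

def cost_per_good_die(wafer_cost, dies_per_wafer, die_area_cm2, defects_per_cm2):
    """Poisson yield model: yield = exp(-D0 * A)."""
    die_yield = math.exp(-defects_per_cm2 * die_area_cm2)
    return wafer_cost / (dies_per_wafer * die_yield)

die_area_cm2 = 4.0    # a big ~400 mm^2 GPU die
dies_per_wafer = 120  # rough count for a 300 mm wafer

print(f"mature 28nm:   ${cost_per_good_die(4000, dies_per_wafer, die_area_cm2, 0.10):,.0f} per good die")
print(f"immature 20nm: ${cost_per_good_die(6000, dies_per_wafer, die_area_cm2, 0.50):,.0f} per good die")
# Prints roughly $50 vs $369: a ~7x cost penalty for the newer node.
```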
 
300W? No thanks. And boom: Furmark isn't a game, so it's irrelevant.

+1

Furmark is useless for power consumption testing, as the drivers don't allow the cards to run flat out anyway.

This story may have been a bit more believable if it had included "AMD working on a much improved, quieter cooler", but it did not. :D
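
On the throttling point raised above: a power limiter behaves roughly like the toy feedback loop below, pulling clocks back whenever board power hits the cap, which is why a "power virus" load mostly measures the cap itself rather than gaming draw. All constants here are invented.

```python
# Toy simulation of a power-limit governor under a worst-case load:
# clocks step down until estimated board power sits under the cap.
# All constants are invented for illustration.
power_cap_w = 250      # hypothetical board power limit (W)
clock_mhz = 1000       # starting core clock (MHz)
watts_per_mhz = 0.30   # crude linear power model under full load

for step in range(12):
    est_power = clock_mhz * watts_per_mhz
    print(f"step {step:2d}: {clock_mhz} MHz -> ~{est_power:.0f} W")
    if est_power > power_cap_w:
        clock_mhz -= 20  # governor steps clocks down until under the cap
# Settles around 820 MHz / ~246 W: the reading reflects the cap, not the game.
```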
 