
AMD to launch 300W GPU with High-Bandwidth-Memory

LinkedIn profiles of AMD employees revealed two important facts.
AMD Fiji: 300W 2.5D GPU with HBM

The first, the profile of Linglan Zhang, System Architect Manager (PMTS) at AMD, revealed that development of the first GPU with HBM has been completed.

Developed the world’s first 300W 2.5D discrete GPU SOC using stacked die High Bandwidth Memory and silicon interposer – Linglan Zhang

We’ve been speculating about Fiji silicon purportedly using High Bandwidth Memory (HBM) for almost a year. As it turns out, it could be true after all. The most disappointing part of the quote is definitely the power consumption, as 300W is more than we expected. The fact that AMD was forced to manufacture a GPU with such a high TDP can only mean that it is still made on the 28nm process. NVIDIA’s Maxwell GM200 will likely feature lower power consumption (I will be surprised if it’s higher than 250W).

http://videocardz.com/54265/amd-to-launch-300w-gpu-with-high-bandwidth-memory
 
I know a lot of peeps will say a 300w TDP is too much, but the 290X is rated at 290w and in fact uses less at stock. AMD seem to add a bit over the odds with their TDP, while Nvidia seem to put TDP lower than it actually is.

If this is the 380X to take on the 970 / 980, and then a die shrink in the summer for the 390 / 390X, I think it will be fine. Just release this already plz !!
 
Also it needs to be said that while AMD generally give a TDP that's higher than their actual use, Nvidia bend the truth somewhat about theirs.

Let's look at the GTX 980's claimed TDP (165w) vs the 290X's claimed TDP (290w) vs reality.

Claimed difference on paper: 125w. Actual measured difference: 29w.
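Quick sums with the figures from this thread. The TDPs are the rated numbers; the 29w gap is the measured difference from the charts, and the implied 980 draw below is just an inference that assumes the 290X really pulls around its rated 290w under load - not a measurement:

```python
# Claimed TDPs (watts) - figures quoted in this thread
tdp_980 = 165
tdp_290x = 290

# Difference the spec sheets imply
claimed_diff = tdp_290x - tdp_980
print(claimed_diff)  # 125

# Measured gap between the two cards per the charts: 29w.
# If the 290X really draws about its rated 290w under load,
# that puts the 980 at roughly:
actual_diff = 29
implied_980_draw = tdp_290x - actual_diff
print(implied_980_draw)  # 261
```

That ~261w inference sits well above the 980's 165w rating, which is the whole point being made here.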

Reference 980 can actually use up to 280w..
 
So imagine, like you say, if Furmark doesn't allow the cards to max out, yet the 980 is pulling well over its TDP. How much would it pull if Furmark could max it out?

Think about it you thoroughly intelligent people :rolleyes:

A 300w TDP AMD card should be fine, a 300w TDP Nvidia card wouldn't be, as it would use about twice that in reality :p
 
The power consumption on Maxwell is not universal across all workloads.

In some games it's pretty good, in other games it's no better than a 290.

+1

Some of these guys have no idea, they just spout what they read. There are loads of benchmarks showing 980s pulling massive watts during gaming, just as much as a Titan Black etc..

I own a 290X and a 980, I know how much power they use. Facts never stop the sheep though :p:D

Lulz.

Personally looking forward to AMD's next card, if it's a power sucking performance monster so be it. Those who are all of a sudden interested in saving the planet and tree hugging can buy a GTX 960 everybody's a winner :D

Hurry up and release the monster plz AMD !!

Also good call on 20nm Orangey, looks like it's a turnip. I guess it's a long wait for 16nm now then? http://www.fudzilla.com/news/graphics/36721-20nm-gpus-not-happening
 
If nVidia released a 300W beast I would but not AMD :)

Haha :D

I can't wait for AMD to release this thing, because then we're moving closer to a more fully enabled Maxwell card, and we can pick between the two monster cards.

I want Maxwell with more cores and at least 8GB of VRAM. If not I'll prob go 390X. AMD do need to fix idle temps / power use on multi screens though, atm it's dire. Hopefully stacked mem can address this?

I don't care about TDP on either card, because when you overclock, TDP goes out the window anyway.

My 980 is @ 1500mhz and sucks a fair bit of juice.
 
any chance you could answer that, boomstick? or anybody?

crysis3 noted, thanks Kaap


http://www.tweaktown.com/reviews/68...0-4gb-strix-oc-video-card-review/index16.html

Yeah, at default settings the 980 is indeed more power efficient than its predecessors, and more so than the 290X. Once you add an overclock or really stress the GPU @ stock, that low TDP goes out the window :D

I've got a GTX 980 which sits @ 1500 core all day long for gaming, it pulls about as much from the wall as my old Titan Black did at about 1200 core.

I just get whichever card is fastest, atm that is the 980. I don't really consider the TDP in broad terms because I'm not gonna be running at stock if you see what I mean.

If the 300w AMD card is faster than the 980, then I'll grab that, and vice versa if it's GM200. Once a single card can handily cope with 1440P, then I'll pick whichever uses the least power at that performance level. We're just not there yet. Probably 2016 for a midrange (power efficient) card to do really well @ 1440P imho. Then I can settle for a lower-tier card that doesn't need as much juice. (Unless I move to 4K :p)

And we're back to the ****ing power consumption topic. This has been done over and over, and the people who realize this is a stupid topic also realize it's only relevant if your PSU is tiny and can't really handle 500-600W (which my OCed system uses, for reference). But I doubt it; you people who have 5960Xs and 980s make this argument as if it's a problem you have lol.

Electricity bills? Oh please, if you're that poor you shouldn't sit on OcUK and waste time.

+1, I won't mention it again. Just answering his question.
 
that's 3dmark, I'm asking about those games you were talking about :)



It's not about how much the cards consume, I don't really care. I made a thread and had a poll when I bought my 970 because I couldn't decide between that and a 290X - it was that close that I threw the vote to the forums. Boomstick mentioned the massive watts pulled by the 980 during gaming - I just wanted to know what games, because I can't find anything to suggest that. So no, it's not about power consumption - it's about misinformation.

I guess I'll pull my power meter out and do my own testing then, nobody's shown me anything yet :p


The power consumption difference during gaming (any game, take your pick) is negligible between an OC GTX 980 and an OC Titan Black. I don't know how else to write it. I've tried all the cards personally. This is my own experience, not stuff I've read..

Any power efficiency you get at stock is lost once you overclock. I assume most of us here are overclockers, so buying a GPU based on efficiency seems moot when you consider that when overclocked that efficiency goes out the Window.
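A rough sketch of why that happens: CMOS dynamic power scales roughly with frequency times voltage squared, so even a modest overclock with a voltage bump erases a stock efficiency lead. The 165w base and the +20% clock / +10% voltage figures below are illustrative assumptions, not measurements of any card:

```python
# Rule of thumb for CMOS silicon: dynamic power ~ frequency * voltage^2.
# Example numbers are illustrative, not measurements of any real card.
def dynamic_power(base_power, f_ratio, v_ratio):
    """Estimate power after an overclock from frequency/voltage ratios."""
    return base_power * f_ratio * v_ratio ** 2

# e.g. a 165w-class card pushed ~20% on clocks with ~10% extra voltage:
oc_power = dynamic_power(165, 1.20, 1.10)
print(round(oc_power))  # ~240w - the stock efficiency lead is gone
```

The voltage term is the killer: it comes in squared, so the watts climb much faster than the clocks do.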

Had the 980 been on a die shrink the power savings would have been greater, but there's only so much magic you can achieve on the same node. Maxwell saves most of its power by cutting consumption when the workload isn't keeping its shared resources busy, not at full GPU load. Maxwell + die shrink would have given a much larger drop in power use, as peak-load power would also have dropped (at the same performance level).

Think we should put this to bed now lol.

Whichever card has most performance / OC headroom will get my money.
 
Everyone needs a shrink. I think the fact that the hot topic on Maxwell thus far has been power consumption rather than outright performance tells us 28nm is dead, dead boring.

28nm on gpus is now 3 years old...time for it to die a peaceful death and for amd and nvidia to progress.

Yeah, a die shrink is long overdue. I wonder what the profit margin is on 28nm at this point..
 
I get all this talk of TDP and how, even if you cool the card, the heat still goes somewhere. But then I remember my 470 and how hot it got, and then I stuck a Gelid Icy Vision on it and it knocked 20+ degrees off, and all that heat seemed gone - or was it? Jesus! Anyhow, doesn't look like a 4K killer card is coming anytime soon, not while on 28nm. :(

4K killer card is years away, prob 2017 for a good 4K single GPU.
 
It makes me wonder, with the R9 300 series launching in 2015, the GM200 based GPUs in the next month or two, the GTX960 in a week or so, and so on, whether we will still have 28nm GPUs well into 2016?

You would expect at least a 1 year shelf life for all these launches,maybe a bit longer??

It's annoying that AMD have not even got a full new midrange series out now - the R9 285 is not even fully enabled either, and the R9 280, R9 280X, R9 270X, R9 270 and R7 265 are around three years old too.

They are literally just helping Nvidia.

Lol, yeah it really is time to move on. Fingers crossed we get a die shrink by late summer this year, if not very early 2016. Hopefully transitions to other nodes won't be as difficult as moving off 28nm has been, i.e. I don't want to move from 28nm only to get stuck at 16nm/14nm for another 3 years :p

Hopefully this is a bump in the road and things will move along faster.. The race for 4K might dictate a faster pace.. I hope so anyway..
 
4K screens are becoming more attainable, so the demand for GPUs that can game @ 4K will rise and drive the market.

We've stagnated at 1080P for too long imho. Cards from two gens ago can cope with this res. It's time to move forward: next gen screens and next gen cards to power them.

I bet AMD / Nvidia are more than happy to churn out another round of 28nm parts. The savings for them must be lovely, and people are buying them in droves as well. The 970 / 980 have sold 1 million since Sept, so the market is def ready for something even better..

28nm is a bump in the road, a big one, and I'm not too sure if AMD / Nvidia are even in a rush to move on.. Once real competition kicks off @ 4K, performance improvements will be massive again, just like when we went from 1280 x 1024 to 1920 x 1080.
 