
AMD to launch 300W GPU with High-Bandwidth Memory

So imagine, like you say, if Furmark doesn't allow the cards to max out, yet the 980 is pulling well, well over its TDP. How much would it pull if Furmark could max it out?

Think about it, you thoroughly intelligent people :rolleyes:

A 300W TDP AMD card should be fine; a 300W TDP Nvidia card wouldn't, as it would use about twice that in reality :p
 
Intel, AMD and everyone else (including manufacturers of mobile SoCs) need to agree on a set industry standard for quoting TDP numbers, so that when consumers are purchasing a card and wading through the marketing guff, they know that what they're being told is true and fair.

In defence of Nvidia, they have never made it a secret that their TDPs are based on average power draw rather than full load, since there are no rules governing how you can market a product's power usage, in this country at least.

If I recall, Nvidia were the first to cap their cards when it came to running Furmark; it was around the time of the GTX 480, or maybe shortly afterwards, when they launched a dual-GPU card that famously blew up while being reviewed.
 
So imagine, like you say, if Furmark doesn't allow the cards to max out, yet the 980 is pulling well, well over its TDP. How much would it pull if Furmark could max it out?

Think about it, you thoroughly intelligent people :rolleyes:

A 300W TDP AMD card should be fine; a 300W TDP Nvidia card wouldn't, as it would use about twice that in reality :p

How much would a 290X pull? lol

I am afraid those figures using Furmark mean nothing.

Anyway, you are obviously not referring to my post, as I am quite thick, and I can get plenty of people on these forums to back up the fact that I am stupid. :p:D
 
Nobody gives a flying toss about power consumption in Furmark in either camp; it's just not at all realistic. If you insist on using benchmarks to demonstrate power consumption, then use Unigine, where the 980 pulls about 60W less at stock than a 290X. I'm sure the rest are within a few watts as well.
 
It was done to stop people blowing up GTX 590s, but it also affects other cards.

I did not get a free game with my GTX 590s

I got a free fire extinguisher. :D
 
So imagine, like you say, if Furmark doesn't allow the cards to max out, yet the 980 is pulling well, well over its TDP. How much would it pull if Furmark could max it out?

Think about it, you thoroughly intelligent people :rolleyes:

The point is, if Nvidia throttled their cards as much as AMD do under applications like Furmark, then your graphs wouldn't be so close. The GTX 980 pulls a lot of wattage under Furmark because it's allowed to; if Nvidia set a hard limit of 250W, your Furmark results would show the same power draw differential as in normal usage, and gaming outcomes would be no different. In summary, you're using an extreme case, which artificially inflates Nvidia's power draw compared to the competition, to claim similar power draws across the board, despite overwhelming evidence (including Kaaps' results) that the GTX 980 uses significantly less power in normal usage.
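To make the clamping argument concrete, here is a minimal sketch; every wattage figure in it is an illustrative assumption, not a measurement from this thread:

```python
# Minimal sketch of the power-cap argument. All wattage figures below
# are illustrative assumptions, not measurements from this thread.

def board_draw(demand_w, cap_w=None):
    """Power a card actually draws: its demand, clamped to any hard cap."""
    return demand_w if cap_w is None else min(demand_w, cap_w)

# Assumed demands (W), purely for illustration:
gaming_980, gaming_290x = 185, 250     # typical gaming load
furmark_980, furmark_290x = 280, 309   # power-virus load, if uncapped

print("Gaming gap:           ", gaming_290x - gaming_980, "W")    # 65 W
print("Furmark gap, no cap:  ", furmark_290x - furmark_980, "W")  # 29 W

# With a hypothetical 250W hard limit on the 980, the Furmark gap
# opens back up toward the gaming-style differential:
capped = board_draw(furmark_980, cap_w=250)
print("Furmark gap, 250W cap:", furmark_290x - capped, "W")       # 59 W
```

The numbers illustrate the argument either way: letting one card run uncapped in a power virus compresses the apparent gap, and a hard cap restores something closer to the gaming differential.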
 
Power doesn't matter: a 300W card performing identically to a 980 would suck; if it was 30-40% faster, then it wouldn't. Simple as that.

An actual 300W TDP HBM 28nm GPU would, I would guess, be in the region of 30-40% faster than Hawaii and would blow a 980 away, though it depends on exactly what technology ended up in it. Sometimes you would backport updates, sometimes you would go with the existing tech scaled up.
 
Power doesn't matter: a 300W card performing identically to a 980 would suck;

So power does matter then.

if it was 30-40% faster, then it wouldn't. Simple as that.

If it was 30-40% faster, AMD would certainly have a winner on their hands.

An actual 300W TDP HBM 28nm GPU would, I would guess, be in the region of 30-40% faster than Hawaii and would blow a 980 away, though it depends on exactly what technology ended up in it. Sometimes you would backport updates, sometimes you would go with the existing tech scaled up.

So you have no idea then :D

Good post though and only a couple of lines :p
 
Starting to regret buying that 520W PSU now :( Thought things were going all eco, so that would be all I would ever need, lol. It's 4 years old now though, so maybe an upgrade wouldn't be the end of the world.
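For what it's worth, a back-of-the-envelope check suggests 520W would be tight but workable for a 300W card; every component figure below is an assumption, not a measurement of any particular build:

```python
# Rough headroom estimate for a 520W PSU driving a rumoured 300W card.
# Every component figure below is an assumption for illustration only.
psu_w  = 520
gpu_w  = 300   # the rumoured card's TDP
cpu_w  = 100   # assumed CPU under gaming load
rest_w = 50    # assumed motherboard, RAM, drives, fans

load_w = gpu_w + cpu_w + rest_w
print(f"Estimated load: {load_w}W of {psu_w}W ({load_w / psu_w:.0%})")
# ~450W of 520W (~87%): workable, but little headroom left for
# overclocking or transient spikes.
```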

I'm not really sure what to make of everything that's happened in the GPU world since the 7950 era. I kinda wanna upgrade and go 1440p, but then I keep wondering if a single card at 4K will be a possibility in the near future? God damn PCs!

On a completely unrelated note: does anyone know how the bandwidth of an HBM-equipped card would compare against a PCIe 3.0 x16 slot? Is it likely to be saturating it, or somewhat close to saturating it?
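For rough context, here is a back-of-the-envelope comparison, assuming the first-gen HBM figures that were circulating at the time (four 1024-bit stacks at 1 Gbps per pin, ~512 GB/s total). Note that HBM bandwidth is between the GPU and its own memory, while the slot only carries host-to-GPU traffic, so the two links never compete directly:

```python
# Back-of-the-envelope comparison. The HBM figures are assumptions based
# on first-gen HBM specs as rumoured for this card, not confirmed numbers.

# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding
pcie3_lane_gbs = 8e9 * (128 / 130) / 8 / 1e9   # ~0.985 GB/s per lane
pcie3_x16_gbs = pcie3_lane_gbs * 16            # ~15.75 GB/s per direction

# Assumed HBM config: 4 stacks, 1024-bit bus each, 1 Gbps per pin
hbm_stack_gbs = 1024 * 1 / 8                   # 128 GB/s per stack
hbm_total_gbs = hbm_stack_gbs * 4              # 512 GB/s

print(f"PCIe 3.0 x16: {pcie3_x16_gbs:.2f} GB/s per direction")
print(f"HBM (4 stacks): {hbm_total_gbs:.0f} GB/s")
print(f"Ratio: ~{hbm_total_gbs / pcie3_x16_gbs:.1f}x")
```

In other words, the slot is roughly 30x slower than the rumoured on-card bandwidth, but since textures and buffers live in VRAM, HBM wouldn't saturate the slot; the PCIe traffic stays whatever the game actually sends across it.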
 
Nvidia bend the truth somewhat about their TDP.

Let's look at the GTX 980's claimed TDP of 165W vs the 290X's claimed TDP of 290W, vs reality.

Claimed difference: 125W. Actual difference: 29W.


A reference 980 can actually use up to 280W.
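Laying those figures out as a quick check (the 290X peak here is back-derived from the stated 29W gap rather than taken from an independent measurement):

```python
# Sanity check on the figures quoted above. The 290X peak draw is
# inferred from the stated 29W gap, not independently measured.
claimed_tdp   = {"GTX 980": 165, "290X": 290}        # vendor TDP claims (W)
measured_peak = {"GTX 980": 280, "290X": 280 + 29}   # 980 "up to 280W"

claimed_gap  = claimed_tdp["290X"] - claimed_tdp["GTX 980"]
measured_gap = measured_peak["290X"] - measured_peak["GTX 980"]

print(f"Claimed TDP difference: {claimed_gap}W")   # 125W
print(f"Measured difference:    {measured_gap}W")  # 29W
```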

It's not the only thing they "bend" reality around, either; remember them showing cards held together with wood screws, etc...

I look forward to what AMD has cooked up.
I am hungry, so it's going to be good.
 