Radeon RX 480 "Polaris" Launched at $199

Physics: you use 1 watt of energy, it's converted into 1 watt of heat.

TDP is a measure of the recommended cooling solution needed to dissipate that energy; a TDP rating is typically higher than the energy (watts) actually used.

With a TDP of 300 watts, for example, the add-in board partner knows to use a cooler with a minimum heat-dissipation capacity of 300 watts, or 600 watts if they want it to be really good at it.

TDP is related to power because the conversion is direct; a TDP rating, however, is not the power consumption, as it can be set as desired independently of power consumption.
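
As a minimal worked example of that direct conversion (the numbers here are illustrative, not any particular card's): at steady state essentially all the electrical power drawn leaves the card as heat, so

\[ P_{\text{heat}} = P_{\text{draw}} = 150\ \text{W} \;\le\; \text{TDP} = 180\ \text{W} \;\le\; P_{\text{cooler rating}} \]

i.e. the partner picks a TDP at or above the expected draw, then fits a cooler rated for at least the TDP.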

Thermal Design Power

Indeed, it's very simple.
 
But that's assuming they've set the TDP at the lowest value they can.

You acknowledged that the card's power consumption was 275 watts, and by thermodynamics, wattage in is heat produced. So either AMD has lied about the typical power consumption, or the TDP isn't actually the absolute minimum amount of thermal dissipation needed.

Unless you're defining a 375 watt card as a card with a TDP of 375.

You're thinking that TDP means "heat given out". It doesn't. It seems like it should, but it doesn't. Watch that Linus video from earlier. TDP factors in expected usage, maximum power draw, minimum power draw... and different companies have different ways of calculating it. Intel's TDP figures for their chips are calculated differently from AMD's TDP figures, for example, and cannot be directly compared. You'd think they could be, because you think TDP just means "what size cooler do I need", but you can't compare them, because they're not real numbers - they're projections. For example, here is what Nvidia gave out as the figures for their GeForce 9800 GX2 (going back a bit, but it will serve as an example!).

GeForce 9800 GX2 (default clocks)
– TDP: 197 Watts
– IDLE: 191 Watts
– LOAD: 390 Watts

Note how it could be drawing 390 Watts, but the TDP, which is a descriptor of the chip's behaviour, is 197 Watts. At another point it might only be drawing 191 Watts, but the chip would still have a TDP of 197 Watts. That's because, as Humbug has been saying, TDP isn't a measurement. It's a rating given to the chip by its manufacturer, calculated to a formula created by themselves (and which differs between manufacturers).

That's why D.P.'s attempt to make pronouncements about power draw based on it was flawed. You can infer some things and make some educated guesses, to an extent. But you can't use it to declare Humbug wrong. You probably shouldn't use it at all for an argument like this.
 
I provided a link that showed the Fury X pulling 300w, higher than the 275w figure that AMD states. The same link showed the 290X pulling 286w, well over AMD's published 250w figure.

http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,10.html

To take it back to what we are actually talking about here.

The 290X has power inlets rated for 300 watts.
The Fury X has power inlets rated for 375 watts.

As to the question of whether or not AMD tend to use more power than their TDP rating ('your words'): they don't. We don't really know what TDP AMD rate their GPUs at; they don't tend to publish that info, and it's pretty useless anyway, as I explained.

So, your original argument, to put it more accurately, was that AMD use more power than they say they do, right?
And therein you argued that the Polaris 10 TDP (forget about that, let's just use the real measure, power consumption) would be more than 150 watts, as AMD also underrate TDP / power consumption, call it what you will... right?

Well, I reiterate: the Fury X has power inlets rated for 375 watts and uses less than 300 according to your own links.
The 290X has power inlets rated for 300 watts and again uses less...

Now, Polaris 10: power inlets rated for 150 watts. So the evidence, your evidence, suggests an actual power consumption of less than that, not more, as you argued (while not understanding the difference between TDP and power consumption, which was also completely irrelevant to your argument - proof you never understood it at all).

Right?

Fury X: power inlets 375 watts, uses less.
290X: power inlets 300 watts, uses less.
Polaris 10: power inlets 150 watts, uses less.
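
(For reference, those inlet figures are just the PCI Express power-delivery limits added up: 75 W from the slot, 75 W per 6-pin connector, 150 W per 8-pin. A single 6-pin on the Polaris 10 reference board is an inference from the 150 W figure, not something confirmed here.)

\[ 75\ \text{W (slot)} + 2 \times 150\ \text{W (8-pin)} = 375\ \text{W} \quad \text{(Fury X)} \]
\[ 75\ \text{W (slot)} + 75\ \text{W (6-pin)} + 150\ \text{W (8-pin)} = 300\ \text{W} \quad \text{(290X)} \]
\[ 75\ \text{W (slot)} + 75\ \text{W (6-pin)} = 150\ \text{W} \quad \text{(Polaris 10)} \]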

All soooooo simple.
 
On a slightly different note, can anyone tell me what exactly 300 watts of thermal dissipation actually is?

That is to say, it certainly won't keep the temperature of something producing 300 watts of heat at room temperature, and even then external factors may affect this, so I'm wondering what it means in practice.
 
You're thinking that TDP means "heat given out". It doesn't. [...] TDP isn't a measurement. It's a rating given to the chip by its manufacturer and calculated to a formula created by themselves (and which differs between manufacturers).

Indeed, you could have a 200 watt card with a 600W TDP just because they want it to run super cool, even though a 250W TDP would still do the job, just not as cool.
 
Ah thanks

Oh, I realised that TDP was a calculation given by the manufacturer. Though I was (wrongly) working on the assumption that a card's TDP was higher than its peak power consumption plus whatever amount of headroom they wanted to give, which is why I didn't agree that the Fury X was a 375 watt card.

I agree that a 150 watt TDP doesn't mean the 480 will use 150 watts, but then I guess I can't say what it will use.
 
You're thinking that TDP means "heat given out". It doesn't. [...] TDP isn't a measurement. It's a rating given to the chip by its manufacturer and calculated to a formula created by themselves (and which differs between manufacturers).



The problem is that a lot of review sites use TDP and power draw interchangeably, even though there are important differences.


The bottom line is AMD state an official power draw for the 290X of 250w and for the Fury X of 275w. Both cards frequently pull significantly more when under load.
 
Oh, I realised that TDP was a calculation given by the manufacturer. Though I was (wrongly) working on the assumption that a card's TDP was higher than its peak power consumption plus whatever amount of headroom they wanted to give [...]

Peak power is higher than TDP, typically by a factor of 1.5x.
 
The HD7850 2GB was rated up to 150W, with a board power of 130W, and the card consumed under 100W on average and at peak:

https://tpucdn.com/reviews/AMD/HD_7850_HD_7870/images/power_average.gif
https://tpucdn.com/reviews/AMD/HD_7850_HD_7870/images/power_peak.gif

The HD7870 had two six-pin power connectors and consumed under 120W on average and at peak, and was rated for a board power of 175W.

Board power rating and TDP are not the same as power consumption.

In fact, we have no clue what the TDP of the RX480 is.

[AMD slide showing the RX480's 150W figure]


That 150W figure is the board power rating, i.e. what the board is rated to deliver, like the 130W figure for the HD7850 and the 175W figure for the HD7870.

This takes into consideration the extra power for pre-overclocked versions and user overclocks.
 
On a slightly different note, can anyone tell me what exactly 300 watts of thermal dissipation actually is?

Not sure if you're asking someone to define it, or if you just want an idea of how much it is.

If you're after a definition, then on the understanding that we're now talking about actual thermal dissipation rather than TDP (which is a manufacturer's arcane estimate), yes - we can answer. :)

300 watts of thermal dissipation is what it sounds like: it's the amount of heat transferred. Watts are joules (a unit of energy) per second, so 300 watts means transferring 300 joules of energy every second. Which is a lot, by the way.
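
To put a number on "a lot", here's a quick back-of-the-envelope sum (assuming a litre of water with a specific heat of about 4186 J/(kg·K), nothing card-specific):

\[ \Delta T = \frac{P\,t}{m\,c} = \frac{300\ \text{W} \times 60\ \text{s}}{1\ \text{kg} \times 4186\ \text{J/(kg·K)}} \approx 4.3\ \text{K} \]

So 300 watts would heat a litre of water by roughly 4.3 degrees every minute.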

TDP you can think of as a company's estimate of its thermal dissipation under the circumstances it expects the chip to run in. Same units (joules per second), but one is an actual measurement and the other is a chip rating.

That is to say, it certainly won't keep the temperature of something producing 300 watts of heat at room temperature, and even then external factors may affect this, so I'm wondering what it means in practice.

Okay, now if you want to know how much 300W is in real terms... it's hard to find good figures for the thermal dissipation different coolers can provide. Here is a very beefy example though:

http://koolance.com/ex2-755-exos-2-v2-liquid-cooling-system-aluminum

It says that it can dissipate 590 watts, which is a huge amount. The part about the 25C ambient delta is because cooling varies according to the environment it runs in (as you would imagine), i.e. if it were trying to dissipate the heat into a hot room, it would transfer less than in a cold room.
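
As a rough sketch, you can turn that quoted spec into an effective thermal resistance (this treats the cooler as a single linear resistance, which real coolers only approximate):

\[ R_{\theta} = \frac{\Delta T}{P} = \frac{25\ \text{K}}{590\ \text{W}} \approx 0.042\ \text{K/W} \]

so moving 300 W through it would need the coolant to sit about \( 300 \times 0.042 \approx 12.7 \) K above ambient, which is why nothing keeps a 300 W load at room temperature.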

Not sure if all that answers your question but I hope it helps.
 
Going from my previous post, here are the equivalent slides for the HD7850 and HD7870:

http://i.imgur.com/gm1noKo.jpg

TPU uses some decent equipment to measure card power consumption - the reference HD7850 and HD7870 didn't get close to their rated board power anyway.

Even at peak, the HD7850 and HD7870 were 30W to 60W under their maximum rated board power.
 
The problem is that a lot of review sites use TDP and power draw interchangeably, even though there are important differences.

No, the problem is that you didn't understand the difference and were trying to make assertions about AMD's cards based on it. What you have given above is just the explanation of why you didn't know you were wrong.

The bottom line is AMD state an official power draw for the 290X of 250w and for the Fury X of 275w. Both cards frequently pull significantly more when under load.

Again, no. The bottom line is that, like tavtavtav said, we simply don't know. What you've done above is AGAIN shown a lack of understanding of TDP by talking about "official power draw" and then saying "both cards frequently pull significantly more". Yes. Pretty much all cards frequently pull more than their TDP. That's how it works. TDP is not shorthand for "maximum possible draw".

You contrive to make things sound vaguely bad over and over without ever actually considering (or allowing) context. Yes, you can say "both 290 and FuryX frequently pull significantly more when under load". But you could equally accurately say "all GPUs frequently pull more when under load". Your post comes across as an attempt to fast-talk people who aren't familiar with the terms into hearing "these cards did something other than normal".

I'm really surprised to see you still arguing this one.

EDIT: Also, what CAT said. :)
 
No, the problem is that you didn't understand the difference and were trying to make assertions about AMD's cards based on it. What you have given above is just the explanation of why you didn't know you were wrong.

No, I understand the difference, but in common parlance they are used interchangeably, certainly in GPU reviews. I'm not making any assertions, I am simply pointing out the facts that some posters conveniently get wrong.

Again, no. The bottom line is that, like tavtavtav said, we simply don't know. What you've done above is AGAIN shown a lack of understanding of TDP by talking about "official power draw" and then saying "both cards frequently pull significantly more". Yes. Pretty much all cards frequently pull more than their TDP. That's how it works. TDP is not shorthand for "maximum possible draw".

I have repeatedly said that peak power draw is significantly higher than TDP, or are you just being selective in your reading?



The bottom line is that the power consumption AMD states is frequently less than the real world usage; that is a fact. You can argue over semantics until the cows come home, but the facts speak for themselves.

You contrive to make things sound vaguely bad over and over without ever actually considering (or allowing) context. Yes, you can say "both 290 and FuryX frequently pull significantly more when under load". But you could equally accurately say "all GPUs frequently pull more when under load". Your post comes across as an attempt to fast-talk people who aren't familiar with the terms into hearing "these cards did something other than normal".
I'm not trying to make anything sound bad, I'm pointing out someone was wrong with basic facts. AMD make certain claims, real world testing shows different results. There is no discussion there. I'm not talking about artificial load tests like FurMark, but real world gaming.


I'm really surprised to see you still arguing this one.

EDIT: Also, what CAT said. :)


I'm surprised you still don't understand the entire point of the debate.



Edit: I don't want to be argumentative here, and I fully admit that I have intermixed TDP and average board power, which may lead to confusion. The point I am making is incredibly simple and backed up by 3rd party evidence.
 
Just to add some real info to loosen up the AMD bashing.
This is an article from March with some interesting bits:
http://www.anandtech.com/show/10145/amd-unveils-gpu-architecture-roadmap-after-polaris-comes-vega

"Meanwhile AMD has also confirmed the number of GPUs in the Vega stack and their names. We’ll be seeing a Vega 10 and a Vega 11. This follows Polaris GPU naming – which has finally been confirmed – with Polaris 10 and Polaris 11. I have also been told that Polaris 11 is the smaller of the Polaris GPUs, so at this point it’s reasonable to assume the same for Vega."
 
No, I understand the difference, but in common parlance they are used interchangeably, certainly in GPU reviews. I'm not making any assertions, I am simply pointing out the facts that some posters conveniently get wrong.

In common parlance I've heard people call the PC case and everything in it "the hard drive". Would that make them right, would it make it appropriate to use that terminology on a technical forum like OCUK and argue till you're blue in the face that other people should accept an inaccurate term because some others have misused it; perhaps most importantly of all, would you keep defending an invalid point by trying to assert we should use an incorrect definition? But even to argue that much with you is to validate an error on your part. It doesn't matter how many people misunderstand what TDP is (and I reject the idea that there's any "common parlance" on a phrase half a percent of the English speaking world would even recognise), it doesn't make your statements correct. You can't say X implies Y and then when it's pointed out that you don't understand X claim "well most people don't understand X so Y is still implied". We're not arguing over how many people don't know what TDP is, we're arguing over whether you are right to conflate it with power draw and you are not. That is demonstrated. Actual manufacturer figures show them as different things.

As to "conveniently forgetting", the only person here to whom it is convenient to misinterpret TDP as power draw is yourself because it supports your argument.

I find you doubling-down on a mistake when called on it worse than if you just admitted it. You do more and more damage to your credibility the longer you persist.

I have repeatedly said that peak power draw is significantly higher than TDP, or are you just being selective in your reading?

A moment ago you were defending the usage of TDP for power as acceptable because of "common parlance". No, I've read all of your posts in the last dozen pages or so. You have made basic errors that cannot be disguised as "using the terms interchangeably". You said that power draw was directly related to TDP. It is not. This is demonstrated. Your statements are statements that could only be made by someone who did not understand what TDP was or was deliberately lying to misrepresent something.

When challenged you repeatedly fall back on generic statements superficially related but not actually carrying your point. So peak power draw is significantly higher than TDP... and? I could make vague and true statements and pretend I was actually advancing an argument as well, but I don't. There's no argument you are making with the above. It's just a factoid held up in response to someone who is correcting you. Any second now, I have the feeling you'll tell me that Chewbacca was a wookie and he lived on Endor.


The bottom line is that the power consumption AMD states is frequently less than the real world usage; that is a fact. You can argue over semantics until the cows come home, but the facts speak for themselves.

Huh? Making my best guess at what you're trying to say here, I refer you to my earlier post about how you post weirdly selective statements to imply failure or wrongdoing. You state that AMD's TDPs are less than real world usage. Well firstly, I'd like to see these TDPs, because AMD rarely publish them as far as I'm aware. So a few specific examples of the TDPs AMD have given will do. Any relatively recent AMD GPUs are fine by me. We'll all be waiting for you to back up your comments here. Secondly, and much more importantly, what you say is true of all GPUs (and CPUs as well) if your "real world usage" is peak power draw. Yet again, you make some vaguely critical sounding comments about AMD, but they're actually just general statements about GPUs.

I'm not trying to make anything sound bad, I'm pointing out someone was wrong with basic facts. AMD make certain claims, real world testing shows different results. There is no discussion there. I'm not talking about artificial load tests like FurMark, but real world gaming.

Wonderful. Then if you had understood what TDP is, you would never have brought it up. And again, please post some statements from AMD about what power draw their cards pull under "real world gaming" (nice vague term you just inserted into the argument there, by the way - gives you lots of wiggle room to back out of this debacle, doesn't it?). Please, go ahead - I want to see AMD's statements on how much power the 290 is supposed to draw under "real world gaming" so you can show how they misrepresented their cards. And I'll assume that as you claim to have known all along what TDP is, you won't just be quoting that back at us any more.
 
When's this thing landing in stores, 29th?

Looking to build a budget gaming rig with it, able to run The Witcher at 1080p 60fps. I'll probably grab a used Z87 board and i5/i7 chip, as that seems to be the best bang-for-your-buck second hand option. Sub £450 in total; I already have a monitor and accessories.

I had a 970 before and that ran it fine with a few things turned down, so I'm assuming this RX 480 will be a good bit better?
 
@h4rm0ny, as I'm sure you're well aware, he used that whole TDP argument, completely misunderstanding the whole concept (through a genuine lack of knowledge, or deliberately), as a way to add weight to his assertion that AMD's cited 150 watts for the Polaris 10 'power range' was a typical AMD underestimation of their GPUs' power draw, so as to use that as a form of 'proof' that AMD's new architecture is inefficient, i.e. over 150 watts.

However, there is one completely fundamental flaw in his argument besides his ill-judged TDP adventure.

He proved with his own links that there is one universal truth to all of this...

In general, GPUs, including AMD's, draw less power than their board power design, of course.

The board power design of the Fury X is 375 watts; his links show it uses less than 300. The 290's board power design is 300 watts, and that card also has a power draw of less than 300 watts.

The Polaris 10 GPU has a board power design of 150 watts.

The TDP argument is completely irrelevant, and it always was. It's a strange argument to have used to begin with: aside from being utterly wrong, it's a flawed abstract argument against a fundamental one. It was lost as soon as it was made. :)
 
In common parlance I've heard people call the PC case and everything in it "the hard drive". Would that make them right? [...] Please, go ahead - I want to see AMD's statements on how much power the 290 is supposed to draw under "real world gaming" so you can show how they misrepresented their cards.




Your entire post is utterly ridiculous :rolleyes:
It is quite simple. AMD claim a power consumption for the 290X of 250w and for the Fury X of 275w. Both cards actually use 20-30 watts more than stated during gaming. All the evidence has been presented, including official AMD documentation from their website.


You can go on and on being anal retentive about language but the facts are the facts.


When it comes to Polaris 10, AMD have publicly stated less than 150w. We won't know the true figures until the reviews.
 