Polaris refresh?

If a Polaris refresh happens so soon, that would surely be indicative of an AMD rush job on the first Polaris cards? :D

Definitely. They needed to get something out the door, so they did.

Are you going off that same golden sample Jayztwocents was given?
It'll be interesting to see how others fare with the same card.
 
Definitely. They needed to get something out the door, so they did.

It'll be interesting to see how others fare with the same card.

Agreed. My comment was a bit tongue-in-cheek, as I've read so many others talking about the Nvidia 10 series being a "rush job". Yeah, right...

Looking forward to early next year, when hopefully we'll see new cards from both sides, or at least a refresh anyway.
 
Not sure it's a rush job as such. If what I think is the case is true, AMD were forced to choose between making the best of what the process could do or waiting the ~6 months of development it would take to work out a certain issue with 14nm.
 
If a Polaris refresh happens so soon, that would surely be indicative of an AMD rush job on the first Polaris cards? :D

Point the finger at Nvidia, mate, and the rush job that is the 1080, with the power & voltage limiters imposed on it as well.
No other chip EVER started throttling from 32C.

Not at AMD.
 
Point the finger at Nvidia, mate, and the rush job that is the 1080, with the power & voltage limiters imposed on it as well.
No other chip EVER started throttling from 32C.

Not at AMD.

Part of the problem there is that the chip is pushing way past the kind of clock speeds 16nm FF+ was expected to support - arguably anything over ~1.86GHz is expecting a lot from a ~300mm2 die.
 
Point the finger at Nvidia, mate, and the rush job that is the 1080, with the power & voltage limiters imposed on it as well.
No other chip EVER started throttling from 32C.

Not at AMD.

Eh?

Sure, there seems to be some voltage/power limit causing a brick wall for clocks around 2114 MHz, no matter the board/manufacturer. But what are you smoking, saying it throttles from 32C?

The standard throttle temp is ~82C, and upping the power limit increases that to ~92C.
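If you want to see for yourself where your card starts pulling clocks back, a minimal sketch along these lines (assuming nvidia-smi is installed and on the PATH, a single GPU, and a driver that supports these query fields) will log clock, temperature and power once a second:

```python
# Minimal monitoring sketch (assumes nvidia-smi is on the PATH and a single GPU).
# Logs the graphics clock, GPU temperature and power draw once a second so you
# can see roughly where the ~82C (or ~92C with a raised limit) throttle kicks in.
import subprocess
import time

FIELDS = "clocks.current.graphics,temperature.gpu,power.draw"

def sample():
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
        text=True,
    )
    # One CSV line per GPU; take the first card only.
    clock_mhz, temp_c, power_w = [v.strip() for v in out.splitlines()[0].split(",")]
    return int(clock_mhz), int(temp_c), float(power_w)

if __name__ == "__main__":
    while True:
        clock, temp, power = sample()
        print(f"{clock:>5} MHz  {temp:>3} C  {power:>6.1f} W")
        time.sleep(1)
```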
 
Not sure where 32C came from, but some people have shown better results if you can keep it below 42C.

For some reason my 1070 always thermal throttles once it goes over ~80C, even if I increase the power/temp target, but with a slight voltage bump it can run at mostly ~2100MHz, with the odd drop to 202xMHz due to the design power limit.
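For what it's worth, if you want to check whether it's the thermal limit or the power cap pulling the clocks down at any given moment, a rough sketch like this just filters nvidia-smi's performance query for the limiter flags (assuming a reasonably recent driver; the exact wording of the throttle-reason lines varies between driver versions):

```python
# Rough sketch: show which limiter nvidia-smi says is active (thermal slowdown,
# SW power cap, etc.). Assumes nvidia-smi is on the PATH; the exact wording of
# the throttle-reason lines varies a little between driver versions.
import subprocess

out = subprocess.check_output(["nvidia-smi", "-q", "-d", "PERFORMANCE"], text=True)

for line in out.splitlines():
    # Keep only the lines naming a throttle/slowdown reason and its state.
    if any(key in line for key in ("Thermal", "Power", "Slowdown", "Throttle")):
        print(line.strip())
```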
 
Part of the problem there is that the chip is pushing way past the kind of clock speeds 16nm FF+ was expected to support - arguably anything over ~1.86GHz is expecting a lot from a ~300mm2 die.

But in a way it worked out for Nvidia: it forced AMD to push the 470/480 way past the sweet spot, and that is why they use so much power.

Added: I think Nvidia took AMD by surprise by clocking so high.
 
But in a way it worked out for Nvidia: it forced AMD to push the 470/480 way past the sweet spot, and that is why they use so much power.

Added: I think Nvidia took AMD by surprise by clocking so high.

As much as anything, I think they hit the same problem TSMC did at an earlier stage of development, where as you approach 300mm2 power efficiency, etc. was for some reason going out the window. I'm convinced, given all the aspects of the 480, that they hit the same kind of thing with 14nm, but being behind TSMC (despite claims to the contrary by certain people) they were forced to run with it or delay for a long time. Once that has been worked through, we should see a lot better results and much higher clock speeds coming out of AMD parts.
 
I have a feeling that they based early estimates off Samsung rather than GF, and paid the price for it so to speak, thinking GF's process would be at about the same stage as Samsung's.
 
Samsung introduced a new variant (can't remember what combination of L, P and/or C, H or E letters it is off the top of my head) to address concerns from those making bigger high-performance parts, but too late for AMD to have it running for the RX 480, which doesn't help either.

EDIT: Oh, it's still called LPP at Samsung - GF apparently has it unofficially named "LPP+".
 
Not sure where 32C came from, but some people have shown better results if you can keep it below 42C.

For some reason my 1070 always thermal throttles once it goes over ~80C, even if I increase the power/temp target, but with a slight voltage bump it can run at mostly ~2100MHz, with the odd drop to 202xMHz due to the design power limit.

Yeah, you can't change it on Nvidia cards very easily, even though most cards are rated for 90C. On AMD you can with a driver option, which is nice.
 
It's more than a "golden sample". Getting lucky in the silicon lottery doesn't normally yield 40-50 watts less power usage.

Samsung made some adjustments for high-performance use after GF started working with their 14nm process - looks like GF now either has that or has produced their own equivalent, unofficially badged as "LPP+".
 
It's more than a "golden sample". Getting lucky in the silicon lottery doesn't normally yield 40-50 watts less power usage.

Samsung made some adjustments for high-performance use after GF started working with their 14nm process - looks like GF now either has that or has produced their own equivalent, unofficially badged as "LPP+".

Yeah, but isn't that readout from JayZ's video the GPU die power only, i.e. no memory/memory controller?

So a GTX 1080 reading the same thing would say ~120W.

Same reason why AMD has occasionally quoted the Polaris 10 power consumption as 110W, as opposed to the 'typical board power' being 150W.
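To put rough numbers on that distinction: the split below between die, memory and VRM/other losses is purely an illustrative assumption, not a measured breakdown, but it shows how a die-only readout in the 110-120W range lines up with the 150W/180W board figures:

```python
# Back-of-the-envelope sketch of GPU-die power vs total board power.
# The memory and VRM/misc figures are illustrative assumptions, not measurements;
# the point is only that a die-only readout always sits well below board power.
def board_power(die_w, memory_w, vrm_and_misc_w):
    """Board power = GPU die + memory/controller + VRM losses, fan, etc."""
    return die_w + memory_w + vrm_and_misc_w

# RX 480: AMD's ~110W "GPU" figure vs the 150W typical board power.
rx480 = board_power(die_w=110, memory_w=25, vrm_and_misc_w=15)

# GTX 1080: a ~120W die-only readout vs the 180W board TDP.
gtx1080 = board_power(die_w=120, memory_w=35, vrm_and_misc_w=25)

print(f"RX 480   estimate: {rx480} W (typical board power 150 W)")
print(f"GTX 1080 estimate: {gtx1080} W (board TDP 180 W)")
```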
 
I believe this is a misunderstanding of Afterburner's readout.

As far as I know that number is the Watts drawn by the GPU die only, so no memory/memory controller power draw.

AMD have also quoted the RX 480 as drawing 110W a couple of times in a slightly cheaty way, because that's what the GPU itself draws.

The point is this 480 is pulling 40 Watts less than others, even with a mild factory overclock.
 
The point is this 480 is pulling 40 Watts less than others, even with a mild factory overclock.

Is it though? I cannot find a Jayz video where he's tracking the same reading on a reference (or any other) RX 480. So there's no identical comparison.

What I'm saying is that if I'm interpreting that reading correctly, then 100-110W is what you'd expect from a normal RX 480. So that XFX card isn't consuming less.
 