Duff-Man;17866744 said:
> This should bring better "performance per stream processor" on average.
> If the 1920SP Cayman part uses less than 150W, under normal gaming situations at 100% GPU load (so 2560res in a decent game, but no need for Furmark), then I will buy one for you.
Agree with everything except the "will not improve power per shader" part, which is a weird claim: it won't inherently improve it, but it won't necessarily fail to either.
We don't know the design, or the transistors chosen for it; it could easily use less power per shader, or more, it's almost impossible to say. There's a choice of a few different transistor types in any process. The "5th" transcendental shader in the 5D setup might, simply due to spacing or design or something, require a higher-leakage transistor, or it might not. The 4D cluster could equally require the use of a leakier transistor.
The 4D cluster should use less power than a 5D cluster overall, purely because if it's saving 10% die area you're almost certainly using less power. But if a 5D cluster uses 1W, each shader uses 0.2W, while a 4D cluster using 0.9W would use 0.225W each, so per shader we really don't know. Better design, fewer vias, less power for the interconnects.
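The arithmetic above can be sketched quickly; the wattages are the post's own illustrative guesses, not real measurements:

```python
# Hypothetical numbers from the post: a 5-wide (5D) VLIW cluster drawing
# 1.0 W vs a 4-wide (4D) cluster drawing 0.9 W after the ~10% area saving.
shaders_5d, power_5d = 5, 1.0
shaders_4d, power_4d = 4, 0.9

watts_per_shader_5d = power_5d / shaders_5d  # 1.0 / 5 = 0.2 W per shader
watts_per_shader_4d = power_4d / shaders_4d  # 0.9 / 4 = 0.225 W per shader

print(f"5D: {watts_per_shader_5d:.3f} W/shader")  # 5D: 0.200 W/shader
print(f"4D: {watts_per_shader_4d:.3f} W/shader")  # 4D: 0.225 W/shader

# Power per shader goes UP (0.225 > 0.2) even though the whole cluster
# draws less power; if the 4 shaders are also kept busier than 5 would be,
# performance per watt can still improve.
```

This is the whole point of the "we really don't know" caveat: cluster-level power and per-shader power can move in opposite directions.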
In reality I'd expect power per shader to stay fairly static, but performance per watt will go up quite a bit. I'd expect a Barts with the same performance as Cayman to use more power, and it would also use WAY more than 150W.
As for the last bit, I think I'm going to have to just insist the 6970 will use 150W, just in case AMD pull a bait and switch and it actually ends up on the 28nm process, because then you'll have to buy me one too?
Jigger and Psychas are mad; Raven's made no bold claims and he's not trolling. The idea that the 6970 will use 150W is madness: you would literally need the 6970 to be a 28nm core, with GloFo/AMD pulling off the most mindblowing, secretive and brilliant launch of any tech ever. I'm going to give that a 0.0002% chance of happening.