ATI cuts 6950 allocation

The power thing doesn't quite work the way you're saying

My understanding of this is as follows:

Basically, look at this chart again:
[chart: core clock and power draw over time in the Perlin Noise benchmark]


There is a maximum TDP (power draw) limit for the card, and you can adjust the slider up or down to set that maximum.

At stock, the 6950 apparently has a maximum draw of 200W. In the Perlin Noise synthetic benchmark, the core clock adjusts up and down between 650 and 800MHz (averaging around 700MHz) to keep the power draw under that 200W maximum.

Now, when the power limit is increased 5% to 210W, the core clock can reach higher frequencies because the card is allowed to draw 10 more watts, so the average core clock shoots up to 750-775MHz. Likewise, the FPS result jumps from 140 to 155, a pretty significant jump for a synthetic benchmark.

Once the limit is increased 10% to 220W, the core clock holds at 800MHz for the whole benchmark.

To 'overclock' the card, you still have to set the MHz in CCC - so you can tune it to 850MHz, and that's the maximum clock the core will reach.

Basically it's like Turbo Boost for CPUs, but somewhat in reverse.

Basically:
TDP Slider - Sets maximum power draw of the card
Core Clock - Sets maximum core clock of the GPU

Thus the card dynamically adjusts the core clock up to that maximum, as long as the overall power draw of the card stays below the TDP slider setting.
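
To make that concrete, here's a minimal Python sketch of such a control loop. Everything in it is an assumption for illustration (the linear power model, the 0.275 W/MHz coefficient picked to roughly match the chart's numbers, the function names) - it's not AMD's actual algorithm, which runs in hardware/firmware.

Code:
# Illustrative sketch of a PowerTune-style control loop; the linear
# power model and all constants are assumptions, not AMD's algorithm.

MAX_CLOCK_MHZ = 800    # core clock set in CCC (the ceiling)
MIN_CLOCK_MHZ = 500    # floor the card can throttle down to
STEP_MHZ = 10          # clock change per control tick
WATTS_PER_MHZ = 0.275  # invented so 800MHz ~= 220W under full load

def estimate_power_w(clock_mhz, load_factor):
    """Hypothetical power model: heavy workloads (FurMark, Perlin
    Noise) have load_factor ~1.0, typical games rather less."""
    return clock_mhz * WATTS_PER_MHZ * load_factor

def next_clock(clock_mhz, tdp_limit_w, load_factor):
    """One tick: throttle down if over the cap, else climb back
    toward the user-set maximum clock."""
    if estimate_power_w(clock_mhz, load_factor) > tdp_limit_w:
        return max(MIN_CLOCK_MHZ, clock_mhz - STEP_MHZ)
    return min(MAX_CLOCK_MHZ, clock_mhz + STEP_MHZ)

def average_clock(tdp_limit_w, load_factor, ticks=500):
    clock = MAX_CLOCK_MHZ
    total = 0
    for _ in range(ticks):
        clock = next_clock(clock, tdp_limit_w, load_factor)
        total += clock
    return total / ticks

# Stock 200W cap vs. +5% and +10% under a Perlin-Noise-like load:
for cap_w in (200, 210, 220):
    print(f"{cap_w}W cap: average clock ~{average_clock(cap_w, 1.0):.0f}MHz")

# A lighter game load never hits the cap, so it stays at max clock:
print(f"game load, 200W cap: ~{average_clock(200, 0.8):.0f}MHz")

Run it and you get roughly the behaviour in the chart: the clock oscillates just under the cap at 200W, sits higher at 210W, holds the full 800MHz at 220W, and the lighter game load never throttles at all.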

That's why I don't put much stock in the synthetic benchmarks: the leaked 6970 Perlin Noise FPS figure is very low compared to the 5870, and in all likelihood it's behaving like the chart above, i.e. not consistently running at 880MHz core across the benchmark. Games might not be affected as much, but an interesting application of this would be to downclock the GPU in games that already run at 200FPS, to save power while gaming. Now that would be sweet.
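
If a reviewer logged the core clock during the run, spotting that kind of power throttling would be trivial. A hypothetical sketch (the samples are made up):

Code:
# Hypothetical check: given core-clock samples (MHz) logged during a
# benchmark, report how often the card actually held its set clock.

def throttle_report(samples_mhz, set_clock_mhz):
    at_full = sum(1 for s in samples_mhz if s >= set_clock_mhz)
    avg = sum(samples_mhz) / len(samples_mhz)
    share = 100 * at_full / len(samples_mhz)
    print(f"average {avg:.0f}MHz, at full {set_clock_mhz}MHz "
          f"for {share:.0f}% of samples")

# e.g. an 880MHz card that is power limited in Perlin Noise:
throttle_report([880, 840, 800, 780, 800, 820, 860, 880, 810, 790], 880)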
 
^^^
It's just a feature to stop people baking their 6970 in FurMark, guys.
It's not a 10% free boost button!
Same feature that the 580 and 570 have. A bit more refined, but essentially the same.

It DOESN'T mean that the benchmarks that were done are 10% below their true potential...
Because this feature probably didn't engage in games.
It's specifically aimed at unrealistic synthetic benchmarks, such as Perlin Noise / FurMark and other crud.
 
but an interesting application of this would be to downclock the GPU in games that already run at 200FPS, to save power while gaming. Now that would be sweet.

:eek::eek:That feature has been out for years.:eek::eek:

It's called vsync. You switch it on, the FPS drops to 60 (usually the TFT monitor's refresh rate) and GPU usage drops by a significant amount.

try it, you will be amazed :D
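
The principle is just a frame cap. A minimal sketch of the idea in Python (illustrative only - vsync actually syncs to the display's refresh rather than sleeping):

Code:
import time

# Minimal frame-rate cap: if the GPU renders faster than the target,
# sleep out the rest of the frame budget instead of drawing extra
# frames. Idle time is saved power.

TARGET_FPS = 60
FRAME_BUDGET_S = 1.0 / TARGET_FPS

def render_frame():
    pass  # stand-in for the actual draw calls

def game_loop(frames=600):
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET_S:
            time.sleep(FRAME_BUDGET_S - elapsed)

game_loop()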
 
^^^
It's just a feature to stop people baking their 6970 in FurMark, guys.
It's not a 10% free boost button!
Same feature that the 580 and 570 have. A bit more refined, but essentially the same.

It DOESN'T mean that the benchmarks that were done are 10% below their true potential...
Because this feature probably didn't engage in games.
It's specifically aimed at unrealistic synthetic benchmarks, such as Perlin Noise / FurMark and other crud.

Yes and no. Remember, that's one graph from a very graphically intense benchmark; how can you be sure that in something like Crysis the power draw isn't much lower, letting the card hold its overclock?

The thing is, you could overclock your card to 1000MHz, which would usually be unsafe - the card simply can't handle the heat of FurMark, 3DMark and some other things - but it won't ever pass the 250W or 270W TDP you set, and in other games it will maintain 1000MHz fine, potentially even handle more.

It's like when I ask why on earth people overclock based on the temps/heat/power/stability FurMark produces; it's stupid. You might be stuck at 900MHz on a 5870 to stay stable and keep temps safe in FurMark, but in real life, in Crysis, in Metro, in any other game, you can run 1050MHz with lower temps than FurMark at 950MHz, and at a lower voltage.

The problem is, what happens when the GPU bugs out, or some program you didn't know about puts out a FurMark-like power draw? Because you've set the overclock to a game-stable power draw/temp/voltage, suddenly your card crashes or temps get dangerous.


It should also make overclocking easier and more stable. Why? Because who hasn't found one overclock to be stable in one game and a different one to be stable in another game? Likely due to the varying power draw of games.

Meaning 1050MHz is stable in Crysis, 1100MHz in another game, 950MHz in another app, 1000MHz in yet another game. If you can set your max clock to 1100MHz and the card will dynamically scale the GPU up and down according to the power being drawn, maintaining stability across the board, then you only need to find one stable overclock and you should never have to downclock it for a tougher game.
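
To put toy numbers on that (all invented): if power scales roughly with clock and each workload draws a different amount, one max clock plus one TDP cap gives every app its own sustained clock, with only the heavy stuff throttled.

Code:
TDP_CAP_W = 250
MAX_CLOCK_MHZ = 1100  # the one overclock you set and validate

# Invented per-workload power coefficients (watts per MHz at full load):
watts_per_mhz = {
    "Crysis": 0.23,
    "Metro": 0.21,
    "FurMark": 0.27,
    "3DMark": 0.25,
}

for app, coeff in watts_per_mhz.items():
    # the cap throttles the clock only when the workload would exceed it
    sustained = min(MAX_CLOCK_MHZ, TDP_CAP_W / coeff)
    print(f"{app}: ~{sustained:.0f}MHz under a {TDP_CAP_W}W cap")

Under those made-up numbers FurMark gets pulled back to ~926MHz while Metro holds the full 1100MHz - one overclock, no per-game downclocking.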
 
:eek::eek:That feature has been out for years.:eek::eek:

It's called vsync. You switch it on, the FPS drops to 60 (usually the TFT monitor's refresh rate) and GPU usage drops by a significant amount.

try it, you will be amazed :D

Yes, but vsync has its own internal issues, and triple buffering takes GPU power.

http://www.anandtech.com/show/2794/2

Doing it without needing that stuff would be nice, especially if they put in application profiles.
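
A rough sketch of what per-application profiles could look like (all names hypothetical - a real driver would key on the executable and hook the swap chain, not run a Python loop):

Code:
import time

# Hypothetical per-application frame-cap profiles.
FPS_PROFILES = {
    "crysis.exe": None,  # demanding game: no cap, run flat out
    "peggle.exe": 60,    # light game: cap at 60 to save power
}
DEFAULT_CAP_FPS = 120

def frame_budget_s(exe_name):
    cap = FPS_PROFILES.get(exe_name, DEFAULT_CAP_FPS)
    return None if cap is None else 1.0 / cap

def present(exe_name, frame_start_s):
    """Sleep out the rest of the frame budget; idle time = saved power."""
    budget = frame_budget_s(exe_name)
    if budget is not None:
        elapsed = time.perf_counter() - frame_start_s
        if elapsed < budget:
            time.sleep(budget - elapsed)

# e.g. inside the frame loop:
start = time.perf_counter()
# ... render the frame ...
present("peggle.exe", start)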
 
What does it do? Easy enough, it swaps the shader count. The 'normal' count is 1920 for a 205W 5970 card, and the switch opens up the full force of all 2520 shaders. The reason this switch is set to 'low' is that between the clock speed advances, 730MHz/1920 shaders to 920MHz/2520 shaders, it blows the card through the 300W cap for PCIe cards.


Why's he call it a 5970?
 
Mate, he's joking, it's a windup. Some people will believe anything...

Don't tell me you seriously believed that rubbish?

What a joke... talk about gullible.

Way to try and point out that someone is being stupid, only to make yourselves look incredibly dense.

Of course drunkenmaster knows that it is a joke article :rolleyes:
 