
ATI cuts 6950 allocation

Here's the link to the actual 3DMark scores the German guy posted, the ones everyone is pointing to when they say the 6970 sucks: http://3dmark.com/3dm11/136857

First of all, the guy is using old drivers (version 8.790.6), whereas the new production drivers included with the yet-to-be-released Catalyst 10.12 are version 8.801.

Second, it says at the top:
Graphics card is not recognized, Graphics driver is not approved.
And then if you scroll down into the graphics card section, it says Generic VGA, with a core clock of 250 MHz, a memory clock of 150 MHz and 1 MB of memory. Come on, 3DMark wasn't even benching it at its full clock speed of 880 MHz. (E.g. look at this Radeon HD 5870 result for reference and see that its clock frequency goes up when benching: http://3dmark.com/3dm11/136856)
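The tell-tale signs described above (Generic VGA, idle-level clocks, a bogus 1 MB memory readout) are easy to check for mechanically. Here's a minimal sketch in Python; the dictionary keys and the idle-clock thresholds are illustrative assumptions, not 3DMark's actual result schema.

```python
# Hypothetical sanity check for a parsed benchmark result.
# Field names and thresholds are assumptions for illustration only.

IDLE_2D_CORE_MHZ = 250   # typical AMD 2D/power-saving core clock
IDLE_2D_MEM_MHZ = 150    # typical AMD 2D memory clock

def looks_misdetected(result: dict) -> bool:
    """Return True if the result shows signs of a bad/unapproved driver."""
    if result.get("gpu_name") == "Generic VGA":
        return True                      # card not recognized at all
    if result.get("memory_mb", 0) <= 1:  # 1 MB reported = bogus readout
        return True
    # Clocks stuck at (or below) 2D idle values during a bench run
    return (result.get("core_mhz", 0) <= IDLE_2D_CORE_MHZ
            and result.get("mem_mhz", 0) <= IDLE_2D_MEM_MHZ)

leaked = {"gpu_name": "Generic VGA", "core_mhz": 250,
          "mem_mhz": 150, "memory_mb": 1}
reference = {"gpu_name": "Radeon HD 5870", "core_mhz": 850,
             "mem_mhz": 1200, "memory_mb": 1024}
```

By this check, the leaked 6970 result trips every flag, while the 5870 reference result passes cleanly.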

And that's probably because the German guy didn't have the latest drivers, which correctly change the clock frequency for the 6900 series.

Leaks of the Catalyst 10.12 betas (which add support for the 6900 series and unlock its full power) are out on Megaupload. Doesn't it seem suspicious that the German guy won't use them, and that benchmarkreviews.net wipes out all comments containing links to them?
 
First of all, the guy is using old drivers (version 8.790.6), whereas the new production drivers included with the yet-to-be-released Catalyst 10.12 are version 8.801.
I'd love it if the drivers made a large difference, but I doubt they will.



Second, it says at the top:
And then if you scroll down into the graphics card section, it says Generic VGA, with a core clock of 250 MHz, a memory clock of 150 MHz and 1 MB of memory. Come on, 3DMark wasn't even benching it at its full clock speed of 880 MHz.
That's because, at the time of posting the results, the card is back to 2D clocks.
 
That's because, at the time of posting the results, the card is back to 2D clocks.
When the GPU load increases (like when you're playing a game or running a benchmark), the operating system detects that and tells the graphics driver to raise the clock frequency. If I'm not mistaken, 3DMark always posts the maximum clock frequency the driver allows, because it's benching the GPU and thus putting a load on it; it reads the core and memory frequency while it's benching. If it can't read the clock frequency from the driver at all, it reports 0 MHz (e.g. http://3dmark.com/3dm11/136849).
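The clock-scaling behaviour described above can be sketched as a tiny toy model. The load threshold and the 6970's 3D memory clock here are illustrative assumptions (only the 880 MHz core clock is from the thread), and this is nothing like the real PowerPlay algorithm; it just illustrates the three cases being argued about.

```python
# Toy model of driver clock scaling (DVFS). Thresholds and the memory
# clock are assumptions; real AMD PowerPlay logic differs.

CLOCKS_2D = (250, 150)    # (core MHz, mem MHz) at idle / desktop
CLOCKS_3D = (880, 1375)   # assumed full 3D clocks for a 6970

def reported_clocks(gpu_load_pct, driver_scales=True, driver_readable=True):
    """What a monitoring tool would read back while benching."""
    if not driver_readable:
        return (0, 0)                # clocks can't be queried: 0 MHz
    if gpu_load_pct > 20 and driver_scales:
        return CLOCKS_3D             # proper driver raises clocks under load
    return CLOCKS_2D                 # idle, or a driver without 6900 support
```

Under load with a working driver you see the full 3D clocks; with a driver that doesn't support the card you sit at 2D clocks even while benching, which is exactly what the leaked result shows.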

It's more likely that, because the German guy didn't have proper drivers, the clock frequency didn't scale up from its default power-saving frequency. Current drivers probably don't work correctly with the 6900 series yet.

IIRC, the same thing happened with the Radeon HD 4850, where only the reviewers got the correct drivers.
 
How about this 5870 3DMark11 result, though? It again only shows 2D clocks... it probably is driver related, but since that 5870 result was as expected, those clock speeds can't actually have been used during the test.

http://3dmark.com/3dm11/133302

Yeah, and look at its FPS in the graphics tests (30-60 FPS) compared to the German guy's 6970 benchmarks, which show less than 10 FPS. There's no way that can be right unless the driver is not scaling up the 6970's clock frequency. It's pretty easy to see something is wrong with the driver (the German guy is not using the right drivers).
 
Good stuff... just realised, though, that the one I posted was three 5870s in CrossFire,

so we can't compare the FPS in the graphics tests.

It does go to show the driver can't always read the right memory/GPU speeds, though.
 
 
I'd be interested in hearing gibbo's opinion on these results; they certainly don't match what he was saying earlier on.
 
I think it's all fishy. Could Nvidia have had someone make all this up?
The Nvidia icons always on display seem fishy to me.

I'd be interested in hearing gibbo's opinion on these results; they certainly don't match what he was saying earlier on.

Unless they are going to be less than £275.
 
I've read that the 480 is undervolted to 0.963 V. What's the default voltage of the 480?


1.025 V is the average voltage for a stock 480. I don't think the undervolting he has done will have a major impact on power draw, though.
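As a rough sanity check on those two voltages: dynamic power scales roughly with the square of voltage at fixed clocks, so the expected saving is just the squared voltage ratio. This ignores leakage and everything else, so treat it as a ballpark only.

```python
# Ballpark estimate of the power saving from the undervolt.
# Dynamic power ~ V^2 at fixed clocks; leakage is ignored.

V_STOCK = 1.025   # average stock voltage quoted above
V_UNDER = 0.963   # the undervolt mentioned above

ratio = (V_UNDER / V_STOCK) ** 2
reduction_pct = (1 - ratio) * 100   # roughly 12% less dynamic power
```

About a 12% cut to the dynamic component: a modest saving rather than a dramatic one, which fits the point that the undervolt alone won't transform the card's power draw.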

@ GJ02

Best to wait for the official reviews before jumping to any conclusions.
 