
ATI cuts 6950 allocation

[Image: 6970 expose.jpg]


But again, a fake slide :(
 
Depends on the game. I've seen it 30% faster in one title and 40% faster in a benchmark; overall it looks to be anything from 15-30% faster on average.
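(Purely illustrative, with made-up numbers: a quick sketch of how a handful of per-title results gets rolled into one "X% faster on average" figure, here using a geometric mean of the speedup ratios.)

```python
from math import prod

# Hypothetical per-title speedups (1.30 = 30% faster); not real benchmark data.
speedups = [1.30, 1.40, 1.18, 1.22, 1.15]

# A geometric mean is the usual way to collapse per-game ratios into one number.
overall = prod(speedups) ** (1 / len(speedups))
print(f"overall: {overall:.2f}x (~{(overall - 1) * 100:.0f}% faster)")
```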
 
Apparently so.

Here's to AMD bringing out a 580 killer and bringing some normality back to high-end pricing.

There was one screenshot shown, of what I think was a supplier/distributor specs page. Lots of things were blanked out, but the memory had more bits blanked out on the 6950 specs, which leads me to think the 6950 will have a 1GB model and the 6970 will be 2GB. Will there be a 2GB 6950 model at some point? Probably; at launch, maybe not. We'll have to see.

Thing is, at the moment the only place 2GB shows any real difference is Eyefinity, and even then it rarely, if ever, makes a real difference, purely because on the few occasions it does, performance is so pathetic at a setting that pushes that amount of memory that it's a 5fps average versus a 15fps average. Neither is playable, and both give, say, a 45fps average at the next resolution down.

If the 6950 isn't too far cut down and comes in noticeably cheaper, then I think that's the card for me. 2GB is for e-peen; OK, those who actually have three screens and play games across them, fair play, but for everyone else it's worthless.

I didn't quote another of your posts, but you said "refresh done right".

Just to be accurate: Nvidia = refresh, and a pee-poor one at that. If the 480GTX hadn't been underclocked and missing shaders due to their own incompetence, sure, it would have been another 15% faster, but then the 580GTX simply wouldn't exist.

AMD = a new generation, a proper one, and well, it's better than I thought (seemingly) by 20-30%, but a large amount of that is down to some fairly bad inefficiency in the "last gen" 5870.

If they'd had "Barts" efficiency and a 336mm² Barts-style 1600-shader part to start with, it would have been pretty awesome as it was (probably 15% under the 580GTX), and with 20% more shaders now plus better shaders, you get the 35-40% boost you would expect.
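(Rough back-of-envelope sums, with the per-shader gain being an assumption rather than anything from the leaks: 20% more shaders times a modest per-shader improvement lands in that 35-40% ballpark.)

```python
# The 20% shader-count increase comes from the post above;
# the per-shader throughput gain is an assumed figure for illustration only.
shader_count_gain = 1.20   # 20% more shaders
per_shader_gain = 1.15     # assumed ~15% better throughput per shader

combined = shader_count_gain * per_shader_gain
print(f"combined uplift: ~{(combined - 1) * 100:.0f}%")  # prints ~38%
```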
 
^^^ So it's 15% quicker in AMD's own picked tests.

Not bad, we will see when true results come out.

Could the graph be wrong?

Yeah, it's fake, but if there's even a hint of accuracy in any of the results, it's not AMD's cherry-picked best results: LP2, HAWX and Dirt 2 are three of Nvidia's biggest wins.

Can't remember who's usually faster in the other benchmarks, to be honest, and I'm not sure the rest are all "AMD victories" either; some are, some aren't, I think.

Fake or not, calling them AMD-biased numbers with LP2 in there (go look up the LP2 numbers on Nvidia's slides, where they were showing it some 60% faster than a 5870, or something nuts) is just madness.
 

They are hand-picked to show the best scaling. We don't know the resolutions, and the lack of AA and AF is concerning.

The only thing that's consistent and semi-measurable is Vantage, which, let's face it, is 0.001% by the graph. If that's true, this card will not set the world alight or be even close to the legendary R300.
 

Vantage is hardly representative of a card's performance though. 3DMark numbers lost all credibility a long time ago.
 

Damn DM, I was really looking forward to getting my Eyefinity setup. I game at 1080, so this will still be poor performance even with the 2GB.
 
When is the earliest we can expect to see actual benchmarks? Will some reviewers not have got their hands on these prior to the Wednesday release?
 

People have them now; a guy on another forum has a pair of cards and is teasing the life out of everyone.

ATI will have to announce the cards and give the all-clear before we get the official benchmarks.
 
So RavenXXX2, you still think the 6970 will beat the 580 by 30 to 40 percent??? I hope you realize that you may have influenced a lot of people, and a lot of money, because of what you said. I hope it was for the better (6970 > 580), but more than likely it was for the worse :(

What has Gibbo said that makes you think it isn't? He said they are going to be competitively priced on OcUK on launch day and then go up in price.

Last time that happened it was because ATI had a monster on their hands.
 
I hope they haven't set the stock clocks too high to get this predicted performance. I like to be able to clock the nuts off 'em and get a load more. :D
 