
*** The Official ATI Radeon HD 6970 & 6950 Reviews, Discussion and Overclocking Thread ***

Based on the results of their own review, the HD6970 does "blow the GTX570 out of the water". Whether the review is correct or not depends on how the results are interpreted. They are not wrong (the FPS is literally what was achieved), but the tests "could" be constructed in such a way as to favour one card over the other. Unfortunately this is pure conjecture.

The point about the GTX580s is a perfectly valid one. For a site that is supposedly biased towards AMD, doesn't it strike you as slightly odd that the owner runs two Nvidia cards? He would get the HD69xx free as well, so why isn't he running them if they are so biased?

One quote on the site is not enough to start calling them biased, especially when that quote is putting into words what the results showed.

Our concern was that AMD would shoot themselves in the foot by pricing the Radeon HD 6970 in particular at too high a price. If we take a straight average at 1920x1200 and 2560x1600, its performance is more or less equal to the GeForce GTX 570.

Anandtech


While the HD 6970 2GB is undoubtedly a good graphics card for around £300, unless you really want EyeFinity support, the GTX 570 is a better bet. The GTX 570 1.3GB handles AA significantly better in the majority of games, and for high-end graphics cards, this matters, and it's also cheaper.

Bit-tech

Unfortunately AMD's Radeon HD 6970 can not meet that price target. With performance comparable to GeForce GTX 570, but a price that is $50 higher it is difficult to justify the investment.

Techpowerup

The upshot of an experimental design and difficulties with ramping up speed on a 2.64bn-transistor die means that the Radeon HD 6970 is between 15-20 per cent faster than the Radeon HD 5870 card it effectively replaces in the AMD performance stack. Tellingly, it's about the same speed as a GeForce GTX 570 and, no matter which way you cut it, slower than a GTX 580.

Hexus

Looking at it this way, the HD 6970 blows the GTX 570 out of the water for a few bucks.


OCP biased they be.

@ gerard, I thought the FT02 had decent cable management.:D
 
Ocp:-
"For the next review we're not going to test these cards at 1080p, we're going for resolutions such as the old favourite 3280×2048 WQSXGA because this is where ppl play games at......."
 
I've been benching an overclocked 6970 at 949/5896 (using 1.24V via ASUS voltage tweak) but the performance gain is very slight compared to stock performance. So if the GTX570 is on par with a stock HD6970, when both are overclocked, I can see Nvidia taking the lead easily.

The only way I can explain the small performance increase when overclocking is PowerTune kicking in and limiting the clock speed. Whilst voltage tweaking helps improve headroom, it also increases power consumption closer to the PowerTune TDP, rendering it somewhat useless... I remain somewhat unconvinced with the 69XX series.
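The PowerTune throttling suspected above can be sketched as a toy model. All wattage numbers here are made up for illustration, and the linear scale-back rule is an assumption; real PowerTune uses on-die activity estimation, not a simple formula.

```python
def effective_clock(requested_mhz, est_power_w, tdp_cap_w):
    """Toy model of PowerTune: if estimated board power exceeds the
    cap, the core clock is scaled back proportionally; otherwise the
    requested clock is honoured."""
    if est_power_w <= tdp_cap_w:
        return requested_mhz
    return requested_mhz * (tdp_cap_w / est_power_w)

# Hypothetical numbers: a 949MHz overclock at raised voltage that
# pushes estimated power 10% past the cap ends up clocked near the
# 880MHz stock speed, which would explain a near-zero benchmark gain.
print(effective_clock(949, 275, 250))  # ~862 MHz
```

This is why adding voltage can be self-defeating under PowerTune: it raises the headroom for the clock but also the estimated power, which is what the limiter acts on.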
 
I read the Hardforums a lot, but this is the first time I have ever seen anyone accusing them of being biased towards AMD. From reading the forums for years, it was always seen that HardOCP favoured Nvidia over AMD. Just go back to June, when they did a review of 460 SLI versus 5870 CrossFire using the 10.6 drivers, and the 460s won. Everyone who knows anything about hardware knew that the 10.6 drivers broke CrossFire scaling, yet Kyle refused to redo the review with the 10.5 drivers that would have had the 5870 CrossFire winning. AMD fanboys were all up in arms over that.

And now some of you claim he is biased towards AMD? lol, what a joke. Back when Nvidia's 4xx series was released, they were one of the few sites that recommended it, gave a Silver award if I remember correctly, and then followed that up a month or two later when optimised drivers came out for the 4xx series that really improved performance. HardOCP stated at the time that 480 SLI gave the best gaming experience bar none. Yeah, he really sounds AMD biased there alright.

And he is using Nvidia cards in his own machine? AMD biased? Just because he gives a review that shows the AMD cards doing well? lol, whatever guys.

And just one last thing: just like Nvidia's 4xx series was released without proper drivers, the 69xx series is still waiting on drivers that fully support it.
 
I've been benching an overclocked 6970 at 949/5896 (using 1.24V via ASUS voltage tweak) but the performance gain is very slight compared to stock performance. So if the GTX570 is on par with a stock HD6970, when both are overclocked, I can see Nvidia taking the lead easily.

The only way I can explain the small performance increase when overclocking is PowerTune kicking in and limiting the clock speed. Whilst voltage tweaking helps improve headroom, it also increases power consumption closer to the PowerTune TDP, rendering it somewhat useless... I remain somewhat unconvinced with the 69XX series.

Well I can see them squeezing a good bit more performance out of the drivers, as they are in their infancy.

So far I've managed to get the card to hit 88°C in the Heaven benchmark, but the fan wasn't as obtrusive as people are trying to make out. That being said, AMD really should have done a better job with the cooler; it's not bad, but it's been quite a while since I had a single-GPU AMD card that almost cracked the 90°C barrier. The last one was the X1900XTX.

Performance in games so far seems notably better than my quad-SLI setup. I've seen Bad Company 2 in MP at 1080p hit over 120fps on occasion. In Eyefinity at 5760x1080 with effects and textures at medium I see 50-70fps. And in that game, setting textures to medium really makes no difference, as the high-res textures aren't noticeably any better than medium IMO. Pretty sure I had some FSAA going on as well.

The one dog I've tried so far was Black Ops: at 5760x1080 I was getting around 30-40fps in MP. Not sure what the story is there; I might have had something enabled that was holding it back, it's been a while since I used the AMD control panel.

Considering a second card for CrossFire, as I'm a whore for cranking things as high as I can get away with. So far though I think it's a solid card that has a good deal of potential if more performance can be unlocked via driver updates.
 
I've been benching an overclocked 6970 at 949/5896 (using 1.24V via ASUS voltage tweak) but the performance gain is very slight compared to stock performance. So if the GTX570 is on par with a stock HD6970, when both are overclocked, I can see Nvidia taking the lead easily.

The only way I can explain the small performance increase when overclocking is PowerTune kicking in and limiting the clock speed. Whilst voltage tweaking helps improve headroom, it also increases power consumption closer to the PowerTune TDP, rendering it somewhat useless... I remain somewhat unconvinced with the 69XX series.

Have you turned PowerTune up? Bench with it at default, then turn it up to validate your scores.

I got a 7% boost in 3DMark 11 by running at 950/1375 (no extra voltage needed), so for me that seems about right.
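As a rough sanity check on that 7% figure: the reference HD 6970 core clock is 880MHz, so 950MHz is about an 8% overclock, and near-linear scaling would predict a score gain in exactly that ballpark. A quick sketch (the 880MHz stock clock is the only assumed value):

```python
stock_core = 880   # MHz, reference HD 6970 core clock
oc_core = 950      # MHz, overclocked

# Percentage increase in core clock from the overclock.
clock_gain = (oc_core - stock_core) / stock_core * 100
print(f"clock gain: {clock_gain:.1f}%")  # 8.0%

# A ~7% 3DMark 11 boost from an ~8% core overclock is close to
# linear scaling, i.e. no sign of PowerTune throttling in this case.
```

If PowerTune were clamping the clock, the observed gain would fall well short of the clock increase, as reported for the voltage-tweaked run above.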
 
Ocp:-
"For the next review we're not going to test these cards at 1080p, we're going for resolutions such as the old favourite 3280×2048 WQSXGA because this is where ppl play games at......."

It's how OCP have always done it, to cater to the community there.

If you don't like it, uhm, don't read it.

There are loads of reviews of the 69** on the net; some like them, some don't. Pick which one you want to believe and go with that.

It does appear you already have though, as your constant waffle and repeating of any statements you can find on the net that suit your train of thought gives it away a touch.
 
It's very easy to see how they (OCP) came to the conclusion given the higher resolution they use. The results seem perfectly valid to me, given the resolution. No need to **** them off, just kindly point out to people that the results are for the higher resolution and that it's more even at lower values. No need for drama :-)
 
^^^
The problem arises when review sites are biased and deliberately bump the resolution to suit a 2GB VRAM card.
Hence any review done at a single resolution is pretty pointless.
 
I was under the impression that OCP always did high res and ignored the lower values? (I don't read HardOCP). I completely agree that a single res bench is pointless. It's like testing a sports car only on the motorway, and completely ignoring the smaller roads even though the majority use them.
 
He benches one res and concludes the 6970 blows the 570 out of the water; it's misleading. Read Anandtech's review for an unbiased opinion, and yeah, they bench a number of resolutions, including the one res OCP benches at.
 
It's pretty clear in all the reviews that once the resolution is upped beyond 1920x1200, the gap between the 580 and the 6970 closes. IIRC the same happens with the 570 and the 6950.

I've not seen anything to suggest that the 580's 1.5GB framebuffer is a limit even at 2560x1600, unless somebody can point me at a review which shows otherwise? (Now, the 570's 1280MB might be a different story...)

For people who game at 2560x1600+ (I'm not one of them!), the 6970 is a compelling product since a) it's in stock; b) it's close to the 580 in performance; c) it has more VRAM; d) it's 150 quid less expensive.
 
OCP's benchmark is valid, but only for the ~1% of gamers who game above 1920x1200, where a massive framebuffer becomes useful.

Running at 1920x1200 and using Afterburner for measurement, I have seen my 580s using over 1300MB of VRAM. I can well imagine that 2560x1600 will use more than 1536MB and may even get close to 2GB.
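As a back-of-envelope check on that guess: 2560x1600 has about 1.78x the pixels of 1920x1200. Scaling the observed 1300MB by pixel count is an assumption (textures dominate VRAM use and don't grow with resolution, so this is an upper bound, not a prediction), but it shows why blowing past 1.5GB at 2560x1600 is at least plausible:

```python
# Pixel counts at the two resolutions discussed above.
low = 1920 * 1200    # 2,304,000 pixels
high = 2560 * 1600   # 4,096,000 pixels

observed_mb = 1300   # Afterburner reading at 1920x1200 (from the post)

# Naive assumption: VRAM use proportional to pixel count.
ratio = high / low
print(f"pixel ratio: {ratio:.2f}, scaled estimate: {observed_mb * ratio:.0f} MB")
```

Only the resolution-dependent buffers (framebuffer, AA samples, render targets) actually scale this way, so the real figure would land somewhere between 1300MB and the estimate.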
 
Although I agree that OCP should have tested the cards at varying resolutions, in the near future we will expect higher resolutions anyway.

It might be ~1% of gamers who game above 1920x1200 now, but in a couple of years there will be more of us.
 
Although I agree that OCP should have tested the cards at varying resolutions, in the near future we will expect higher resolutions anyway.

It might be ~1% of gamers who game above 1920x1200 now, but in a couple of years there will be more of us.

But it's now, not in a couple of years. The high-end card will be the low end by that point (or EOL).
 