
Sapphire HD7970

Soldato
Joined
24 Dec 2004
Posts
18,880
Location
Telford
One of my mates picked his up yesterday from his usual source. I had a quick look and it's certainly a quick card, easily matching and improving on his previous CrossFire 6950s, but my God it's loud. Running BF3 the card is loud enough to be annoying; run FurMark and my Hoover is quieter. It's lovely and silent in Windows, but push it and it is a very noisy card.
 
Soldato
Joined
24 Dec 2004
Posts
18,880
Location
Telford
Moogleys, is that noisy card he's using a reference model, and is it running standard clocks + voltage with auto fan control?

It was running stock as far as I'm aware. It's identical to the one above, although I'm not sure what brand, if that makes a difference.
I have tried to get more info from him, but he is hard to work with and, tbh, a pain for any info.
I was literally only there 10 mins as the wife insisted I go say hello; he is weird to say the least... :)
 
Soldato
Joined
7 May 2006
Posts
12,192
Location
London, Ealing
Yup, exactly, it's always been that way for me, though I'd go one step further: I laughed at people back when FEAR 1 came out and they insisted on killing their performance and their experience by turning soft shadows on. Good attempt, they tried to move graphics forward, but it looked awful, less realistic, and killed performance.

So on top of what you said, I won't use settings beyond where I can see the difference, and I won't randomly turn on every single "ultra" setting just because it's deemed the top setting when in reality it is actually reducing IQ. Quite a lot of games have top-end settings that reduce IQ while also reducing performance.

Motion blur in many games is a joke, and DOF in many games just looks unrealistic, though some games do DOF very well.

Indeed, I could have gone on to include more, as I tend to do the same with motion blur, DOF and bloom; I've not found a game where I like them on yet. Some shadow settings and techniques like HBAO I don't like; SSAO is sometimes OK. Some post-processing also gets turned off when possible if it's not to my liking, so really there are few games that I truly max out.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
Indeed, I could have gone on to include more, as I tend to do the same with motion blur, DOF and bloom; I've not found a game where I like them on yet. Some shadow settings and techniques like HBAO I don't like; SSAO is sometimes OK. Some post-processing also gets turned off when possible if it's not to my liking, so really there are few games that I truly max out.

Yup, same. I find it hilarious when someone goes on about it: say, when Metro was out, Nvidia guys were talking about their lovely 30fps average with dips only into the low 20s, while I was playing with a 40-50 average and dips into the 30s, with tessellation and DOF disabled, and for me better IQ. Tessellation WAS an improvement in that game; DOF for me was awful. The problem being tessellation was almost completely unnoticeable, while DOF was extremely noticeable and utterly destroyed performance.

I'm all for the highest IQ; I'm not for arbitrarily running "max" settings just because some game menu calls it a "higher" setting and it therefore must look better.

I can really only point to the latest COD and the TOR beta, where selecting the "highest" settings in both games actually set much lower settings due to bugs.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
It was running stock as far as I'm aware. It's identical to the one above, although I'm not sure what brand, if that makes a difference.
I have tried to get more info from him, but he is hard to work with and, tbh, a pain for any info.
I was literally only there 10 mins as the wife insisted I go say hello; he is weird to say the least... :)

In a lot of cases, for both AMD and Nvidia, default card settings have to cope with someone who puts their cards in a stupid xfire setup, or those numpties daft enough to put one in something like a shuttle-type case, i.e. default fan control is usually WAY too aggressive.

I'm not saying the card isn't loud and doesn't need to be, just that over the past 10 years of buying cards there hasn't been a single reference or non-reference heatsink card from either company that I haven't found could be made vastly quieter, with almost identical temps, simply by setting the fan to lower speeds myself. I'd guess the same is true for the 7970: you can probably turn the fan speed down dramatically and get a huge change in noise but very little change in temp.

I still don't know why AMD/Nvidia are so aggressive; ramp up the fan speed at genuinely dangerous temps, not stupidly low ones. And god-damned blower fans are useless, flat out; they should only be the default on an edition branded for something like tri/quad xfire (SLI for Nvidia cards). For single/dual-card users the standard heatsink should be all but silent and not a blower design, and for the 0.01% of cards they sell that end up in trifire or above, those buyers can get a stupidly loud blower edition.
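For what it's worth, the gentler manual fan curve being described is just a temperature-to-speed mapping with linear interpolation between points. The curve points below are purely hypothetical, but this is roughly the kind of thing tools like MSI Afterburner let you define:

```python
# Illustrative sketch only: a custom fan curve as (temperature in C, fan %).
# These points are hypothetical, not recommendations for any real card.
CURVE = [(40, 20), (60, 30), (75, 45), (85, 70), (95, 100)]

def fan_speed(temp_c: float) -> float:
    """Linearly interpolate the fan % between curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]  # floor: quiet idle speed
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return s0 + frac * (s1 - s0)
    return CURVE[-1][1]  # pin at max past the last point

print(fan_speed(67.5))  # 37.5 (halfway between the 60C and 75C points)
```

The idea matches the post: keep the fan slow through normal load temps and only ramp hard near genuinely dangerous temperatures.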
 
Associate
OP
Joined
7 Jan 2011
Posts
146
metroset.jpg


default
metrodef.jpg


1125/1575
metrotez.jpg
 
Associate
Joined
5 Nov 2011
Posts
403
It gets more than double the 3DMark of my 6870 (P4460) - what CPU/RAM do you have?

I don't think I could justify paying £485 for it personally. I mean, I only paid £120 for a card with half the performance.

Roll on HD8000 series! lol
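As a rough sanity check on the value argument above, taking the quoted P4460 score and prices at face value, and assuming "more than double" means at least 2x as a lower bound (my assumption):

```python
# Back-of-envelope value-for-money sums behind the post above.
# Quoted: HD 6870 scores 3DMark P4460 and cost 120 GBP; the HD 7970
# is "more than double", so assume 2 * 4460 as a lower bound at 485 GBP.
hd6870 = {"score": 4460, "price": 120}
hd7970 = {"score": 2 * 4460, "price": 485}  # assumed lower bound

def points_per_pound(card: dict) -> float:
    return card["score"] / card["price"]

print(round(points_per_pound(hd6870), 1))  # 37.2 points per GBP
print(round(points_per_pound(hd7970), 1))  # 18.4 points per GBP
```

Even granting the 7970 more than double the score, the 6870 delivers roughly twice the 3DMark points per pound, which is the trade-off the poster is pointing at.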
 
Associate
OP
Joined
7 Jan 2011
Posts
146
Here is the first Unigine run.

uniginez.jpg


And here it is with MSI Afterburner and default core voltage.

ab1jv.jpg


Tomorrow I will play with the Asus BIOS that unlocks core voltage.
 
Soldato
Joined
13 Aug 2008
Posts
7,069
What if we start seeing a visible difference? 7970 owners are going to be left with either reduced IQ or GTX 560 Ti/570-level performance.

For a £450 card that's unacceptable imo; AMD/ATI have historically been the cheaper option, so trade-offs like that were more acceptable.

You won't; there's only so much difference slightly rounder polygons can make.
 
Caporegime
Joined
26 Dec 2003
Posts
25,666
I have seen that guy commenting on many review articles. He really seems to have a dislike for AMD for some reason.

It does not change the fact that AMD tessellation is still subpar at higher levels. Like he says, he is a developer, but as a gamer it bugs me that if I spend £450 upgrading my GTX 580 to a 7970, I'll likely at some point have to play around reducing tessellation levels from those intended by the game developers, when AMD could have just spent the past 18 months fixing their tessellation performance instead of whinging about games using too much.

Back in the early noughties, when a GPU maker cut corners in rendering (even if there was no visible difference), there was a massive uproar because it was seen as cheating, and yet today it seems to be accepted practice. It's strange how as GPU power has increased our standards have decreased. I guess there are just new generations of non-technical PC gamers coming through who care only about FRAPS numbers.
 
Soldato
Joined
9 Nov 2009
Posts
24,846
Location
Planet Earth
It does not change the fact that AMD tessellation is still subpar at higher levels, like he says he is a developer

Why don't you look through his blog posts from the last two years and his comments on Anandtech articles? You should have just linked to the German article, as that was an interesting read.

I ended up wasting a few minutes reading his blog, and I like how in one of his articles (basically the one where he calls anyone who owns an AMD product an idiot) he ended up deleting all the comments he didn't like.

but as a gamer it bugs me that if I spend £450 upgrading my GTX 580 to a 7970, I'll likely at some point have to play around reducing tessellation levels from those intended by the game developers, when AMD could have just spent the past 18 months fixing their tessellation performance instead of whinging about games using too much.

Back in the early noughties, when a GPU maker cut corners in rendering (even if there was no visible difference), there was a massive uproar because it was seen as cheating, and yet today it seems to be accepted practice.

It's strange how as GPU power has increased our standards have decreased. I guess there are just new generations coming through who care only about FRAPS numbers.

The problem is that there might be situations where games use larger tessellation factors, but when? This is a question you need to ask yourself even when it comes to upgrading to the Kepler-based replacement for the GTX 580.

If you look at the HD7970 reviews, it tends to be ahead in all DX11 games (I think) ATM when compared to a GTX 580, and that is with the "weaker" tessellation ability at higher levels. Some of these games use newer DX11 engines such as CryEngine 3 and Frostbite 2. Crysis 2 uses a large amount of tessellation (at times OTT, according to some articles) and Lost Planet 2 also uses quite a reasonable amount, and the HD7970 seems to perform reasonably well in these games. Unreal Engine 4 won't be out for a while either, so maybe that will be more taxing? I simply don't know.

If anything, the abilities of the new consoles will be important to follow too. It looks like the Wii U will be DX10 or DX10.1 capable, and AFAIK the PS3 and Xbox 360 replacements will be out in late 2012 or 2013.

It will be at least a couple of years until most games might start to push the tessellation abilities of an HD7970 below the level of a GTX 580, or even take advantage of the theoretically higher abilities of a GTX 580, IMHO. However, at that point you might end up with some other bottleneck on your card, which means you will have to upgrade anyway. If you look at the graph comparing the GTX 560 Ti and the HD7970 at higher tessellation levels they are similar, but AFAIK the GTX 560 Ti is still slower in DX11 games.

TBH, I would just stop worrying and enjoy what you have. It's not like the GTX 580 is a slow card in the first place! :p
 

Soldato
Joined
26 Jul 2006
Posts
5,223
Location
Edinburgh
^^ CPU is clocked lower on the 7970 BM :o (3.8GHz as opposed to 4.2GHz with the 6970s).
Would it make much difference if the 7970 BM was @ 4.2GHz as well?

The GPU clocks are what matter more on the 6950s, as they may have been capable of 1GHz on the core, which would have changed the result. But... when the card can be volted and pushed, it may be a whole different story. Regardless, the 7970 is a great card; a bit expensive, but good usually costs more money.

I don't think it would make much of a difference @ 1080p 8xAA in Heaven; it's more on the GPU, so 3.8 vs 4.2GHz on the i7 would be pretty similar IMO.
 
Soldato
Joined
11 May 2006
Posts
5,769
Gotta agree with the comments being made about not enabling every setting. Metro 2033 is a prime example: DoF and MSAA are the biggest performance killers in that game, yet turning them off actually improves image quality, lol. Tessellation makes a fairly noticeable difference, but it's mostly a superficial one that goes unnoticed once you start to play the game. In fact, there are quite a few games with these stupid features that seem to have been tacked on at the last minute, add very little to end image quality, yet destroy performance.

Also, the whole tessellation farce between Nvidia and AMD is blown way out of proportion. By the time a 7970 hits its tessellation limits, Nvidia will be having a pretty hard time too, especially in actual games where those Nvidia CUDA cores will be busy with all that pixel/vertex shading. Where do you think Nvidia is getting all their tessellation performance from?
 