
Things to consider before going CrossFire with an AMD 7950.

Hello Guys,

I am very close to pulling the trigger on a second 7950. Not because the games I have are stressing the card out; it's just that I thought it would be a good time to get one while they are as cheap as they are at the moment. I have a few questions which I hope you guys can address for me.

1. Card matching
My current card is a VTX3D 7950, which I think is very close to, if not exactly, the reference design and cooler. It's clocked at 800MHz, and there aren't many stock-clocked cards at any of the retailers I've been looking at (for near £220). I know there was a cracking HIS deal on here which I missed, so I will have to get a card which is slightly overclocked. To get this to run with my VTX, will I have to underclock it via Afterburner?

2. Micro stutter
Has this truly been eliminated via the RadeonPro programme? I read on Wikipedia that it eliminates the issue of micro stutter; I just wanted to know whether this will be a stumbling block. I don't want to drop cash only to find out that I will now get new issues :P

3. Heat & power
As it stands, at full load the card gets to 54°C, which makes me worry that a CrossFire setup will push up the temps. I am thinking of going for a card with an aftermarket cooler so I can place it above the current card to avoid further heating issues. Have you fellas experienced toasty cards with CrossFire?

In terms of wattage, will a 750W PSU be okay, or do you recommend getting something a bit more powerful to power the cards?

4. System Bottlenecks
I have 8GB of memory and a non-overclocked Core i7-3770K. In your experience, do you think I would need to increase the memory and possibly overclock the CPU too?

Sorry it's a bit of a long one; hopefully the answers will help others on the quest to move from single to double GPU with AMD :)
 
1) You can run two cards at different clock speeds; if your second card is faster, it will just run at its faster speed. Ideally, you'll want them running at the same speed, and you could always overclock the slower one to match. Normally a moderate overclock of 1050MHz on the core would net you a decent amount of extra performance.

2) It's still present in some games more than others, but it's always going to be there in some form; it's just a question of whether you can perceive the micro stutter or not.

3) What PSU is it? Broadly speaking 750W will be fine, but I'd want to confirm which one first (see the rough ballpark figures after this list). Anything under 80°C is more than fine for a 7950, so you've got a lot of headroom with the temperatures before you need to worry. Depending which second 7950 you go for, it may be worth having your current 7950 in the bottom slot rather than the top.

4) You'll want to overclock your CPU; there's no real reason not to. Even 4.2-4.4GHz will help a lot. The rest of the machine is fine.
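For a rough sense of why 750W is normally plenty for 7950 CrossFire, here's a back-of-the-envelope sketch. The wattage figures below are my own assumptions (typical board-power/TDP style numbers for a 7950 and a stock 3770K), not anything measured from your rig:

```python
# Rough ballpark only - the TDP-style figures below are assumptions, not
# measured draw. A reference HD 7950 is commonly rated around 200W board
# power and a stock i7-3770K around 77W; overclocking pushes both up.

gpu_w = 200            # assumed per-card load figure for a 7950
cpu_w = 77             # stock i7-3770K TDP
rest_of_system_w = 75  # assumed motherboard, RAM, drives, fans

total_w = 2 * gpu_w + cpu_w + rest_of_system_w
print(f"Estimated full load: ~{total_w}W, ~{750 - total_w}W spare on a 750W unit")

# Estimated full load: ~552W, ~198W spare on a 750W unit
```

Assuming those figures are in the right ballpark, even a healthy overclock on both cards and the CPU still leaves you comfortably inside 750W; the bigger question is the quality of the specific unit.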
 
Hi Rusty,

Thanks for getting back to me, it's an OCZ 750W PSU:
OCZ ZT 750W '80 Plus Bronze' Modular Power Supply
 
If you know what you're doing with RadeonPro you can eliminate microstutter completely or reduce it to below single-card levels. If it doesn't work, that's typically a specific game or driver issue, but it should work for 95% of stuff. Made-up figure, but you get my point.

The best way to do it is to create a profile for said game and change the flip queue size to 1 (under the advanced tab - see pic below). Enable DFC (dynamic frame rate control). Select vsync (double buffered) and stick 58 in the 'keep up to' box. Use 58 as it's 2fps below the vsync rate of 60, which removes input lag. You're good to go.

[Screenshot: RadeonPro advanced tab settings]
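For what it's worth, the reason the cap smooths things out is that every frame gets held to the same delivery time. Here's a tiny sketch of the idea, assuming a 58fps target; this is just an illustration of frame pacing in general, not what RadeonPro actually does internally:

```python
# Minimal sketch of what an fps cap (like the "keep up to" box) is doing
# conceptually: hold every frame to the same target frame time so delivery
# is uniform. Just an illustration of frame pacing, not RadeonPro's internals.
import time

TARGET_FPS = 58
TARGET_FRAME_TIME = 1.0 / TARGET_FPS  # ~17.2ms per frame

def render_frame():
    time.sleep(0.005)  # stand-in for the real per-frame rendering work

for _ in range(10):
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < TARGET_FRAME_TIME:
        time.sleep(TARGET_FRAME_TIME - elapsed)  # pad fast frames up to the cap
    print(f"frame delivered in {(time.perf_counter() - start) * 1000:.1f}ms")
```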


It's possible to do it without using vsync as well, but that depends on whether you can put up with screen tearing or not. You can leave vsync as driver default (game setting dependent) and put your average fps in the 'keep up to' box to play without vsync if you want.

If you use a 120Hz monitor, and providing you can keep the fps high enough, then you can put it at 75fps or whatever.

This has been confirmed as working by Rossi on Borderlands 2, an Nvidia TWIMTBP title and a game that AMD cards have had notorious micro stutter problems with. If it works like that, then you can bet your bottom dollar it will work 50 times better on Gaming Evolved titles, of which there are absolutely loads, and you can expect a hell of a lot more.

EDIT

Also look at this lovely random graph. I could tell you where it's from, but I'm sure you can find it.

[Image: frame-time graph]


:D
 
Cheers for the help guys :)

I decided to go for a Gigabyte 7970 on the B-Grade thing... if it arrives I'll be using that.
 
RadeonPro isn't a magic bullet - it can work very well in some situations to reduce or eliminate microstutter, but generally you have the following to take into account:

-Often need to know in advance your likely average and minimum sustainable framerates to get the best results

-May need to use framerates below what you're actually comfortable with if you want to eliminate microstutter

-Can end up trading microstutter for increased levels of input latency


EDIT: The reviews on it are generally misleading, as they've run through the benchmarks in advance, found out exactly what level they can cap it at to get the best results, and presented that information. In actual gameplay you will probably have to cap it at a much lower level to ensure you get the same kind of uniform frametime output.
 
When it comes to microstutter, if one card is getting into the 40-60fps region and you use a 2nd card to make sure you're always getting 60+fps, then it works out pretty well. Where microstutter tends to become a problem is when you're struggling to get playable framerates from one card alone - adding a second, even though you might end up with playable framerate figures, is often plagued with issues. Whether that's because you're using 2x low-end GPUs just to get 30fps or running 3x 2560 panels and needing SLI'd Titans just to get 30-40fps, neither will give a great experience.
 
Often need to know in advance your likely average and minimum sustainable framerates to get the best results

That's not hard to work out really. Generally you can just use 60fps+. It's pretty easy really and takes about a minute to set up. Depending on the game you can go much higher than that, of course.

-May need to use framerates below what you're actually comfortable with if you want to eliminate microstutter

60 is plenty tbh. Depending on the game you can go much higher smoothly as well.

-Can end up trading microstutter for increased levels of input latency

Wrong. Vsync is optional and definitely not required to eliminate micro stutter from AMD cards. It's just an option if you don't like screen tear, like me.
 
Depends I guess on your approach to a game. If you go with settings as low as possible to make sure you're getting 60+fps constantly, or spend an absolute ton on hardware to get 60+fps constantly regardless of settings, fair enough - but you're telling me that, for instance, someone on the average gaming PC, even an enthusiast-level multi-GPU setup, is going to buy, say, Crysis 3 and know what framerates they are going to be getting throughout the game in advance?

The input lag bit is nothing to do with vsync; the lower you cap your framerate, the higher input latency becomes naturally. Once you're capping significantly below 60fps, which will be needed in many cases to get the kind of graphs shown in the reviews, you're going to be introducing quite a bit of extra input latency - potentially enough to be quite noticeable, depending on various factors including how sensitive you are to it.
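To put some numbers on that (plain frame-time arithmetic, nothing specific to RadeonPro): the lower the cap, the older each displayed frame is, so one frame time is roughly the minimum extra delay a cap adds on top of everything else in the chain.

```python
# Plain frame-time arithmetic: the frame time at a given cap is the minimum
# extra delay the cap can add on top of everything else in the chain.
for cap in (60, 58, 45, 40, 30):
    print(f"cap at {cap}fps -> one frame is {1000 / cap:.1f}ms")

# cap at 60fps -> one frame is 16.7ms
# cap at 58fps -> one frame is 17.2ms
# cap at 45fps -> one frame is 22.2ms
# cap at 40fps -> one frame is 25.0ms
# cap at 30fps -> one frame is 33.3ms
```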
 
Depends I guess on your approach to a game. If you go with settings as low as possible to make sure you're getting 60+fps constantly, or spend an absolute ton on hardware to get 60+fps constantly regardless of settings, fair enough - but you're telling me that, for instance, someone on the average gaming PC, even an enthusiast-level multi-GPU setup, is going to buy, say, Crysis 3 and know what framerates they are going to be getting throughout the game in advance?

The input lag bit is nothing to do with vsync; the lower you cap your framerate, the higher input latency becomes naturally. Once you're capping significantly below 60fps, which will be needed in many cases to get the kind of graphs shown in the reviews, you're going to be introducing quite a bit of extra input latency - potentially enough to be quite noticeable, depending on various factors including how sensitive you are to it.

If you have a CrossFire setup, you won't be running low details because of a lack of GPU grunt, unless you're using very low-spec cards or trying to apply unrealistically high settings like 24x edge-detect AA.

You won't know your average frame rate until you play the game. However, you can play the game first to see if you suffer from micro stutter - it only affects some games. If it turns out the game in question suffers from it, then you can apply whatever your average fps was. It's easy, Roff. :)

Having used RadeonPro with CrossFire and a single GPU, that is not the case. Fps capping does not add any input lag between 45 and 60. However, I normally use 58 as I like to use vsync, and that removes the input lag caused by the double buffer. The frame time at 58 is 17.2ms, for your reference. Smooth.
 
You won't know your average frame rate until you play the game. However, you can play the game first to see if you suffer from micro stutter - it only affects some games. If it turns out the game in question suffers from it, then you can apply whatever your average fps was. It's easy, Roff. :)

As per my original point, it works well in some situations but it's not a magic bullet to remove microstutter. You'd have to play the game a bit, find out that it does indeed suffer from noticeable levels of it in this instance, work out what level to cap it at based on what you've played so far, hope the rest of the game works at that cap, and possibly end up capping it too low if the bit of the game you're having issues with is more intensive than the rest of the game happens to be, or too high if further into the game it performs even worse.

My goal here being that I don't think it's wise to give the impression that, for the average consumer, microstutter is now "solved", end of story. There are indeed tools to deal with it, but there isn't one elegant, foolproof solution.

EDIT: Of course, in many situations, especially with a bit of an educated guess and good hardware, you can get very good results, but it's not always the case.
 
As per my original point, it works well in some situations but it's not a magic bullet to remove microstutter. You'd have to play the game a bit, find out that it does indeed suffer from noticeable levels of it in this instance, work out what level to cap it at based on what you've played so far, hope the rest of the game works at that cap, and possibly end up capping it too low if the bit of the game you're having issues with is more intensive than the rest of the game happens to be, or too high if further into the game it performs even worse.

My goal here being that I don't think it's wise to give the impression that, for the average consumer, microstutter is now "solved", end of story. There are indeed tools to deal with it, but there isn't one elegant, foolproof solution.

You're making it out to be difficult though, when it's anything but. Providing your multi-GPU computer can keep fps to at least 50 (which will add no latency and be extremely smooth), then you will remove micro stutter completely. That's the worst-case scenario; more than likely you can set the fps limit much, much higher than that. It's not 100% perfect, but if micro stutter is such a big problem for the person using a multi-GPU setup, then yes, it is an extremely good solution, as it removes the problem they are having, which is micro stutter.
 
I'm not making it out to be difficult; I'm making it out not to be the complete solution some people try to present it as, and the graphs people see in the reviews/forum posts on it aren't representative of the end results people will get in most cases. A lot of those graphs required (as per the one you linked above) going well below 50fps to get results where microstutter is almost entirely eliminated - most of them are capped at 40fps, or in some cases even as low as 30-35fps. (That's not to say that on a high-end rig, i.e. 7950 CF, using 50fps won't give very good results.)
 
I'm not making it out to be difficult; I'm making it out not to be the complete solution some people try to present it as, and the graphs people see in the reviews/forum posts on it aren't representative of the end results people will get in most cases. A lot of those graphs required (as per the one you linked above) going well below 50fps to get results where microstutter is almost entirely eliminated - most of them are capped at 40fps, or in some cases even as low as 30-35fps. (That's not to say that on a high-end rig, i.e. 7950 CF, using 50fps won't give very good results.)

It's the 95%-complete solution, as it removes micro stutter completely. Despite what it says in that review, you do not need to limit fps as low as 30-35 to remove micro stutter. How do I know this? Because I've used CrossFire and used it myself. Providing you have the GPU grunt to deliver acceptable fps, which you generally do on a multi-GPU computer, then it is a very good and capable solution. I've tested as low as 45fps for comparison using this method, and guess what - it felt identical to 75fps. The reason for this is that the rapid change in fps and frame time is what causes the micro stutter. If every frame is limited and delivered at the same frame time, then it's smooth, yes, even at 45fps. 58-60 is ideal in my eyes though; beyond that just heats the GPUs up more.
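A toy example of that last point, with made-up frame times, just to show that it's the frame-to-frame variation rather than the average that reads as stutter:

```python
# Illustration of the "uniform frame times feel smooth" idea: both runs have a
# similar average, but the spread between consecutive frames is what you
# perceive as stutter. Frame times below are made up for the example.
from statistics import pstdev

uncapped_ms = [9, 31, 11, 29, 10, 30, 12, 28]  # alternating fast/slow frames
capped_ms = [17.2] * 8                         # every frame held to ~58fps

for name, times in (("uncapped", uncapped_ms), ("capped at 58", capped_ms)):
    avg_fps = 1000 / (sum(times) / len(times))
    print(f"{name}: ~{avg_fps:.0f}fps average, frame-time spread {pstdev(times):.1f}ms")

# uncapped: ~50fps average, frame-time spread 9.6ms
# capped at 58: ~58fps average, frame-time spread 0.0ms
```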
 
When it comes to microstutter, if one card is getting into the 40-60fps region and you use a 2nd card to make sure you're always getting 60+fps, then it works out pretty well. Where microstutter tends to become a problem is when you're struggling to get playable framerates from one card alone - adding a second, even though you might end up with playable framerate figures, is often plagued with issues. Whether that's because you're using 2x low-end GPUs just to get 30fps or running 3x 2560 panels and needing SLI'd Titans just to get 30-40fps, neither will give a great experience.
Can anyone else vouch for this?
 
Can anyone else vouch for this?

Me.

2x GPUs @ 60fps = good performance in general.

If 2x GPUs are struggling at 30-40fps in any scenario, gameplay will be considerably choppier using the two GPUs in an SLI/CrossFire combination than what a single GPU at 30-40fps would achieve, even if the numbers are the same.
 