R9 290X Crossfire VS R9 Fury X?

The total score, not the graphics or CPU score


You should ignore the pscore and concentrate on the gscore. The pscore (total score) is weighted to include CPU performance, so differences in CPU can drastically affect it. The gscore, on the other hand, more or less ignores CPU scoring, or tries to. That said, two high-clocked 290Xs with a hexacore will hit over 22-23K pscore with a 29K gscore on FS, leaving a Fury X in the dust.
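If it helps to see why, the overall score in these benchmarks is (roughly) a weighted harmonic mean of the sub-scores, so a weak CPU drags the pscore down even when the gscore is strong. A minimal sketch in Python - the 0.85/0.15 weights are illustrative assumptions, not Futuremark's published figures:

```python
# Rough sketch of how a 3DMark-style overall score combines sub-scores.
# The weights are illustrative assumptions, not Futuremark's published values.
def overall_score(graphics, physics, weights=(0.85, 0.15)):
    """Weighted harmonic mean: a low physics (CPU) score drags the total down."""
    w_g, w_p = weights
    return (w_g + w_p) / (w_g / graphics + w_p / physics)

# Same GPUs (identical gscore), different CPUs: the total shifts a lot.
print(round(overall_score(29000, 18000)))  # strong hexacore CPU: ~26565
print(round(overall_score(29000, 9000)))   # weaker quad core CPU: ~21750
```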
 

Yes this is true, I should have noticed that before I put the numbers in...

Here are the GPU results for both cards, standard and overclocked (same settings as the graphs):

3DMark 11:

290X CrossFire standard = 26745
290X CrossFire overclocked = 30122

R9 Fury X standard = 19042
R9 Fury X overclocked = 19950

Fire Strike:

290X CrossFire standard = 20647
290X CrossFire overclocked = 23089

R9 Fury X standard = 15611
R9 Fury X overclocked = 16147
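One thing those numbers show clearly is the gap in overclocking headroom. A quick sanity check in Python, figures copied straight from the post above:

```python
# Percentage gain from overclocking, using the scores posted above.
def oc_gain(stock, overclocked):
    return (overclocked / stock - 1) * 100

results = {
    "290X CF, 3DMark 11":   (26745, 30122),
    "Fury X, 3DMark 11":    (19042, 19950),
    "290X CF, Fire Strike": (20647, 23089),
    "Fury X, Fire Strike":  (15611, 16147),
}
for name, (stock, oc) in results.items():
    print(f"{name}: +{oc_gain(stock, oc):.1f}%")
# The 290X pair gains ~12-13% from the overclock; the Fury X only ~3-5%.
```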



Just to make it clear: this is two R9 290X cards vs a single R9 Fury X, not two Fury Xs.
 
I will say it is definitely worth it - a Fury X is very close to the 290X CF
[chart: relative performance summary at 2560x1440]


I have done something similar, going from 7950(oc) CF to a 390(oc)

7950 (OC) CF -> 3173 (Unigine Valley)
R9 390 (OC) -> 2834 (Unigine Valley)
~11% loss in performance in scenarios with extremely good CF scaling.
Basically, in most games my fps increased, and in some I almost got double the performance (no CF support :p)
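To put numbers on that, here's a rough sketch. The ~80% CrossFire scaling figure is an assumption for illustration (a typical figure, not measured here):

```python
# Valley scores from the post above.
cf_7950, r9_390 = 3173, 2834

# Headline delta where CrossFire scales perfectly:
print(f"390 vs 7950 CF: {(1 - r9_390 / cf_7950) * 100:.1f}% slower")  # ~10.7%

# Assuming ~80% CF scaling, estimate a single 7950 (OC) and compare:
single_7950 = cf_7950 / 1.8  # ~1763
print(f"390 vs single 7950: +{(r9_390 / single_7950 - 1) * 100:.0f}%")  # ~+61%
# So wherever CrossFire doesn't work at all, the 390 is a big step up.
```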
 
But that's only down to AMD's poor driver support :( I wish they would get Crossfire up to scratch, it would make them so much more competitive
 

I think they've pretty much given up now on swimming against the tide, what with devs/GameWorks screwing things up for AMD, or AMD not having the resources to work with them (see PCars for an example)

I think they're putting all their eggs into DX12 now, which will make mGPU a *lot* simpler!

I don't think I'll consider making the switch back until DX12 is the norm tbh, had my fill of Crossfire issues when I had my 295X2 :(
 
Having come from a very similar situation (Titan Black SLI) to a Fury X I can safely say that the Fury X is the better choice.

Sure, in benchmarks the Titans won, but when I get down and dirty in games the Fury X is far and away the better choice. In most games it is a good 10% faster (GTAV) and in some it's slightly slower (Dying Light), but it's just so much less hassle.

Playing through Dying Light has been a much more consistent experience. With SLI I would get areas of slow motion lasting between 5 and 30 seconds, which ruined the game. I think I would rather play it in the 40s and not have those issues.

Games like Metro and so on, though? Just very consistent. With GTAV I actually get a good 5-10% more performance.

And Witcher 3? Well, thanks to a Gsync bug it was unplayable on my Titans. It runs bloody lovely at high/ultra settings with no AA on the Fury X.

I don't think I will be going multi GPU again any time soon. You just don't realise all of the problems that come with it. Even SLI, which is very well polished, can be a complete sod.
 

Am I playing the wrong games or something? I hear a lot of people complaining about SLi/xFire issues as though they're a common occurrence and I've pretty much yet to have any. There are sometimes teething issues the first week after a big game launches (e.g. GTAV or Witcher 3) but there's been nothing game breaking for me - Witcher 3 had a texture flicker which was annoying but pretty manageable, GTA just had poorer performance.
 

There are many things that come with running more than one GPU, it's not just limited to the game not working at all.

The big question of course is your tolerance levels and how much you are prepared to put up with.

I ran SLI from 2007 all the way up until about six weeks ago, and whilst for the most part it did what it should have, it brought problems too. As I said, in Dying Light I would get great FPS @ 4k but with that came issues. When things were moving very quickly it would become stuttery just when you don't want it to. I also had the issue I described where the entire game would slow to a crawl, and that would last anywhere between around five seconds and thirty seconds.

It was a well known issue, and IIRC it was fixed but it did take away my enjoyment of the game until it was fixed.

Witcher 3? Hah! That was just a pure mess in a dress on both Kepler and SLI. At first it was stuttery and I couldn't even get 30 FPS at high settings, then with the settings lowered to medium (where tbh it looked like 1080p) it flickered like a pig because apparently there was an issue with SLI and Gsync.

So there you have it, the 'I' word. Issues. All of which need to be suffered, overcome and waited on. Issues that don't exist when you are only running one card.

It was brought up lately that only around 300k users worldwide used more than one GPU. I would hazard a guess and say that even if those numbers are wrong the people who do run more than one GPU are very, very much in the minority.

As such the companies who own these technologies do not put enough time, money and effort into them and this is why you get issues.

Even when SLI does work properly and to its full capacity there are always tiny niggling little issues that come with it.

And Crossfire? Well, that is considerably worse. AMD cared so little about it that for years it was completely broken and didn't work properly. Even now, after they were caught, they still don't put enough time and effort into it. Why? All you need to do is look back at how few people use it. Why put time, money and resources into something that hardly anyone has?

All of this shows when you get down and dirty and actually play games. None of the issues of a multi-GPU system rear their heads, because you're not taking the risk any more.

I promised myself a Fury X. I got one.
The next month I was dead set on getting a Fury to go with it. Instead I got a new coffee machine, 200 pods for said coffee machine, a 480GB SSD, a very expensive case for my phone, etc.
The next month I promised myself a Nano to go with my Fury X for Crossfire. Then I found out that it won't work in Crossfire and AMD want you to offer up a kidney for it.

£550 or more to add issues to my PC? Sorry matey, don't think so!

I've been told elsewhere that Witcher 3 still does not work with Crossfire, so it would be £550 (or more, gauging by prices!) for nothing but crap.

Not going there. This is the first time I have seen completely issue free gaming in about eight years. I'm not really wanting to go back tbh.
 
Crossfire has worked for the most part for me; the only game that annoys me is Elite Dangerous, which STILL doesn't have a Crossfire profile (I'm looking at you Matt :P ). Needless to say, I'm gonna hold onto my 290Xs until the die shrink next year with Arctic Islands.
 
Crossfire will be more CPU limited than a single card. Even then, I'd only be going AMD for DX12. DX11 optimisation is still abysmal.
 

Whoooaaa there. Since when was Crossfire "worse"? It's been noted in just about every tech site there is that Crossfire scales significantly better than SLI.
But past that, we'll have to agree to disagree about the severity of issues surrounding multi-GPU use - you've obviously had some bad issues that I haven't and thus you hold a lesser opinion on it. Personally, I think it's pretty much a necessity for 4k at this point - a single card just can't cut it for me, 40fps on med-high is playable but not as enjoyable as 60+ on ultra :P
 

A 980 Ti with a decent overclock and AA disabled can easily make 4K Ultra.
 
Whoooaaa there. Since when was Crossfire "worse"? It's been noted in just about every tech site there is that Crossfire scales significantly better than SLI.

Actually they are about even.

If people want to start cherry picking games like some sites do then they will always find games that support their side of the argument.
 

Eh, on the whole Xfire seems to come out on top in terms of scaling (from what I've seen at least), usually scaling 75-85% as opposed to SLi's 70-80%. Sure, it's not a huge difference and there are definitely areas of overlap between them game to game, but if I were a betting man, for any given random title the chances are xfire would perform better.

I don't think a single 980Ti is going to do 4K Ultra at 60fps.
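As an aside, 'scaling' in these posts means the extra performance the second card adds over a single card. A minimal sketch with made-up FPS figures (not from any particular benchmark):

```python
# Multi-GPU scaling: extra FPS from the second card, relative to one card.
def scaling(fps_single, fps_dual):
    return (fps_dual / fps_single - 1) * 100

# Made-up illustrative numbers, not measured results.
print(f"{scaling(40, 73):.0f}% scaling")  # 73 fps from a 40 fps card ~ 82%
print(f"{scaling(40, 68):.0f}% scaling")  # 68 fps ~ 70%
```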

It can't. So you think right :P
 

So what went wrong here?

Witcher 3 maxed settings
2160p

4 Fury Xs stock

[screenshot: Witcher 3 benchmark result, 4x Fury X]




4 EVGA SC Titan Xs stock

[screenshot: Witcher 3 benchmark result, 4x Titan X]




It is dead easy to pick benchmarks to suit your argument, but in practice SLI and C/F come out about even. It is all about swings and roundabouts. :)
 
Something's definitely wrong there, Kaap - my 2 Fury Xs get over 60 for the most part and certainly never as low as 25. lol

Nothing wrong, just max settings @ 2160p.

Edit: Just tried it on 2 Fury Xs at the same settings and got 46fps - a performance increase!!!

It just underlines my earlier point about SLI and C/F being about even, after taking into account the swings and roundabouts.
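Reading the 4-card figure as the ~25fps mentioned above, this is a textbook case of negative scaling beyond two GPUs. A quick calc:

```python
# Quad vs dual Fury X in Witcher 3 at the same settings. The 4-card figure
# is the ~25fps referenced in the replies above, so treat it as approximate.
fps_quad, fps_dual = 25, 46
print(f"Dropping from 4 cards to 2: +{(fps_dual / fps_quad - 1) * 100:.0f}% FPS")  # ~+84%
# Per-card contribution goes from ~6 fps each to ~23 fps each.
```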
 
Hah, told you something was wrong :P Even then, 46 is a little low - I take it you have HairWorks maxed and AA on?
Regardless, you would agree with me disputing the claims of crossfire being "worse," which I was originally countering? :P
 

Everything on, yes.

Crossfire is no worse than SLI, agreed.

If anyone wants to use 4 cards in SLI on Attila Total War they would get a shock. :D
 