
Can my system handle another r9 290

Andy, check that chart/bench you used to argue against me with a SINGLE nVidia GPU! A bloody i3 kept up in those benches with its i5 and i7 counterparts. Bloody hell, yeah, the CPU race is over...

You linked a crappy benched game that's way off from what the OP is asking. He's asking about AMD GPUs for a start, so an nVidia game benched with an nVidia GPU is relevant how? And it's a very unoptimised game.

And you're telling me that AMD CPUs are more powerful than Intel CPUs?

"All out balls to the wall your CPU is not as powerful as a FX 8"
His i5 3570K at 4.5GHz is less powerful than an FX8-series CPU, haha, okay! Guess that's why AMD CPUs are selling like hotcakes then, mate. And I guess that's why enthusiasts are buying AMD CPUs for 2+ GPU setups?

There is bottlenecking with heavy CPU usage with 2 high-end GPUs! Why on earth would AMD even go ahead with Mantle to alleviate the CPU overhead if their CPUs are better than Intel's?

I'll tell you why: to make Intel's CPUs seem less relevant, and to make their own CPUs a better-value choice if you can get rid of that CPU overhead.

Give it a break, mate!



edit -

Quoted from the bench you linked, with a single GPU:

When it comes to CPU performance, those running a Core i5 or Core i7 processor won't have to bother with overclocking. It's possible to max out a GTX 980 at 1080p with a clock speed of around 3GHz. To achieve the same feat with an AMD FX CPU you'll want to aim for at least 4.5GHz.

Yeah, looks like you're right! That's just to max out the game with a single GPU. Imagine if you used two GPUs on those FX chips clocked at 4.5GHz; damn, might struggle some, huh? And this is with a game that doesn't seem fussy about CPU performance. Imagine games that are?
 
The consoles are struggling to maintain even 30fps in most games being released, so how can you use them as evidence that more cores and a massive amount of threading is good for gaming? The fact is, it would have been far easier for console developers to hit and maintain 60fps had Sony/MS gone with even a fast dual core like a Celeron/Pentium. It's obvious that price (of an APU versus a separate CPU/GPU) was the primary factor behind AMD winning the console contracts.

AMD are going back to bigger/fewer cores in 2016 so they obviously accept that they got it wrong.

Linus Torvalds has had his say as well:
http://highscalability.com/blog/201...allel-computing-is-the-future-is-a-bunch.html

The whole "let's parallelize" thing is a huge waste of everybody's time. There's this huge body of "knowledge" that parallel is somehow more efficient, and that whole huge body is pure and utter garbage. Big caches are efficient. Parallel stupid small cores without caches are horrible unless you have a very specific load that is hugely regular (ie graphics).

Nobody is ever going to go backwards from where we are today. Those complex OoO [Out-of-order execution] cores aren't going away. Scaling isn't going to continue forever, and people want mobility, so the crazies talking about scaling to hundreds of cores are just that - crazy. Why give them an ounce of credibility?

Where the hell do you envision that those magical parallel algorithms would be used?

The only place where parallelism matters is in graphics or on the server side, where we already largely have it. Pushing it anywhere else is just pointless.

So give up on parallelism already. It's not going to happen. End users are fine with roughly on the order of four cores, and you can't fit any more anyway without using too much energy to be practical in that space. And nobody sane would make the cores smaller and weaker in order to fit more of them - the only reason to make them smaller and weaker is because you want to go even further down in power use, so you'd still not have lots of those weak cores.

Give it up. The whole "parallel computing is the future" is a bunch of crock.

Emphasis is mine as that's basically what AMD did with their modular architecture.
 
OP: Is there a benchmark you'd like me to run to show you what sort of FPS you could expect from an FX-8xxx, crossfired 290 system?
 
You linked a crappy benched game that's way off from what the OP is asking. He's asking about AMD GPUs for a start, so an nVidia game benched with an nVidia GPU is relevant how? And it's a very unoptimised game.

I linked the absolutely newest game there is, and there's no discernible difference across a wide range of CPUs.

Not being very well optimised has nothing to do with it. If it's crap then it will run like crap on any system.

As for the rest? Not getting into it. Go away, Intel Witness, and stop banging on doors trying to convert people to your beliefs.

AMD FTW.
 
I'm running 3x 290s on a 9590 at stock speed at the moment. 2x 290s will be fine on an AMD FX-8320 @ 4.6GHz; you won't get the most FPS that 2x 290s could offer in some CPU-bound games, but you'll get improvements nonetheless, and you can crank up the quality and AA settings :)
 
I'm running 3x 290s on a 9590 at stock speed at the moment. 2x 290s will be fine on an AMD FX-8320 @ 4.6GHz; you won't get the most FPS that 2x 290s could offer in some CPU-bound games, but you'll get improvements nonetheless, and you can crank up the quality and AA settings :)

You would need to step up to at least a 4790K for two GPUs, and a 3930K or better for three, IMO.

All of which is a total waste of time because it's the min FPS that counts in any game, not what's going on at the higher end of the spectrum.
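The min-FPS point is easy to sketch: an average over a run can look perfectly healthy while a handful of long frames tank the minimum. A minimal Python illustration with made-up frame times (not taken from any real benchmark):

```python
# Hypothetical frame-time trace in milliseconds -- illustrative numbers only,
# not from any real benchmark run.
frame_times_ms = [16.7] * 95 + [50.0] * 5  # mostly smooth, five heavy stutter frames

# Average FPS: total frames divided by total elapsed time.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
# Minimum FPS: the slowest single frame sets the floor you actually feel.
min_fps = 1000 / max(frame_times_ms)

print(f"average FPS: {avg_fps:.1f}")  # ~54 -- looks perfectly playable
print(f"minimum FPS: {min_fps:.1f}")  # 20 -- the stutter the average hides
```

A bar-chart average of that run would look fine; it's the 20fps floor you'd notice in-game.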

I must have said this a thousand times, yet no one seems to let it sink in. I have an 8320 running either 4.7GHz or 4.9GHz stable for benches, with a Radeon 7990 in it. I also have a 3970X clocked the same (4.7 all day, 4.9 for benches) with a pair of Titan Blacks, and in games the only difference I see is that I can shove on 8x MSAA, or whatever they call it these days, and suffer no ills. The 7990 just isn't up to that amount of anti-aliasing.

I own, what, about five rigs? One is an 8320, one a Pentium Anni, one a hex-core Westmere with 670 SLI, and so on and so forth.
 
You would need to step up to at least a 4790K for two GPUs, and a 3930K or better for three, IMO.

All of which is a total waste of time because it's the min FPS that counts in any game, not what's going on at the higher end of the spectrum.

I must have said this a thousand times, yet no one seems to let it sink in. I have an 8320 running either 4.7GHz or 4.9GHz stable for benches, with a Radeon 7990 in it. I also have a 3970X clocked the same (4.7 all day, 4.9 for benches) with a pair of Titan Blacks, and in games the only difference I see is that I can shove on 8x MSAA, or whatever they call it these days, and suffer no ills. The 7990 just isn't up to that amount of anti-aliasing.

I own, what, about five rigs? One is an 8320, one a Pentium Anni, one a hex-core Westmere with 670 SLI, and so on and so forth.

I VSync at 60fps and I've been able to hold that in every game I play, so more CPU power will make no difference to me.
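For what it's worth, the vsync argument can be sketched in a few lines: once every CPU under test can push past the cap, the delivered frame rate is identical. A toy Python example with invented uncapped numbers (not real benchmark data):

```python
# Invented uncapped FPS figures for two hypothetical CPUs in three games --
# the actual values don't matter, only that both exceed the cap.
uncapped_fps = {"CPU A": [110, 95, 140], "CPU B": [160, 120, 210]}

VSYNC_CAP = 60  # vsync on a 60Hz display caps delivery at 60fps

for cpu, scores in uncapped_fps.items():
    delivered = [min(fps, VSYNC_CAP) for fps in scores]
    print(cpu, delivered)  # both CPUs deliver an identical [60, 60, 60]
```

The headroom above the cap only matters if one CPU's minimums ever dip below it.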
 
I VSync at 60fps and I've been able to hold that in every game I play, so more CPU power will make no difference to me.

Yup, indeed. People just don't seem to get that. They think benchmarks rule all, when they're little more than a meaningless piece of paper. A, what, 90-second run on a game, and that then becomes everything.

Madness.
 
I VSync at 60fps and I've been able to hold that in every game I play, so more CPU power will make no difference to me.

Ditto. I'll turn off vsync for the odd bench, out of interest, but the only really important thing is that I can hold 60fps at 1440p, which I can currently do in every game I've owned without breaking a sweat (turning off vsync will normally reveal 100+ FPS).

That's the thing. When does lower relative performance become a bottleneck of concern? I'd say with FX-8 CPUs and CF 290s, it's certainly not a bottleneck of concern.
 
Wasn't there a review from a respectable site kicking about showing an FX chip with multi GPUs outperforming its X79 or X99 (don't remember which) counterpart? I know I read it; I just can't find it!
 
Wasn't there a review from a respectable site kicking about showing an FX chip with multi GPUs outperforming its X79 or X99 (don't remember which) counterpart? I know I read it; I just can't find it!

At 4K, yes. The only problem for me is that it would be massive aggro to test it.

I've got the kit, just don't have the time right now.
 