AMD FX 8320

Overclock it as much as possible and you'll get decent performance. However, a good Intel chip will outperform you every time in pretty much every game.

You would be much better off with an i5 or i7.

Also remember that you are limited to PCIe 2.0, which has become a bottleneck with the 10-series Nvidia cards; you'll never get the full performance out of the card the way you would with PCIe 3.0 x16.

So CAN you run a modern GPU on an 8320? Yes. Will you have the same performance as if you were running an i5 or i7? No. Overclock. It'll help.

Save up for Zen if you want to stay AMD. It will have a 40% IPC improvement over what you have now and be well priced.
 
You do not say which card you are thinking of upgrading to.

Generally, with an overclocked FX 8320 at, say, 4200MHz+, you will see more return on your money for gaming right now by just upgrading the GPU. The difference between PCIe 2.0 and PCIe 3.0 is only a few fps, or at most 5%.

As has been said, nothing much in the way of mainstream CPUs will change for the remainder of this year, and a new GPU release is much more likely by Q1/Q2 2017 as well.
 
I would say the FX can handle your single cards, but if you go to SLI/CrossFire it will start to struggle. You would not pair it with the latest and greatest, as nobody should be picking up Vishera chips new given they're dated tech with Zen around the corner.

I disagree with the blanket statement "However, a good Intel chip will outperform you every time in pretty much every game", as not only are there many variables like clock speed, but there are also some titles that perform better with an FX. "Every time" is fanboy rhetoric.
 
AMD 8320 user here.

You'll get left behind in benchmarks such as Heaven by quite some margin. So yes, if you want to spend 100s to run benchmarks the whole day, grab the Intel.

But when it comes to games, I literally see no difference between my system and the Intel systems I've had a chance to use. And believe me, if I felt I could get a better GAMING experience using Intel, I'd be running a 6850 or at least a 6700 as we speak. I've got a 1080 and was quite curious to see whether the CPU might hold me back, so I ran some tests a few nights ago on Witcher 3: the GTX 1080 was at 99-100% the whole time while the CPU was anywhere from 40-70% IIRC. I took some screenshots which I wanted to upload but never bothered to. That's about the most graphically intensive game I own.

The only other games I've tested so far are Doom, Crysis 3 and AC: Black Flag. The 1080 rips through all of them at 3440 resolution with no issues. I've not yet played other games as I'm alternating between the ones mentioned above. Maybe I'll bump into something soon that'll stress the CPU. Crysis 3 was the game that made me upgrade my 965 BE. Still waiting for the one that'll make me upgrade the 8320.

Oh and lastly, my monitor is a 60Hz one. Whether the 8320 starts struggling at 100fps I have no idea, but at 60fps with a 1080 I have zero issues (well, apart from Witcher 3 I guess).
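
If anyone wants to repeat that kind of check themselves, this is roughly how you could log GPU and CPU load once a second while a game is running. Treat it as a rough sketch, not a polished tool: it assumes an Nvidia card with nvidia-smi on the PATH and Python with the psutil package installed.

import subprocess

import psutil  # third-party: pip install psutil


def gpu_utilisation() -> int:
    """GPU load in percent, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])


if __name__ == "__main__":
    # A GPU pinned at ~99% with the CPU well below 100% suggests the graphics
    # card, not the processor, is the limit.
    while True:
        cpu = psutil.cpu_percent(interval=1)  # blocks ~1s, averages across all cores
        gpu = gpu_utilisation()
        print(f"GPU {gpu:3d}%  CPU {cpu:5.1f}%")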
 
Can't speak on Intel vs AMD, but I've got an 8320 and got it up to 4.5GHz with ease; I do have a water-cooled system though.

I only really play CSGO and have seen that the low IPC of the AMD does hinder performance somewhat.
 
Also remember that you are limited to PCIe 2.0, which has become a bottleneck with the 10-series Nvidia cards; you'll never get the full performance out of the card the way you would with PCIe 3.0 x16.

Absolute rubbish!

www.gamersnexus.net/guides/2488-pci-e-3-x8-vs-x16-performance-impact-on-gpus

Sure, in this example Steve uses PCI-E 3.0 x8, but this is, for argument's sake, the same bandwidth as PCI-E 2.0 x16 (there's a handy-dandy little chart at the top of the article comparing the bandwidths of all the standards from v1 to v4).
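
If you want to sanity-check that equivalence yourself, the per-lane maths works out like this. The transfer rates and encoding overheads below are the published spec figures (8b/10b for PCIe 1.x/2.0, 128b/130b for 3.0/4.0); the script itself is just my own rough sketch.

# Usable PCIe bandwidth: transfer rate (GT/s) x encoding efficiency / 8 bits, per lane.
SPECS = {
    "PCIe 1.x": (2.5, 8 / 10),
    "PCIe 2.0": (5.0, 8 / 10),
    "PCIe 3.0": (8.0, 128 / 130),
    "PCIe 4.0": (16.0, 128 / 130),
}


def bandwidth_gbps(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for a PCIe generation and lane count."""
    rate_gt, efficiency = SPECS[gen]
    return rate_gt * efficiency / 8 * lanes  # GT/s -> GB/s per lane, times lane count


for gen, lanes in [("PCIe 2.0", 16), ("PCIe 3.0", 8), ("PCIe 3.0", 16)]:
    print(f"{gen} x{lanes}: ~{bandwidth_gbps(gen, lanes):.1f} GB/s")

That prints roughly 8.0 GB/s for PCIe 2.0 x16 and 7.9 GB/s for PCIe 3.0 x8, so the x8 results in the article really are a fair stand-in for an FX board's PCIe 2.0 x16 slot.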
 