
AMD FX 8320 analysis

The problem is mostly people judging the CPU by the software running on it. The 8320 is a more powerful CPU than an i5, but if software is using 100% of an i5 and 50% of an 8320, as with a lot of games, it's going to create the impression of a weak processor.
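To put rough numbers on that point (purely illustrative figures, not benchmark data): a chip with more aggregate power can still deliver less useful work than a narrower one if the software only keeps half of it busy.

```python
# Illustrative sketch with made-up per-core numbers (not measurements):
# an 8-core with more total power on paper can still extract less work
# than a fully loaded quad if the game only uses half of it.

def effective_throughput(per_core, cores, utilisation):
    """Work actually extracted from the chip at a given utilisation."""
    return per_core * cores * utilisation

# Hypothetical i5-style quad, fully loaded by the game.
quad = effective_throughput(per_core=10, cores=4, utilisation=1.0)  # 40.0
# Hypothetical 8320-style eight-core, only half loaded.
octo = effective_throughput(per_core=7, cores=8, utilisation=0.5)   # 28.0

# More raw power on paper (7 * 8 = 56 vs 10 * 4 = 40),
# yet it creates the impression of a weaker processor.
```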
 
To be honest I have never seen a confirmed story of a Haswell or Ivy Bridge chip degrading. But I do believe it happens and I am still afraid of it. But I'd also like to see a documented incident :)

All silicon degrades, just at a very, very slow rate when used as the manufacturer intended, to the point where you would never notice, especially as the chip will be totally out of date and out of use by the time it shows.

Overvolting and overclocking accelerate this, as do high running temps probably, but by how much depends on lots of factors. Some designs will be more susceptible than others. So what we're really saying is that you've never seen a confirmed story of a Haswell or Ivy Bridge chip degrading to a point where someone actually noticed!

I remember an Athlon 64 that I had running overclocked for years that one day wouldn't run stable without reducing the overclock slightly. It might even have been something on the motherboard, but either way something had certainly degraded to a point where I noticed it.
 
The problem is mostly people judging the CPU by the software running on it. The 8320 is a more powerful CPU than an i5, but if software is using 100% of an i5 and 50% of an 8320, as with a lot of games, it's going to create the impression of a weak processor.

While I agree the FX83 has more raw power than an i5 (although over the 4670K, it's not much anymore), how else are you meant to judge it? It's the performance in the software out there that counts, really; it's why the FX8150 sucked, because it entered into a world it wasn't fit for.
 
Found this interesting but more needs to be done

http://www.overclock.net/t/1444040/thread-usage-in-games

Thread priority will play into that somewhat: say the 5th core is busy doing something in the background, the 6th core might jump up.

There's an Assassin's Creed 4 benchmark which shows the CPU usage, and it might have X amount of threads running at the same time, but one of the threads could jump to another core, so it could have used 7 cores while still only running 6 threads.

You really need to use an OSD like MSI Afterburner with the per-core CPU usage churning away, and log it over a prolonged period of time.
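A minimal sketch of that kind of per-core logging, assuming a Linux box (it parses `/proc/stat` directly rather than driving MSI Afterburner; the function names here are mine):

```python
# Sample /proc/stat twice and compute per-core utilisation over the
# interval - the same numbers an OSD would overlay. Linux-only sketch.
import time

def read_core_times():
    """Return {core_name: (busy_jiffies, total_jiffies)} from /proc/stat."""
    times = {}
    with open("/proc/stat") as f:
        for line in f:
            # Per-core lines look like "cpu0 ...", skip the aggregate "cpu" line.
            if line.startswith("cpu") and line[3].isdigit():
                name, *vals = line.split()
                vals = [int(v) for v in vals]
                idle = vals[3] + vals[4]  # idle + iowait
                times[name] = (sum(vals) - idle, sum(vals))
    return times

def sample_usage(interval=1.0):
    """Per-core usage (%) measured over `interval` seconds."""
    before = read_core_times()
    time.sleep(interval)
    after = read_core_times()
    usage = {}
    for core in after:
        busy = after[core][0] - before[core][0]
        total = after[core][1] - before[core][1]
        usage[core] = 100.0 * busy / total if total else 0.0
    return usage
```

Calling `sample_usage()` in a loop and writing the dict out once a second would give the prolonged log suggested above.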

The games that AMD has had some influence in will be decently threaded.

EDIT : The comments on that thread mimic my words :p
 
While I agree the FX83 has more raw power than an i5 (although over the 4670K, it's not much anymore), how else are you meant to judge it? It's the performance in the software out there that counts, really; it's why the FX8150 sucked, because it entered into a world it wasn't fit for.

It needs to be judged on hardware and software, not just software. If you're only interested in new games for instance, the performance of older games using only 2 cores is completely irrelevant.
 
You have to be careful in the way you judge the hardware if the software isn't there to take advantage of it.

"Yeah, my CPU's really good, but I can only use 3/4's of it if I'm lucky"

And the software on the gaming side has only very recently started to become more threaded, really.

Yet the 8-cores have been available for over 2 years.

But that's somewhat in the past now, as some of the newer, more threaded engines are about (though there's still a decent amount of games that need that core-for-core performance, even on a developer's latest engine).
 
I agree Mart,

I certainly won't be changing my 4770K anytime soon. Before I got my 290X I tried out a 280X in my system, the same GPU my mate has now with the FX8320, which caused a bottleneck in Arma 3.

I know it's not one of the best optimized games around, but it's a relatively new game and I must say, the difference in performance from my 4770K with the 280X to his with the 8320 was quite a lot.

Enough to reassure me that I made the right choice in getting the 4770K over the 8320, especially seeing as Arma is the main game I play.

On BF4 the FX8320 did brilliantly even at stock and his rig averages over 60fps on max, and he's very happy; once overclocked it'll be even better hopefully.
 
You have to be careful in the way you judge the hardware if the software isn't there to take advantage of it.

"Yeah, my CPU's really good, but I can only use 3/4's of it if I'm lucky"

And the software on the gaming side has only very recently started to become more threaded, really.

Yet the 8-cores have been available for over 2 years.

But that's somewhat in the past now, as some of the newer, more threaded engines are about (though there's still a decent amount of games that need that core-for-core performance, even on a developer's latest engine).

I know these things, you seem to assume that I know nothing :p

My point is that people judge the CPU based on 1-2 core performance especially, when it is becoming less and less relevant. The CPU obviously doesn't get better or worse, the software just changes to make better use of its potential.
 
I think most judge processor performance on fps and CPU clock frequency, followed by cost, followed by power consumption.

If I could hit 6.0GHz with an i5 at a 300W TDP then I'd go for it.
 
I agree Mart,

I certainly won't be changing my 4770K anytime soon. Before I got my 290X I tried out a 280X in my system, the same GPU my mate has now with the FX8320, which caused a bottleneck in Arma 3.

I know it's not one of the best optimized games around, but it's a relatively new game and I must say, the difference in performance from my 4770K with the 280X to his with the 8320 was quite a lot.

Enough to reassure me that I made the right choice in getting the 4770K over the 8320, especially seeing as Arma is the main game I play.

What speed was it clocked at?
 
There was definitely a bottleneck when we tested.

We spawned in the same location on Altis using the editor, with the same settings used in both systems and both systems at stock.

Intel 4770K + 280X: around 75-80fps, GPU usage 95-100%

AMD FX8320 + 280X: around 40-50fps, GPU usage 40-60%

Overclocking the AMD system from 3.5GHz to 4.4GHz got the fps up to around 65.

I'm not saying the 8320 isn't good, guys, far from it; when overclocked its performance is awesome for the price.

I was just a little shocked that there was a bottleneck using a 280X; I wasn't expecting it to come close to the 4770K and wasn't comparing the two.
 
It depends what scene they've benchmarked, to be honest, which is somewhat of a problem with reviews in general. I remember a GPU review where the 680 was above the 7970, but in another scene with more GPU load the 680 wouldn't boost as much and fell behind. That is, if the completely opposing fps figures given by the review's benchmark and the user's own benchmark aren't enough of an indication.

I was just a little shocked that there was a bottleneck using a 280X; I wasn't expecting it to come close to the 4770K and wasn't comparing the two.

I'm not; it's a good rig you've bought your friend, at least if Mantle kicks off, and in new games it'll be a fine rig, but I'd have gone with an i5 rig like mine :p
 
I'm not; it's a good rig you've bought your friend, at least if Mantle kicks off, and in new games it'll be a fine rig, but I'd have gone with an i5 rig like mine :p

Lol, yeah I think it's a very decent rig; it should have no problems playing most titles at 1080p.

Hopefully we'll start getting a few more games that are better optimized with the new consoles being similar.
 
I wish AMD would cut the price of the 8350 and simply drop the 8320, as it needs to be at 4GHz to be competitive in more demanding games. If the 8350 were £120 and the 9370 £150 it'd be a lot better.
 
The profit they're making on the FX8320 isn't going to be great; the FX8350 at £150 is needed for AMD as it'll be their mark-up maker (along with the 9XXX for those who want "cherry picked" chips). I quite like AMD's 83XX pricing; the 9XXX could perhaps take a few drops.

That said, I get where you're coming from: people who would buy an FX8320 now get an FX8350, and those who would buy an FX8350 now get an FX9, which could make more people get the FX9 I guess.

Why do they need to have a 4GHz part at the FX8320's price? All Intel have is a flipping i3 up until about £140.
 
Well, I suppose a lot of people dislike buying the cheapest option (like wine in a restaurant) and will get the 8350 as a result. But the 8320's 3.5GHz stock clock is just too low to do the processor justice.

I can't imagine the 9370 sells very well, and the 9590 barely at all. But they have good default clocks, and would be ideal if cheaper.
 