Going from FX8350 to 4770K

It's a shame AMD didn't properly think Piledriver through, as with 8 cores it should be a monster. If only the X79 6-core chips were cheaper.
 
I went from an FX8350 to an i7 4820K. Believe me, the difference is HUGE.

How huge? Did you do benchmarks before and after?

One of the best reviews of CPU bottlenecks I've seen is this one. Unfortunately it's a bit old now and covers only five not-particularly-new games plus 3DMark.

At low/medium settings the Intel pulls away. At high/very high settings the i7 and the 8350 are identical in some games but not in others. So unfortunately it's a bit of a gamble.
 
It's a shame AMD didn't properly think Piledriver through, as with 8 cores it should be a monster. If only the X79 6-core chips were cheaper.

It has been out for over a year now with no news of its progression path coming any time soon. Maybe the odd title will help it out if Mantle supports it, but tbh most of the sockets are dead ends if you were looking at upgrade options.

For a low-cost machine it still isn't too bad. I was converting some movies to another format this week and it did a great job. I remember having to leave a job running overnight on older computers just to rip a DVD.
 
So are the gains worth the extra money? I'll probably get around £200 if I sell the motherboard and chip, so I'm going to be looking at another £240 to get a 4770K and a decent motherboard.
 
Without knowing what you play I can't tell you, but I own a 290X, an 8320/50 and a 4770K, so I can tell you how heavy the bottleneck is with one card.

For example, when I played Ghosts on the 8320 with the 290X, both overclocked and the 290X pushed to its limit at 1281/1551, the load in GPU-Z never went past 80% and averaged in the low 70s during most multiplayer maps.

My 4770K never lets GPU usage drop below 98/99%.
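
If you want to gauge this on your own system, here's a minimal sketch that summarises the GPU load column from a GPU-Z sensor log. The file name and the "GPU Load [%]" column header are assumptions based on GPU-Z's logging feature; check your own log's header, as it can vary between versions.

```python
# Minimal sketch: summarise GPU load from a GPU-Z sensor log to spot a
# CPU bottleneck. Assumes "Log to file" is enabled in GPU-Z and that the
# log has a "GPU Load [%]" column -- check your own log's header, since
# the exact column name can vary between GPU-Z versions.
import csv

def summarise_gpu_load(path, column="GPU Load [%]"):
    loads = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            # GPU-Z pads its headers with spaces, so match on stripped names
            value = next((v or "" for k, v in row.items()
                          if k and k.strip() == column), "").strip()
            if value:
                loads.append(float(value))
    if not loads:
        raise ValueError("no samples found - check the column name")
    avg = sum(loads) / len(loads)
    print(f"{len(loads)} samples, avg {avg:.1f}%, "
          f"min {min(loads):.0f}%, max {max(loads):.0f}%")
    # Rule of thumb from this thread: an average well below ~95% while
    # gaming suggests the CPU (or something else) is holding the GPU back.

summarise_gpu_load("GPU-Z Sensor Log.txt")  # GPU-Z's default log file name
```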

I'd imagine if you play Ghosts you would see a massive performance boost, especially considering you have two cards, though some could argue that you can play Ghosts perfectly fine at 1080p with an 8320.

What games do you play?

In BF4 you will probably see far less improvement, as it can utilize the 8320 better, but with programs that don't utilize it very well, like Valley, you will see huge boosts.

With the same GPU clock as above and the FX at 4.8, I was scoring ~2600 on a single GPU with the forum's predefined settings. The 4770K at just 4.4 with the same GPU clock lets me hit ~3100.
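
For perspective, that score difference works out to roughly a 19% uplift. A quick check using the figures above:

```python
# Quick check of the Valley uplift quoted above: ~2600 on the FX at 4.8GHz
# vs ~3100 on the 4770K at 4.4GHz, with the same GPU clock.
fx_score, i7_score = 2600, 3100
uplift = (i7_score - fx_score) / fx_score
print(f"~{uplift:.0%} higher score on the 4770K")  # ~19%
```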

As FX 8-core utilization goes, BF4 and Valley are pretty much at opposite ends of the spectrum. Depending on what you play, one may be more relevant than the other, or neither may be relevant at all.
 
You have some great real-world feedback here, with A7F, Panos and Alex_123_fra all having owned both systems.

All that being said, only the individual can put hand on heart and justify the price difference for the performance they want to see.

If you game at 1080p or have something else holding the system back, throwing top-end hardware at it is a waste of money no matter which way you look at it.
 
I think watching the Tek Syndicate video from 25/01/13 will debunk a lot of the inaccuracies, particularly the power consumption claims (9m 25s) and the gaming performance compared to an i5 in general.

$21 over three years (~£12) is hardly a "huge" expense considering people are buying Thief keys for £15... ;)
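
For what it's worth, here's a rough sketch of how a figure of that order is arrived at. The wattage difference, gaming hours and electricity rate below are illustrative assumptions, not the video's actual inputs:

```python
# Rough sketch of where a "dollars over three years" power-cost figure
# comes from. All inputs below are illustrative assumptions.
extra_watts = 65       # assumed extra draw under load, FX vs i5 (W)
hours_per_day = 2      # assumed gaming hours per day
rate_per_kwh = 0.15    # assumed electricity price ($/kWh)
years = 3

kwh = extra_watts / 1000 * hours_per_day * 365 * years
print(f"~${kwh * rate_per_kwh:.0f} over {years} years")  # ~$21 with these inputs
```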

They mention the compiler bias at 3m 50s, for those taking note.
 
The 8350 is good value for money. Yes, the 4770K is faster and better in a multi-GPU setup, but not everyone can afford to go the Intel route. Yes, there will be a bottleneck on the AMD CPU with high-end multi-GPU setups, but 99% of users won't notice it. It's not a game breaker. Heck, in certain games my old i7 930 does better with multi-GPUs than the spec in my sig. Be happy with the kit you have. :)
 
I would also consider selling one of the 290Xs and getting a quality display (unsure of what you have) or something to finish the system off. The other option is to hold on to the money and start saving a stash for the latest CPU/mobo/RAM bundle in 2015.

You can still get great gaming done on one GPU with your system.
 
Everything today is pointing to CPUs being made pretty much redundant for gaming. Whilst Mantle hasn't exactly got off to the best of starts (games need to be coded for it from the start, not have it added as a sloppy patch at the end), the one thing it has managed to do is rattle Microsoft, who are now talking about DX12 and touting it as an API akin to Mantle.

And this is big news for gaming, because once again it will leave the API attacking your GPU(s), not your CPU.

IMO switching from an 8350 to a 4770K is pretty much silly. I run my 8320 at 4.7 or 4.9GHz depending on what mood I'm in and I see absolutely no issues at all. In fact, just recently I put it up against a 2GHz hex-core Xeon and was amazed to see my 8320 beat the Xeon in absolutely everything.

Not that it matters of course, because the Xeon can more than handle itself, but yeah, you're chasing astronomical figures here that won't make a blind bit of difference to your overall gaming experience. I run a 7990 and my gaming experience has been bloody marvellous.

Anyway, your money, your choice really.
 
How huge? Did you do benchmarks before and after?

One of the best reviews of CPU bottlenecks I've seen is this one. Unfortunately it's a bit old now and covers only five not-particularly-new games plus 3DMark.

At low/medium settings the Intel pulls away. At high/very high settings the i7 and the 8350 are identical in some games but not in others. So unfortunately it's a bit of a gamble.


In any game that doesn't properly support multicore CPUs you will see a benefit from it.
As I wrote, only a handful of games do, and most of them came out over the last six months (BF4, X Rebirth etc.).

Take Crysis (any of them), Total War (any, including Rome 2), World of Tanks, War Thunder, any MMO, any RPG (Elder Scrolls, Fallout etc.): basically most games out there.
Plug the same GPU into both systems and run them, and you will see that the i7 (and in many cases i5) CPUs are superior by a big margin.

And I will give you three good examples (all at 1080p with W8.1).

Crysis (1, 2, Warhead): an FX8350 @ 5GHz with two 7950s @ 1100 produces the same fps as an [email protected] with a single 7970 @ 1100 (this was the benchmark that prompted me to change from the FX).
And a 4820K at 5GHz (like mine) with a single GTX780 @ 1293 provides even better results (mid-80s all day long on max settings).

Take WoT (World of Tanks), which supports a single GPU and a single CPU core. On the FX8350 @ 5GHz with a GTX780 @ 1293, the fps hovers around 50 with everything maxed out, and mid-70s in sniper mode.
On an i7 4820K at stock speeds with the same GTX780, fps with everything maxed out is around 80, and 110-120 in sniper mode.
At 5GHz the fps is surprisingly the same as at stock speeds.

Total War: Rome 2. Not much to say here; I played it before patch 5 (which added CF support) and after, on both the FX and the i7.
It's a completely different game. This is where I started getting worried about the FX's performance, when my laptop's Haswell i5 was processing the turns much, much faster than the FX at home. And when I switched to the i7 the difference was obvious.


While "benchmarks programs" aren't good indication about what I do 99.99% of the time. Which is playing games, look my results on uniengine valley, or the cinebench, in this forum where I pitted both CPUs.


And I write all this because I had an FX8350 @ 5GHz with two 7950s and saw that big difference when I switched to the 4820K and a single 780.
Everyone who has made the switch from FX to i7 will say the same.

And I am not an Intel fanboy. I have only had two Intel CPUs since 1992, when I bought my first PC: a Q6600 and the i7 4820K. Meanwhile I have had 13 different AMD CPUs.
 
Cheers for that, Panos; it should be very useful to the OP.

Still, if I had a system that did 50fps and could spend £200 to get that to 80fps (your numbers from WoT), plus the labour and faff of having to reinstall Windows, I wouldn't bother.
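
To put a number on that trade-off, here's a rough cost-per-frame check using the figures quoted in this thread (illustrative arithmetic only):

```python
# Rough cost-per-frame check using the numbers in this thread: ~£200 spend
# to go from ~50fps to ~80fps in WoT. Illustrative arithmetic only.
cost = 200                      # £, quoted upgrade spend
fps_before, fps_after = 50, 80  # WoT figures quoted above
gain = fps_after - fps_before
print(f"~£{cost / gain:.1f} per extra fps ({gain / fps_before:.0%} faster)")
# ~£6.7 per extra fps (60% faster)
```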
 
Panos has valid points, but you need to be careful about painting one component in a particular light. If you used BF4, Crysis 3, Far Cry 3 or Metro, for example, the FX holds its own and even makes the i7 look expensive for little gain.

It really does depend on what you will be playing, past and present. I also agree that artificial benchmarks are not a good indicator at all, but a distinct point needs to be made that a £100 CPU should not really be compared to a £300+ CPU.

If you can afford two 290Xs then you are a high-end gamer. The FX is for people who game at the other end of the scale and can afford a discrete GPU. As Andy mentioned earlier, the GPU will play a bigger role going forward; if one system is doing 65fps and the other 79fps, only a tiny fraction of people are going to notice.
 
The problem these days is that AMD CPUs are totally and utterly misunderstood. They really, really are. People harp on about how much better the Intels are, but it's all nonsense.

I put this to the test recently with a 32nm Westmere-EP hex-core CPU. In every test I ran (three; there was no point continuing tbh) the AMD battered it. Now granted, the Intel has a bit of a derped frequency (2GHz), but from what you read on the net a £100 AMD chip should never, ever touch any hex-core Intel no matter what the speed.

Fact: many of the people reviewing the FX CPUs did not even know how they work, nor how to overclock them properly. Nor would they know about the Asus bug where, if you set the memory voltage manually, you are doomed, etc. etc.

In reality these chips are simply great. So great that there's no tangible real-world difference that you could see.

I mean, really, the test data wasn't even funny: 393 in Cinebench from the Intel, 803 from the AMD; 6,100 in Firestrike Physics versus 10k from the AMD.

Yeah, I know that a Haswell is a big chunk better than the AMD, but never, ever in a noticeable way. You would need to enable an FPS counter, and as soon as you do that you start down a path that will only cost you loads of money.

These figures and data have no real-world effect; they're just pointless numbers. The only number you ever want to watch is the minimum FPS. And I guarantee you, if you don't enable the FPS counters and simply play the game looking for stutters, you won't see any.
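
If you do want to look at minimum FPS properly, here's a minimal sketch that derives it from a frame-time log. The file name and the one-millisecond-value-per-line format are assumptions; adjust the parsing to whatever your capture tool actually writes.

```python
# Minimal sketch: derive average and minimum FPS from a frame-time log.
# Assumes a plain text file with one frame time in milliseconds per line
# (the sort of per-frame log tools like FRAPS can produce); adjust the
# parsing to whatever your capture tool actually writes.
def fps_stats(path):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    avg_fps = 1000 * len(frame_ms) / sum(frame_ms)
    min_fps = 1000 / max(frame_ms)   # slowest single frame
    # A low min_fps with a healthy average is exactly the stutter an
    # on-screen average-FPS counter hides.
    print(f"avg: {avg_fps:.1f} fps, worst frame: {min_fps:.1f} fps")

fps_stats("frametimes.txt")  # hypothetical log file name
```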

I dunno man...
 
But if you played a turn-based game such as Civ or Football Manager, I'm pretty sure you would notice the difference in turn times between a Haswell and an FX CPU, so saying they are pointless numbers isn't really true.
 
I put this to the test recently with a 32nm Westmere-EP hex-core CPU. In every test I ran (three; there was no point continuing tbh) the AMD battered it. Now granted, the Intel has a bit of a derped frequency (2GHz), but from what you read on the net a £100 AMD chip should never, ever touch any hex-core Intel no matter what the speed.

You're being silly; of course if you underclock an Intel hex far enough its performance will drop off a cliff.

The CPU you have is 2-3 generations behind, nearly 5 years old and was never even released to retail; it is clocked so slowly that even the i7 920 of the time would have outperformed it.

Why did you even waste your money on such a poor CPU? Just because it's an Intel hex? When people say that AMD CPUs can't touch Intel hexes, they mean the ones released into the retail channel that weren't severely gimped and can overclock decently, not 5-year-old engineering samples clocked at half speed.
 
You're being silly; of course if you underclock an Intel hex far enough its performance will drop off a cliff.

The CPU you have is 2-3 generations behind, nearly 5 years old and was never even released to retail; it is clocked so slowly that even the i7 920 of the time would have outperformed it.

Why did you even waste your money on such a poor CPU? Just because it's an Intel hex? When people say that AMD CPUs can't touch Intel hexes, they mean the ones released into the retail channel that weren't severely gimped and can overclock decently, not 5-year-old engineering samples clocked at half speed.

My general point was that according to the computing world the AMDs are completely useless, i.e. they can't even perform like an i7 920, for example.

I'm fully aware that the CPU I have is three generations behind. It's also not a poor CPU. It uses the 32nm Westmere-EP process, which is better than the one used in the Core i7 920. With six cores it's able to put up a very good show in apps that use the threads. I paid £48 for it. Could you do better CPU-wise for £48? Because I tell you now, it'll demolish any £48 CPU.

I bought it because it's a 60W CPU. 60W, hex-core. It absolutely loves Photoshop and rendering. Sure, it's nowhere near as quick as the AMD, but it's using a quarter of the power, which translates into a passive cooler that makes no noise.
 
At the end of the day I want to get the full benefit of my two GPUs. I play all types of games, not just Battlefield 4, so if the FX8350 is holding them back then I think I need to move to Intel.
 
Yes. As you can clearly afford a new i7, there is no debate. There are plenty of threads on here; use the search.
 
Do you have a monitor that can push the cards? If you are locked to 60Hz 1080p then your monitor will be the main bottleneck and will need replacing first (or at the same time :))
 