The 8-core showdown and analysis thread.

OK so I have been smashing the data. This thread can serve multiple purposes really.

* To see if an 8-core CPU is a viable proposition for a gaming PC.

* To look at core use in games over the past year and see whether things have changed.

* To give a purpose to the forthcoming 8-core Haswell-E CPU in a desktop machine.

* To see if the rumours that Xeons are crap for a gaming rig are true.

I'm going to compare an 8-core* AMD FX 8320 CPU clocked to 4.9GHz (which cost £110) with an 8-core, 16-thread Intel Xeon (socket 2011) that I also paid £110 for. Then I am going to analyse which games actually make use of all of those threads and how well they load up the CPU.

NOTES.

First up, I'm fully aware the Xeon only boosts to 2GHz under load. I cannot overclock it, not even in tiny increments via the FSB, because even 101MHz makes the PC stick in a boot loop. So before the Intel boys dive in with accusations of comparing totally different clock speeds: there's nothing I can do about it. It's not my fault Intel decide to lock their CPUs at given speeds and then set a price structure for speed.

It's not always about speed and figures. At the end of the day a CPU can be perfectly suitable for a task, even if it does not appear as good as another one. You'd actually be amazed just how little CPU power you need most of the time.

Heat and power are not a part of this analysis, simply because I don't care, nor do I want to become embroiled in a stupid argument. This thread is strictly 8 cores only. I don't care about, nor want to know, your results with your overclocked 4770K. Remember - 8 cores. I no longer care about clock speeds and IPC. I want to see more cores, being used, at lower prices. What I'd ideally like to see is a 6/8-core CPU from Intel that simply drops into socket 1150 without the need to buy ridiculous motherboards or RAM.

Hey, a guy can dream, right?

OK. So let's get it on then...

Here are the specs to concentrate on. The AMD rig is as follows.

AMD FX 8320 @ 4.9GHz
Asus Crosshair V Formula-Z
8GB Mushkin Blackline running at 1533MHz (offsets with the FSB)
Corsair RM750 PSU
Corsair H100
AMD Radeon 7990 GHz
OCZ RevoDrive 120GB running RAID 0
Windows 8 Professional x64 (note: not 8.1!)

Then onto the Intel rig. Note, this was a rebuild, so components stayed identical barring the board and CPU.

Intel Xeon V2 (Ivy Bridge), 8 cores, 16 threads, 2GHz
Gigabyte X79-UD3 motherboard
8GB Mushkin Blackline running at 1600MHz XMP
Corsair RM750 PSU
Corsair H100
AMD Radeon 7990 GHz
OCZ RevoDrive 120GB running RAID 0
Windows 8 Professional x64 (note: not 8.1!)

CPU validations.

AMD



Intel



I started with some benchmarks. First up was 3DMark 11.

AMD result



And the Intel



And already strange things happen. The Intel scored the higher physics score (which pertains to the CPU), yet even though the Intel also runs PCIe 3.0 (Ivy Bridge), it loses out overall. Very, very strange.

Then it was on to 3DMark (2013). AMD up first.



And then it was the Intel's turn.



TBH that's bloody awfully close. It's actually within the margin of error, but I promised myself before I began that I would not obsess over one benchmark and become sidetracked running it over and over again.

OK, so round three, Asus Realbench 2.0.

Interlude.

Asus Realbench is *the* most accurate benchmark I have ever run in my entire life. Instead of making their own synthetic, unrealistic benchmark they simply took a bunch of real programs and mashed them together. This way the results are actual real-world results. As an example, test one is GIMP image editing. Then it uses Handbrake and other applications to gain a good idea of what a system is capable of.

This is also the toughest benchmark I have ever run. I can run Firestrike all day long, but Realbench absolutely tortures a rig to breaking point.

I ended up having to remove the side of the AMD rig and aim a floor standing fan at it to get it through.

So here is the AMD result.



And the Intel result.



Wow. Now this one truly knocked me sideways. I never expected the AMD rig to win on IPC alone (GIMP). Even an i7 920 runs the AMD close in GIMP, but the AMD absolutely trumped the Intel all the way through.

And this, lads and ladies, is why Asus make very high-end boards for these chips. Simply because, as Bindi (an Asus employee) points out, the AMDs are actually very good CPUs.

Then it was on to Cinebench, and another surprise..

AMD



And then Intel.



The surprise? Not that AMD won. I was actually very impressed with the Intel's performance, given it is clearly running at less than half of the speed it's actually capable of. I'd hazard a guess that this CPU could actually double that speed if unlocked and overclocked, which does make me a teeny bit excited about Haswell-E.

OK so no set of benchmarks would be complete without at least one game. I decided to choose Metro : Last Light. You'll see why later when I get onto the part about core use, but here is the AMD's result.



And the Intel.



And it was finally a victory for the Intel. Not by much, but Metro clearly loves cores and wants as many as you can throw at it.

Due to this result I decided to keep the Intel. There are other reasons of course, but this played a part.



Less than 14 watts at idle, and I had real trouble making it use more than 90W under load. Temps are always under 40°C no matter what, which means the rig is now very quiet.
 
OK, so it's time for part two. The comparisons are now over; now it's time to crack on and see what those cores offer us in gaming.

With this part of the analysis I aim to debunk the myths. So, right now I am a mythbuster. Those myths include, but are not limited to -

"Pah. You don't need any more than four cores for gaming. Only a tiny handful of games use that many"

"No games support 8 cores, let alone more"

"Xeons are crap for gaming because they're somehow different inside"

OK, so those are the usual off-the-cuff comments doing the rounds. Here's my take. Basically I would take a CPU that's clocked to 2GHz with 8 cores and 16 threads over a quad-core CPU clocked to within an inch of its life. Games and applications that use the cores tend to spread the load around, meaning you get the same sort of performance without using masses of power or generating tons of heat. This is exactly why I decided to put a Westmere hex into my 670 SLI rig: in all of the latest games (and I mean all of them) core count produced very similar, or better, results than running a CPU to within an inch of its life, its thermal limits and its voltage tolerance.
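To make the reasoning explicit, here is a naive back-of-envelope sketch of that "aggregate throughput" idea. It assumes perfect scaling across cores and identical per-core performance, which is never true in practice (IPC, Hyper-Threading and scaling losses all matter), and the clock figures are just example numbers, not benchmark results from this thread.

```python
# Naive "aggregate throughput" sketch: cores x clock, assuming a perfectly
# threaded workload and identical per-core performance. Real results depend
# on IPC, Hyper-Threading and scaling losses, so treat this as illustration only.
def aggregate_ghz(cores: int, clock_ghz: float, scaling: float = 1.0) -> float:
    """Rough throughput in 'GHz-cores' for a workload that scales across all cores."""
    return cores * clock_ghz * scaling

eight_core_low_clock = aggregate_ghz(cores=8, clock_ghz=2.0)   # 16.0 "GHz-cores"
quad_core_high_clock = aggregate_ghz(cores=4, clock_ghz=4.5)   # 18.0 "GHz-cores"
print(f"8C @ 2.0GHz: {eight_core_low_clock} | 4C @ 4.5GHz: {quad_core_high_clock}")
```

On paper the two land in the same ballpark, which is the point: a well-threaded game doesn't need any single core running flat out.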

This is also why I chose to keep the 8/16 Intel in the rig, rather than putting back the more brutal AMD. I don't usually care about power use, TBH. My bills are more than affordable, even with the stuff I have. However, eradicating as much noise as possible is always a bonus.

Now I know I'm probably completely alone in feeling this way, but tbh? it's what we should have been demanding for years. More cores, better threading, lower power consumption to get to the same place, etc etc.

Sadly, up until this exact moment Intel have not offered us a massively threaded CPU *with* the ability to overclock it. You could count socket 1366, but TBH this, IMO, was an oversight and a mistake by Intel. Had it not been so, they would have kept the strap and BCLK separate from the PCIe and SATA clocks on socket 2011 as well. Nope, Intel wanted to make sure you were stuck on their K-series quad-core chips. So it's always been "You can have the cores, but not the overclocks... or the overclocks without the cores".

We are still going to be left to one side of course. Intel are already making and selling 12 core 24 thread CPUs but expect to see those in a year or two, unlocked and rebadged as "Extreme" edition CPUs.

Right, so with all of that said let's see what these cores can actually do, shall we?

Firstly I will explain how I performed this research. In Windows 8 (don't even think about using Windows 7 - it does not correctly address any more than 4 cores; anything more is a bodge and an afterthought and does not work properly) there is a very handy little view in Task Manager that can be split up to show you how your cores are being utilised.

Here is how it looks once you split it to the full number of threads; this is an 8-core/16-thread system sitting idle.



OK. Now note that each core and thread (so physical and logical) has its own box. The bottom of that box is 0%, the top is 100%. As the graphs fill up they display core use.

Core use is recorded for 60 seconds, so basically the method I used was to load up a game, wait for it to get to an intense part of the action (when the rig makes the most noise, basically) and then press Alt+Tab to return to the graph, at which point I simply take a screenshot.

Note though - actual utilisation will not be perfectly accurate, because by then I have exited the game. The only way to monitor it accurately is to run a second screen and record it in realtime. Not something I will bother with (yet).
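For anyone who does fancy the realtime route, a minimal sketch of what that logging could look like is below. It assumes the third-party psutil package (pip install psutil); the sample interval, duration and output filename are arbitrary choices for illustration, not anything used for the screenshots in this thread.

```python
# Rough sketch of the realtime approach mentioned above: sample per-logical-CPU
# utilisation while a game runs and dump it to CSV, instead of Alt+Tabbing out
# for a screenshot afterwards.
import csv
import time
import psutil

def log_per_core_usage(outfile: str, duration_s: int = 60, interval_s: float = 1.0) -> None:
    """Write one row per sample with the utilisation of every logical CPU."""
    n_cpus = psutil.cpu_count(logical=True)
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s"] + [f"cpu{i}" for i in range(n_cpus)])
        start = time.time()
        while time.time() - start < duration_s:
            # cpu_percent blocks for interval_s and returns one value per logical CPU
            usage = psutil.cpu_percent(interval=interval_s, percpu=True)
            writer.writerow([round(time.time() - start, 1)] + usage)

if __name__ == "__main__":
    log_per_core_usage("core_usage.csv", duration_s=60)
```

A log like this would also let you put a number on "even loading" (for example, the spread across cores at each sample) rather than eyeballing the spikes in Task Manager.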

So the first game put to the test was Batman Arkham City.



And as we can see, there is plenty of activity across all 16 threads. What we are looking for, though, is for the spikes to look the same; this indicates even loading over all threads. Very few of these games do this, but there is still plenty of activity on each thread, indicating high usage.

Then it was onto Battlefield 3.



No surprise. I've tested BF3 on the AMD and it wanted at least six cores, with residual load on anything beyond that.

Battlefield 4



OK, so we know BF4 is an 8 core game. However, BF4 tends to lean a lot less hard on the CPU. It seems DICE have been doing some work to make sure the GPU gets the harder job. Good job, DICE !

The first genuine surprise of the day, COD : Ghosts.



OK, looks like it could well be a modern console port then. Then it was on to Crysis 2. Again, I already knew what was going to happen here...



Only really wants four cores, spends most of its time leaning on two. Not so good then.

Time for Crysis 3



Much, much better. However, I am also aware that different levels in Crysis 3 change the dynamics. Certain levels want CPU cores; certain levels leave them to one side and call on the GPU. This is why the AMD vs Intel results in Crysis 3 are all over the place. I still say you're better off with the cores, TBH.

Time for Far Cry 3.



And we can see that it's kind of lame. Far Cry 3's engine shares ancestry with CryEngine, but it's clearly nowhere near as demanding, or complex, as Crysis 3. I admit the results were taken right at the very beginning of the game and thus are likely to change as the game goes on.

Time for Hitman : Absolution.



Hitman is one of the better games for demonstrating that it can thread very well. However, this is the built-in benchmark after all. I wish more games contained benchmarks, TBH. OK, so it was time to try something older.

RAGE by id Software, running on the id Tech engine.



Wow. Now this game is what, three, four years old? Yet it still likes to thread itself well. God bless John Carmack; the guy clearly wants what I want.

Now let's see what Metro : Last Light is doing.



And again we see nice, even core loads. It's pretty apparent that Metro loves having cores at its disposal. It allowed a CPU that won practically nothing else to beat another CPU simply by threading itself properly. Good show!

Time for some Tomb Raider benching.



Again, it wants the cores. However, it never truly loads them beyond 50%. So in this instance the Intel again managed to produce better results than the AMD, simply because the core loads are low, and the more cores you have the more the load is spread.

And finally it was time for Wolfenstein : The New Order.



And once again we see a pretty even distribution over all 16 threads. This is because the game is running on id Tech, which Bethesda and others have been using for a while. This means more support for the future.

Conclusion.

Phew. I'm absolutely bloody knackered now. But we can clearly see that the "four cores is enough" argument belongs where it should be left: in 2010. Things have changed, consoles have changed, the ports over to PC have changed.

This may well pave the way perfectly for Intel's 8-core chip. TBH? Like many times in the past, they've left AMD to do all of the hard work on something that they saw as a waste of time. Now though? They have no choice really. Games and apps are becoming more highly threaded by the day, and users will demand processors that can make full use of this.

I could well have sat here and benchmarked even more games that use the cores. There are quite a few that I am aware of, but sadly I only have so much time in the day.

Hope you enjoyed reading :) It may well be a bit of an eye-opener :)
 
Your Xeon is only 1.6GHz on 8 cores? Firstly you said 2.3GHz, then you said 2GHz. That's not a bash, but it rapidly lowers its raw performance output; at 1.6GHz it should be slower than a stock 3770K all in.
Also, you're still misusing IPC :p

1.6GHz with no turbo. Intel do at least give you 4 bins over 8 cores (four 100MHz steps on top of the 1.6GHz base), so it's easy to lock it to 2GHz.

2.3ghz is one core only IIRC, though it may have been two....

Either way, blame Intel for the confusion.. Don't worry though, AMD pull the same dirty tricks to make their CPUs sound like they're clocked higher than they really are.



^ there you go.
 
I wish you'd used MSI Afterburner for Core Usage statistics (So we can see core usage at the same time in figures).

But you've obviously gone to some effort.

Yeah it's not realtime. It's more of a "Here's what happened over the past sixty seconds".
 
Intel at 2GHz vs AMD at 4.9GHz - all benches irrelevant.

Shows you didn't bother to read properly, or simply don't understand.

One CPU vs the other. One can be overclocked, one can't - that's not my fault.

Yeah, probably a pointless benchmark but hey, at least it shows the performance of two CPUs.

That's why I separated the core use statistics and kept that as its own entity, because that was the most important part.


Nice review, though I'm too lazy to read it all and I feel bad contradicting anything you say (especially when I don't have any tests to prove it). However, you suggest more cores are king, but 1 core at 8GHz would be faster than 4x 2GHz, all other things being equal (unless I'm missing something).

I'd be interested if anyone could prove me wrong or link some benchmarks where it's tested, so we can see the overhead caused by multi-core (if any at all).

It's actually very, very easy to disable cores on the Xeon. So, theoretically, I could run a few benchmarks and cut cores as I go...

Watch this space ;)
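In the meantime, a cheap way to approximate "cutting cores as I go" without trips to the BIOS would be to restrict the benchmark process's CPU affinity. A rough sketch, again assuming psutil is installed and with a made-up executable name; note this only confines the benchmark to those cores rather than actually switching the others off:

```python
# Rough sketch: run a benchmark restricted to the first N logical CPUs via
# CPU affinity. This only confines the benchmark process - the OS and other
# programs can still use the remaining cores, so it is not identical to
# disabling cores in the BIOS/UEFI.
import subprocess
import psutil

def run_on_n_cpus(cmd: list[str], n_cpus: int) -> int:
    """Launch cmd pinned to logical CPUs 0..n_cpus-1 and wait for it to finish."""
    proc = subprocess.Popen(cmd)
    psutil.Process(proc.pid).cpu_affinity(list(range(n_cpus)))
    return proc.wait()

# Hypothetical usage: repeat the same run with 16, 8 and 4 logical CPUs available.
for n in (16, 8, 4):
    run_on_n_cpus(["cinebench.exe"], n)   # executable name is just a placeholder
```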

I don't get it. Why am I looking at a comparison between one CPU at 2GHz and another at nearly 5GHz? If you can't compare like for like then I don't see how this is of any use.

They are being compared like for like. One comes unlocked, the other doesn't. Both are real products made by two different companies.

I could compare a Porsche to a Mini if I wanted. Possibly pointless, but still very relevant. If you don't understand it then just skip part 1. Part 2 is far more interesting.

Your Xeon is only 1.6GHz on 8 cores? Firstly you said 2.3GHz, then you said 2GHz. That's not a bash, but it rapidly lowers its raw performance output; at 1.6GHz it should be slower than a stock 3770K all in.
Also, you're still misusing IPC :p

Also, you're twisting things. (Unless one's stupid,) no one thinks Xeons are inherently crap for gaming (I certainly don't; I've put a 4C/8T 1230 V2 in my brother's rig). I just would never have chosen the one you've chosen.

Re-quote. Sorry, I was multitasking and missed a couple of points raised.

1. GIMP only uses two cores, no matter how many you have. In this test it's all about IPC; that's how Asus coded it. Hence I was expecting the Ivy to win, because my 8320 can't even beat an i7 920 on IPC alone.

2. The "Xeons are crap for gaming" bit... that's for another forum, mate. One with far, far less understanding than this one. I've been using Xeons myself for gaming for bloody years - 13 years, because at one point Xeons were massively better than any desktop product (mmm, yummy P3 Xeons with 2MB of full-clock-speed cache on a Marlinspike).

OK so now I'm beginning to show my age :D
 
Why not wind down the 8-core AMD to match clock speed, or choose a like-for-like CPU?

A 2600K or 3770K could have been had for the same price.


"One CPU vs the other. One can be overclocked, one can't - that's not my fault."

Whose is it then? :p You chose CPUs that aren't balanced for a start, then wonder why people pull the benchies to bits.

First up, why not put the AMD under dry ice? Or LN2? Clock the bugger to 6GHz and beyond?

Whose fault is it? It's Intel's. Ever since Clarkdale they've gone around with their padlock, making sure that everything is rationed out to make them money.

With 1366 they forgot to derp the FSB, so we got boards that could run 6-core chips *and* overclock them. With 2011 they did a bit of both: some chips could separate the PCIe bus and strap, others were locked.

Lock lock lock - that's not my fault !!!

All I could do here is make sure both chips were doing the absolute maximum they can do. The Xeon is 2GHz, that's it. Underneath? Heh, remove the padlocks and you could have some real fun, but as it stands it's completely down to Intel.

So the point I am subtly making? AMD are bloody fantastic CPUs for the money.

In a world that makes perfect sense, a £110 retail-boxed CPU should absolutely not even be able to snap at the heels of a £700+ one. Absolutely no way. But you can buy a 5GHz FX for what, a hundred and fifty notes? And look what it can do.

Far too much for a £110 CPU, basically. According to the BS that flies around this chip should not even be able to get up from its ass, let alone haul ass and dance rings around a £700+ CPU.

And, through all of this, you can see why AMD have been absolutely loath to throw away a perfectly good CPU that performs incredibly for the money, just to play games with Intel.

Don't expect them to replace the Vishera any time soon. What they did was do what Intel does, launch a server CPU into the desktop market. Only AMD have been patiently waiting for support. Once it comes, yeah, Intel will be faster for £900 but the fact will be you simply won't need a £900 Intel unless you really need the Epeen points, or, are running an absolutely stupid GPU arrangement.

Trust me, if the general population didn't get that, AMD wouldn't be wasting their time running a production line for CPUs that apparently are so crap no one actually buys or uses them.
 
Just looking at your Cinebench 15 scores, Andy, and your 2GHz Xeon is only 20 points shy of a 4.8GHz 4670.

I see the point, even if not many others do. Frequency is only part of the equation.

Overclocking used to be more valid 15 years ago, when you could make a cheap part perform like an expensive one for a little money. Now you need to have an expensive part (for Intel anyway) to be allowed to play. Excepting the recent Pentium, which was a bit underwhelming IMO.

Yup, I'm beginning to come around to the idea of more cores, less voltage and a lower clock, but still similar performance.

I can run both of my H100s with the fans at 5v. TBH? I could probably remove them completely.

I think this is the sort of thing AMD have been waiting on. Sure, they're never, ever going to get back the single-threaded crown, but the more the merrier... I mean, these guys have 16-core server chips at their disposal.

Very nice Andy, thank you for the effort of doing all that; it must have taken a while.

Really does go to show that cores count for more than people think they do.

On the whole question of just underclocking the AMD: yes, he could, but then I would expect this thread would have been trashed by now, as certain members of the AMD clique would not be able to see their beloved AMD humbled so much without lengthy argumentative posts :D As it is, the Intel is only just behind with under half the clock speed.

I'm not underclocking anything, mate. All that will do is basically tell us what we've known for years: Intel dominates the single thread.

Intel will just continue to dominate, well, themselves.

Derping the AMD wouldn't prove anything unless I was the Golden Child and had a CPU that was unlocked, when what actually goes on sale is locked.

Hence my point about why not throw the AMD under dry ice and then run the tests again. Why? Because I can, because AMD have allowed me to.

There were glimmers of hope when Bulldozer launched.. It was good in, well, Winzip lol. But, according to legend, it shouldn't be good at anything.

AMD, barse ackwards, as per usual. Release a 64-bit CPU before there's even a proper 64-bit end-user OS. Release an 8-core CPU when absolutely nothing uses it, then spend about five years getting it supported, lol.
 
Will try and fit it in later, mate. Right now the rig is in the hands of its owner :D

Edit: just realised, matey, that's actually a part of Asus Realbench... Not sure the scores would be comparable, but it runs H.264 encoding as part of the benchmark.



There you go. Core use was a bit erratic and it would pause at times for a few seconds. Could be memory speed or something.
 
Well done Andy, I'm tempted to send you my old rig so that you can fix it for me... stupid ****** *** :D

But I guess I have to do it myself :cool:

Haha, I'm down your way soon. Gonna be in Milford on Sea for 8 days. Probably go to the market down yours :)
 
Flipping heck!!!!!!! ;) You're best going to Winchester market; Lymington market is OK, just a bit too small. New Milton is a dump though - they're the guys that messed up my rig..

I can't fix my rig yet because Overclockers won't send me my gear until they have the fans in stock; they won't split up the items... I'm useless with the software, I like to build them instead.

Didn't even know there was a PC place in NM tbh mate. Shows how long it's been since I was there !

My uncle got his first caravan at Shorefield in 1982. I went every year three times until 1999 when I went off to the USA. Have popped to Milford a few times when we were living in Dorset, but been a couple of years now..

My step daughter lives in Winchester. She's at uni with her fiancée so I'll be sure to check it out when I deliver the rig I gave them both (the 670 SLI one... Bit too big to take on the train hahaha)

So a 2GHz Intel 8 core can compete with a 4.9GHz AMD 8 core...

So the new Intel 8 cores based on Haswell-E that are unlocked will smash AMD's 8 cores' back doors in. That's good to know; guess they will be worth the price tag then, awesome.

They'll never be worth the Intel price tag, no. For the Haswell-E 8 core to justify it, it would need to perform nine times better than the 8320, and even you know that just ain't gonna happen. Ever.
 
You can't get PC components around here, only Wellington boots and blow-up sheep............. ;)

There used to be a good sport store there just down from the train station. Don't think it's there any more...

Shame G&T never sold PC parts really haha, used to love going there in Highcliffe :D
 
** FACEPALM **

To be able to keep up with an Intel CPU that is clocked ~2.5x lower you will have to triple the cost to add cooling, etc. Then, on top of that, over the years you would own the CPU (if for a server), together with the cost of being able to run this at all, it will pay for itself in 1+ years due to the electricity costs...

What on earth are you bunnying about now? Even in a server they're horribly overpriced. AMD do 16-core CPUs for £480, if that.

As a desktop, it's currently silly to have an 8-core CPU for gaming, because 99% of games are single-core/dual-core - only a few recent games use more cores.

Only that's not how it is at all. When I've got more time I will benchmark a load more games that I know full well use X threads. But you keep on believing that and being happy with your CPU.

You're one of those people who I could print data onto a brick, smash it in your face and you still wouldn't digest it.


If you wanted a rendering PC then the Intel is better, for the same reason as for servers, and it also has 16 threads...

Intel is better, Intel costs more. I thought we'd covered that about a hundred times now. But still, if it makes you feel better keep saying it to yourself.

An i5 is better than an FX-8 despite having only 4 cores, and if you want 8 threads at non-Xeon upfront costs then an i7 will pwn...

I can go on and on

really this is the most pointless topic ever...

You've got the benchmarks now. Go knock yourself out; see what an i5 can do. I absolutely assure you, you'll have difficulty beating the AMD in anything. Clock it as high as you like, rattle on about your faster thread performance, but you'll have terrible trouble trying to beat the 8320.

Please don't go on and on. I'm actually quite sick of your narrow-mindedness. Seriously, it's getting really depressing.

If the topic is pointless, then try controlling your fingers. That way if you control your bodily functions you won't look like a pointless troll.

If something doesn't interest you then simply abstain.

I look forward to your i5 benchmark results.
 
Seriously, you are totally brainwashed for AMD; it's just not funny.

LOL. That just shows how well you read this thread.

Tell me, given you paid such close attention, which CPU did I choose to keep and which one am I getting rid of?

There's only one victim of brainwash here mate, and I'm replying to him now.
 
Emm, ALXAndy has spent more on his Intel rigs than his AMD rigs, and decides to keep the 8-core Intel one, and is being called an AMD fanboi?? Okay???????!!!!! :confused:

IIRC, he also has a six-core IB-E rig too.

Yup, a 3970X that originally replaced the 4.9GHz 8320; the 8320 rig has now been given to my lady and had its CPU replaced with an 8-core Xeon.

But hey, AMD all the way. :rolleyes:

This is what happens when a troll enters a thread and tries to gain some leeway without realising he's savagely missed the point by not reading the actual thread.

Gotta love them skim readers.
 
Andy, :)

I don't always agree with your ideology, but I have to personally say the information you gathered is interesting. Thank you for posting that.

However, I like to see the clear story. I think comparing only those two CPUs in those situations only shows one side, because of the lower IPC of the FX and the significantly low speed in general of the Xeon. So, being brutally honest, it doesn't show the whole story.

Ignore the comparison. Honestly, ignore it. That was just a "Because I could" set of benchmarks and nothing I took overly seriously. If you had a look over my FX 8320 Analysis I did last year you would see that it was far, far more thorough.

I needed to see what the difference was. So hey, brought you guys along for the ride. It's really as simple as that tbh. I mean, who compares a £100 desktop CPU to a £700 server CPU? it's completely discombobulated, but hey, that would be me :D

LOL looks like I just broke the American dictionary in Chrome with that word :D

To complete this point and clear up the whole situation you should have had an i7 4770/4790K and a stock 4690/4670K in the equation. It would then show the opportunity of the more powerful IPC (the i7 with 8 powerful threads), versus fewer but powerful threads (the i5 with 4x powerful threads), compared against the well-gathered information you have already produced. But hey, I understand it's not possible and we're not made of money; I would just like to see it for a conclusion. :D

If I had an i5, if I had an i7... Sadly I don't, and if I build one more rig I'll end up single, lol.

When I play games I play games. I don't run FPS counters and I don't bother scrutinising every frame. I'm hyper alert, so if a game is unplayable I'll soon know. But Frame Watching is kinda like bird watching. It can become a bit of an obsession and end up being expensive.

I've long been a bit of a core whore. Mainly because I know how beneficial having, say, four pairs of hands would be over just having the one. The more you offload tasks that are pointless to gaming onto spare cores (like the fat, bloated OS sitting underneath, for example), the more the other cores can be saturated with what matters.

Servers have used a huge amount of cores for many years now, there's a reason for it. It's far more efficient both in energy terms and productivity.

But hey, if I do ever come across an I5 I'll be sure to thrash it to near death :D

I will definitely engage in the core reducing thing though. I can also run game benchmarks (more than just Metro) and compare them to the data I still have.

What I wanted this thread to do was show people just how many cores and threads recent games use. The whole "you only need four cores" argument is beginning to sound a bit old now. I've shown conclusively with every modern game I tested (trust me when I say I have them all...) that this four-cores-only nonsense is just that: nonsense.

Off the top of my head? I know for sure that Sleeping Dogs likes a core or 8 too.


I generally do favour Intel, I am happy to admit that, but I am more in favour of factual information and the ability to learn. So, to be conclusive, as mentioned before I'm not sure your results show the full picture of fewer-but-significantly-faster threads in the balance, and whether, where that's available, it proves to be more efficient.

But I will conclude: if you don't have the funds and are not in a situation where you can afford a premium processor, then picking one with many threads and as high a clock as you can afford is sound advice - but you would guess so.

Thanks for taking the time to do this Andy, and I hope you can honestly understand my points and take them on board.

Regards

I favour Intel. Everyone favours Intel. I've had a metric tonne of Intel CPUs over the years. Sadly, since their 'victory' (if you can call it that, but most of the children need that label to attach to) their practices have become more and more warped, and certainly not in favour of the consumer.

Absolutely ridiculously priced CPUs at stupidly low clock speeds for hundreds of pounds. Why? Because they know the children must have that label to carry around.

It seems to me that an awful, awful lot of people in society have to attach themselves to what they see as a better product. The issue is that in the world of CPUs the choices are fewer than ever. Desktop CPU? Intel or AMD. No Cyrix; VIA gave up...

I'll buy what makes financial sense. Just like when I was a kid growing up. My mother did not spoil me (well, she did a little) but we had what we needed. Big old difference between want and need.

Comparing what AMD make to similarly priced products from Intel makes it awfully confusing. You need to see through an awful lot of BS to make an informed decision. I've been running AMD for about two years now. I was annoyed that the Xeon E3 1220 I bought was locked, and the turbo was locked too. £154, and the 8320 obliterated it.

I have been absolutely loathe to spend more than £150 on a CPU for years now. There really isn't any need, not for a gaming PC.
 
I've got 2 Xeon 2.66GHz 4-core, 8-thread CPUs in an old Mac Pro doing nothing; any good then for gaming, on the premise of this post!?

Of interest to some is that I recently moved from a 980X to a 4790K and I think the 4790K is far better for gaming.

I think they're Westmere, right?

I'm running a 1.78GHz Westmere with GTX 670 SLI and it does a great job.

I'm quite amazed at how you can physically tell a difference between a 980X and a 4790K in gaming..
 
A good while ago, think back to the Pentium 4, that was when Intel gave up on the brute-force approach, and they have been drip-feeding us ever since. (Yes, I know that is a little unfair, as there have been a few good steps; Sandy Bridge was one.)

For the technology they own they have far, far too many products.

If they hadn't been so mean with Hyper-Threading (i.e. had used it on all of their chips) then we would see better support for it. But their rationing creates far too many niche sectors of the market.

I was reading a magazine the other day, and basically it said that back in the day there were two types of CPU: a good one and a crap one. I.e. Intel vs, say, Cyrix. There was also a massive difference in price.

Now though? You've got dual core, quad core, hex core, Hyper-Threading, eight cores coming, and many different types of all of the aforementioned...

It's absolutely ridiculous. Talk about making a market as confusing as possible. No wonder software support is as confused as it is.

Thanks mate, do you still have access to the FX? I'd love to see how that compares.

It's highly likely the Realbench x264 test uses a different source and settings, so they won't be comparable.

The bench runs the x264 process at above-normal priority, so that probably explains the GUI being unresponsive; it might affect the Task Manager graphs as well. It is possible the thread count needs increasing for all those cores, but that's hardcoded so not something you could try.

Sorry, it's been removed now. The rig is now in the hands of my lady, which is why I made the decision to stick with the Intel. Far less aggro to run a stock-standard CPU that keeps the rig quieter.
 
EDIT:

Put it this way: at 2GHz, your Ivy 8 core should have the end total performance of a 3770K running at 4GHz.
Ideally, you'd want to test a 4GHz Ivy against your Ivy 8 core in the games you've put across.

If the figures aren't within margin of error, then the extra core usage is a bit of a fallacy.

That's really what's required.

What I did not do was mash together a whole pile of FPS statistics. I could have, but I decided to concentrate more on core use. Remember - I'm paving the way for this 8 core CPU Intel are about to launch by addressing the issues people say they have with massively threaded CPUs.

It's always far more complicated than it sounds. Even benchmarking a game, you have so many variables that it just makes your head spin. A classic example is Hitman: Absolution. Three times it stuttered at a given point during the benchmark and dropped the min FPS to 23 or so. Then I ran it again and the min FPS was 39 or above. This drastically affects the overall 'score', yet in-game it doesn't happen.

Was it drivers? just a certain spot during the benchmark? I know that Metro always stutters right around the same frame.

I've told you before, I'm not obsessed with FPS counts. I never, ever study the FPS during a game. I don't have Afterburner installed and I don't furiously obsess over this sort of stuff. I just load up my games, play and enjoy them, turn them off.

I bought Titans and a 3970x for that reason. Chances are I'm not going to get any stuttering or lag now so I can game without thinking about it.

When Crysis 3 launched, people were obsessing over benchmarking it. In certain levels AMD 8-core CPUs absolutely wiped the floor with any i5, and even ran the i7s close. Go to a different level and all of a sudden the Intels are winning.

To me there are far more variables than what CPU can score the highest FPS.

The bottom line dude? more cores are better. It really is as simple as that.

The new consoles have really slow 8 core CPUs. IIRC it's something like two for the underlying OS, then anything up to 6 for the games.

I know that core handling has changed drastically in Windows 8. So even if that's how it works, in Windows 8 you would at least have six cores not doing two things at once.

It goes back to my analogy about having six pairs of hands rather than just one pair. One pair of hands can take care of Windows, one pair can take care of something else, leaving X pairs to do whatever they please. It's just a far, far more efficient way of handling everything, including games.
 
If you want this bench/test to be excepted, go do them both at the same clock (2GHz) and compare against the benchies you've already done at this speed.

That is all.

So I should spend the best part of three days to say "Hey look, this is exactly what I expected"

Yeah man, you'll have to forgive me for like, not bothering.

What I showed was how much of a rip off Intel's Xeons are, and that AMD CPUs are far better than the Intel-ites seem to think they are.

I expected a reaction to that of course, and a fair amount of the defensive and aggressive posts saying "wah wah wah, not fair !!!"

Only it was fair. Blame Intel. Maybe if more people did that, rather than using AMD as their sole excuse for everything, things would be better off.

But noooo ! mustn't tarnish the big blue god.

Edit. Oh sorry, I thought you said expected, what you meant was accepted.

Which is even worse. Why should I deliberately derp a CPU just so the blue boys can give themselves a polish?

hahaha ! sorry man, will never be biased.
 