The 8 core Intel thread.

so basically intel beats AMD, no surprise... given that the CPU you chose is not very extreme, and is designed for servers running many low-spec tasks at once...

Whilst the CPU costs more, you can pretty much guarantee low power cost, a long lifespan, reliability and low, low temps in tight spaces. Pretty much what a server needs...

AMD at 4.9GHz is an oven. What's the point? You can't run a server on that 24/7, 365 days a year.

Yes you could. Server rooms are usually air conditioned. Chip would have no bother running at that speed if it can make it through Prime95 and all of the other tests I ran on it to make sure it was stable.

And the Intel didn't beat the AMD, so not sure why you seem to be confused. In fact, in certain tests the AMD absolutely wiped the floor with it.

Clock speed, locked, low voltage - not my problem. Don't even try blaming that on me. £750 CPU. That would be down to Intel, who obviously think it's worth that sort of money (hint: it's not even worth the £110 I paid for it, speaking purely in performance terms).

AMD still do quite well in the server sector, simply doing what they do everywhere else. Great value products that more than fit the purpose.
 
In a server room, surely low TDP (and heat) is more important than ever. Less refrigeration etc.

TDP is one thing in a desktop, where it doesn't matter. But lots of these together is going to cost.
 
And the Intel didn't beat the AMD, so not sure why you seem to be confused. In fact, in certain tests the AMD absolutely wiped the floor with it.

Clock speed, locked, low voltage - not my problem. Don't even try blaming that on me.

Interesting that this cropped up. I watched a bench video recently, and on power consumption an overclocked i7 came not far off the FX at load in watts (236W vs 239W).

/Blinkers_on

:cool:

TDP is one thing in a desktop, where it doesn't matter. But lots of these together is going to cost.

You are on an overclockers forum. If you're also pushing your GPU and other case fans, the "savings" become measurable only over years; it really is that simple. By then you would have changed components, so it's a non-starter in my book.
 
Interesting that this cropped up. I watched a bench video recently, and on power consumption an overclocked i7 came not far off the FX at load in watts (236W vs 239W).

/Blinkers_on

:cool:



You are on an overclockers forum. If you're also pushing your GPU and other case fans, the "savings" become measurable only over years; it really is that simple. By then you would have changed components, so it's a non-starter in my book.

That's what I mean.
In your house it's usually not an issue (e.g. I have a plasma, an HTPC with a 7850 and an amp, all on for a few hours a day; all hot, but it doesn't really break the bank).

But in a server room, with many, many of said chips all crunching data, it's going to be a consideration (I would think).

That TDP, if accurate, is very close. But you would have to know whether one or both chips were 'good' or 'bad' samples, would you not?

How much difference in power (at the wall) can a good vs a bad chip make?
 
so basically intel beats AMD, no surprise...

The Xeon versus an overclocked FX83 isn't clear cut like that; the Intel would need to be utilising all 16 threads to be competitive. When it's using 8 or fewer? It's not going to be winning.

Which is pretty much why I said last week, "You'd almost never pick this Xeon", and I'd pick the FX83 over it pretty much every day of the week.
 
The Xeon versus an overclocked FX83 isn't clear cut like that; the Intel would need to be utilising all 16 threads to be competitive. When it's using 8 or fewer? It's not going to be winning.

Which is pretty much why I said last week, "You'd almost never pick this Xeon", and I'd pick the FX83 over it pretty much every day of the week.

And I would pick an i5 over the FX83... Intel wins.
 
My point, Alex, is that there is a myth about Intel chips 'power saving' users money when they are hammering their cores via overclocking. It is quite clear that when doing so the overall consumption is actually similar, which defeats the purpose of using it as leverage in a poor man's forum argument over a moot point.

Server markets are something for the companies to consider, and that's not what I was interjecting about; it's a common baloney point made on here and is rather irritating.
 
My point, Alex, is that there is a myth about Intel chips 'power saving' users money when they are hammering their cores via overclocking. It is quite clear that when doing so the overall consumption is actually similar, which defeats the purpose of using it as leverage in a poor man's forum argument over a moot point.

Server markets are something for the companies to consider, and that's not what I was interjecting about; it's a common baloney point made on here and is rather irritating.

Since the Xeons are not overclockable, and to get the AMDs competitive they have to be overclocked to the extreme, you'd be saving 200W of power with the Xeon.

That means every 5 hours it is costing you 18p more for the CPU alone. With 8760 hours per year, that's an extra cost of £1500 more to run the AMD...

Therefore in one year it has paid for itself twice over...

This is why server-based Xeons cost so much.
 
Pretty sure your math is cray cray.

It's £315 at 8760 hours.

The only overclocked FX83 to overclocked i7 4770K results I saw were Bit-Tech's, and the difference was over 150W, but that's under Prime. So obviously the figure is skewed, as the FX83 won't be running anywhere near 100% when we're talking gaming (due to most games being unable to use more than half of it).
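For reference, the corrected figure works out as follows. This is a minimal sketch assuming the 200W delta and the 18p/kWh rate implied by the "18p every 5 hours" claim quoted above:

```python
# Annual running-cost difference for a 200 W power delta,
# at 18p/kWh (200 W * 5 h = 1 kWh = 18p, as per the quote above).
delta_watts = 200
hours_per_year = 8760
price_per_kwh_gbp = 0.18

extra_kwh = delta_watts / 1000 * hours_per_year   # 1752 kWh per year
extra_cost = extra_kwh * price_per_kwh_gbp        # annual cost of the delta
print(f"{extra_kwh:.0f} kWh, £{extra_cost:.2f}")  # 1752 kWh, £315.36
```

So roughly £315 a year, not £1500; about a fifth of the claimed figure.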
 
Pretty sure your math is cray cray.

It's £315 at 8760 hours.

The only overclocked FX83 to overclocked i7 4770K results I saw were Bit-Tech's, and the difference was over 150W.

You're right; I had two calcs open with different numbers, so my bad.

I'm pretty sure, though, that a server will be running for more than one year, lol. More like 5+ years at least.
 
Not arguing about the Xeon consumption; as Martin has highlighted, they will not be bought for gaming, even by the crazy folk on here.

My point is comparing the Intel desktop CPUs to the FX, which shows that under load the consumption is not a million miles different. Therefore when gaming (or running extreme applications etc.) it is going to take you years before you can even argue that you have 'saved money' by buying an Intel CPU!
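To put that "years" claim in numbers: a rough sketch using the 236W vs 239W load figures quoted earlier in the thread, with purely illustrative assumptions (4 hours of gaming a day, 18p/kWh, and a hypothetical £50 price premium for the Intel chip):

```python
# Break-even time on a small load-power difference.
# The 236 W / 239 W figures come from the thread; the hours,
# tariff and premium are assumptions for illustration only.
delta_watts = 239 - 236       # 3 W difference at load
hours_per_day = 4             # assumed gaming time
price_per_kwh_gbp = 0.18      # assumed tariff
premium_gbp = 50.0            # hypothetical price premium

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
saving_per_year = kwh_per_year * price_per_kwh_gbp
years_to_break_even = premium_gbp / saving_per_year
print(f"£{saving_per_year:.2f}/year saved, "
      f"{years_to_break_even:.0f} years to break even")
```

Under those assumptions the saving is well under £1 a year, so the break-even point lands decades out, long after the components would have been replaced.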

[image: power consumption comparison chart]
 
In a server room, surely low TDP (and heat) is more important than ever. Less refrigeration etc.

TDP is one thing in a desktop, where it doesn't matter. But lots of these together is going to cost.

Do you know how hot the Quadro cards based on Fermi used to get?
 
The Xeon versus an overclocked FX83 isn't clear cut like that; the Intel would need to be utilising all 16 threads to be competitive. When it's using 8 or fewer? It's not going to be winning.

Which is pretty much why I said last week, "You'd almost never pick this Xeon", and I'd pick the FX83 over it pretty much every day of the week.

Even when all 16 threads are used, the results are still pretty much the same across the board, thanks to the derped clock speed.

Now I wouldn't mind if:

Intel did not sell 2GHz 8-core/16-thread chips for £750

They were overclockable, at least a little bit (say, 500MHz)

But they're not; these are the real CPUs, and they lose to a £110 CPU.

TBH though? I've picked the Xeon. Few reasons really...

You don't need a massive CPU any more, not unless you're either benching or running a ridiculous GPU setup (and even then a 3970X more than accommodates a couple of Titans). The Ivy uses hardly any power (rated 95W, but I've not seen it go anywhere near that yet) and generates very little heat. After my time with the Westmere hex that only ran at 1.78GHz, and liking how it never broke 35C, I'm kinda fond of it.

I'm also prepared to gamble on core support, threading and so on. It's pretty apparent to me that the shift described in the article about console ports is well in motion, so there's no point running a 'gas guzzler' any more.

It's also not my rig, so the noise would have had my lady ripping her hair out.

I'm finally done grabbing data now. If I can find time tomorrow I will post the results, but it's far more of a "How many cores can one use" scenario than "Watch my AMD scream its guts out to beat a £750 CPU". I've not run a ton of comparisons; I'm far more interested in paving the way for 8-core CPUs and how well they are supported (and will be into the future).
 
Since the Xeons are not overclockable, and to get the AMDs competitive they have to be overclocked to the extreme, you'd be saving 200W of power with the Xeon.

That means every 5 hours it is costing you 18p more for the CPU alone. With 8760 hours per year, that's an extra cost of £1500 more to run the AMD...

Therefore in one year it has paid for itself twice over...

This is why server-based Xeons cost so much.

No it isn't, at all. It's because Intel are jokers.

First up, you forget something: the AMD comes as a stock 5GHz CPU. That means you can ignore my overclocking, because AMD will sell you one clocked that high, completely reliable out of the box.

Secondly, the AMD isn't just competitive. It conclusively won every single benchmark I ran apart from Metro: Last Light. And when I say close, I'm talking a 2 FPS difference.

Your power stats are also a million miles out. But the power argument isn't something I'm even going to get into, because it has no basis.

The 990X used over 200 watts when overclocked. The i7 950 did about the same. The 3970X? Hahaha. With 12 phases at 4.9GHz it's clearly going to be putting the AMD to shame, dude. AMD chips don't even need 12 phases.

Power argument = null and void on me. Been there, done the maths, showed people up, end of argument.

The last time I spent some time with an AMD CPU, people went awfully quiet. It's not their fault; it's simply that they've listened to too much garbage over the years and as such have avoided AMD. Pulling figures from the internet and mostly from thin air does not count as concrete proof. The only way to get that is to do what I have done: go totally brand-unbiased and spend a fair chunk of my evenings seeing things for myself.

Don't even get me started on the rotten tricks Intel pull, dude. Just... don't.

And before we go back to the "But Intel are better" argument, see the posts before. Yes they are; yes, you pay for it in bucketloads; yes, they're conniving aholes for selling thin air (i.e. the ability to overclock).

Clock locking is now a commodity to them, and something AMD have never, ever stopped people doing.

So as I said before, come back to me with your definition of "better". Lying? Cheating? Locking? Rationing? High prices? 3D transistors designed for laptops making it into the desktop market, which cost less to make because they're way smaller, yet somehow just get more and more expensive?

Really, I suggest you spend a few days reading about your corporations and how they operate.
 
The point you're missing, Andy, is this: you're peeing into the wind and it's blowing back into your face. People will only buy the fastest CPU, or one within 10% of it if they're smart... unfortunately that's Intel, yawn yawn :)

Let's hope AMD bounce back in 2016... because I think they will; in fact, more like late 2015.
 