Socket LGA2011 Processor CHOICE

Can't see the value in this platform with Ivy around the corner.

The value comes when you need more than 4 cores (or 8 threads). Information that has been leaked so far indicates IB will have a maximum of 4 cores, the same as SB. That's not surprising as it is basically a core shrink of SB.

SB-EP (SB-E Xeons) are expected to be released around the same time as IB, so the choice then is 2-4 cores (IB) with low power consumption or up to 16 cores (dual-processor SB-EP) with high power consumption. So if you want the multithreaded performance, you pay for it for years on your electricity bill, but you get to pwn everyone when folding or doing anything highly threaded :)
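To put a very rough number on the electricity-bill point, here's a back-of-the-envelope sketch; the 200W delta, the 8 hours a day of folding, and the £0.15/kWh tariff are all illustrative assumptions, not measured figures.

```python
# Rough annual running-cost sketch for a higher-draw multi-core box.
# All inputs are illustrative assumptions, not measurements.

def annual_cost_gbp(extra_watts, hours_per_day, price_per_kwh=0.15):
    """Extra cost per year of drawing `extra_watts` more while running."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# e.g. a dual SB-EP box drawing 200W more than an IB desktop, folding 8h/day
print(round(annual_cost_gbp(200, 8), 2))  # -> 87.6 (i.e. roughly £88/year)
```

Swap in your own wattage delta and tariff; the point is just that a sustained couple-of-hundred-watt gap adds up to a visible yearly figure, not a ruinous one.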
 
Oooooohhhhh, so the quad core version is a non-K and can't be overclocked???

FAIL!
It's partially locked; you can still overclock it.

What I wouldn't be able to get over is the fact that they all use 8-core dies, so with the quad, 50% of your CPU is disabled or was damaged during manufacturing.
 
This is strange.
Basically, because of the price, and with IB on 1155 early next year, the upgrade path from my LGA1366 i7 920 is in fact to the mainstream platform, not enthusiast. In fact, enthusiast seems to have just changed to "pointless and expensive".

So it looks like I'll be moving to IB LGA1155 around March-April time then.
 
The current hex-core SKUs apparently have another two cores disabled on the die. At that price I'd expect to get all 8 cores. As it stands, a Sandy Bridge 2700K is still very competitive with this platform at a much lower price. This platform simply looks like a dumping ground for server chips that didn't make the cut.

The platform itself is missing native USB 3.0. It also ships with a disk controller that delivers less than what was promised. I sense there will be a platform refresh very soon, making the current crop of motherboards look like poor value for money.
 
40 PCI-E 3.0 lanes vs 16 PCI-E 2.0 lanes against the standard SB CPU. Also, quad-channel memory plus extra cache to try to offset the latency of the die being bigger, and a great way to get badly failed 8-core dies into the hands of consumers.
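For what those lane counts mean in raw numbers, a quick sketch using the standard PCI-E 2.0/3.0 signalling rates and encoding overheads (these are peak theoretical per-direction figures, not real-world throughput):

```python
# Per-lane and total PCI-E bandwidth behind the "40 v3 vs 16 v2" comparison.
# PCI-E 2.0: 5 GT/s with 8b/10b encoding  -> 500 MB/s per lane per direction.
# PCI-E 3.0: 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane per direction.

def lane_bandwidth_mb_s(gt_per_s, payload_bits, line_bits):
    """Usable MB/s per lane: raw transfer rate scaled by encoding efficiency."""
    return gt_per_s * 1000 / 8 * payload_bits / line_bits

v2_lane = lane_bandwidth_mb_s(5, 8, 10)      # 500 MB/s
v3_lane = lane_bandwidth_mb_s(8, 128, 130)   # ~984.6 MB/s

print(round(16 * v2_lane / 1000, 1))  # 16 lanes of v2: 8.0 GB/s
print(round(40 * v3_lane / 1000, 1))  # 40 lanes of v3: ~39.4 GB/s
```

So on paper the SB-E CPU offers nearly 5x the total PCI-E bandwidth of the mainstream chip, though how much of that a gaming setup can actually use is another question.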

I'd suspect some are failed, but a lot are just eating up too much power at the kind of clocks a desktop version requires.

Thing is, why on earth do AMD/Nvidia/Intel give a damn about TDP? Yes, none of us want 13000W computers, nor the power cost to run them, nor the PSU cost.

But an extra 30-50W for significantly more speed? Cutting two cores off a die that in many cases are there and working... just to fit inside the 130W barrier... which is entirely made up?

If you took a poll of people who might buy it, 1 in 500 would say they'd prefer a £700 130W hex-core, while the other 499 sensible people would say: yeah, octo-core please, £700, same price, nice, 160W, seriously, who gives a crap.

Bulldozers do use an impressive amount of power overclocked, but Sandy Bridges use WELL over 200W at full overclocks as well.

Idle power is incredibly important, because frankly the vast majority of home users have their CPU idle for 99.99999% of its life span. With good power gating, it doesn't matter if a chip uses 50W or 300W at load, because they could both use 25W idle.
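A minimal sketch of that duty-cycle argument, with made-up numbers (25W idle, 5% of the day at load) just to show how little the load figure moves the average:

```python
# Time-weighted average power draw: if a box idles most of the day,
# the average is dominated by idle watts, not the load figure.
# The 25W idle, 130W/300W load, and 5% duty cycle are illustrative assumptions.

def average_watts(idle_w, load_w, load_fraction):
    """Average power given the fraction of time spent at load."""
    return idle_w * (1 - load_fraction) + load_w * load_fraction

# Two chips with the same 25W idle, each at load 5% of the day:
print(average_watts(25, 130, 0.05))  # 130W-at-load chip -> 30.25W average
print(average_watts(25, 300, 0.05))  # 300W-at-load chip -> 38.75W average
```

A 170W gap at load shrinks to under 9W of average draw at that duty cycle, which is the crux of the "idle power matters most" point.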

Oh well, SB-E really is pointless for the vast majority, and IB-E looks massively worse... Haswell, probably the first big update since Sandy Bridge, and potentially, if it is octo-core, a jump similar to what we saw from single to dual core and dual to quad, will be right around the corner after it.

Why get a stupidly expensive hex-core now, then a stupidly expensive maybe-octo-core Ivy Bridge-E, just to be able to get a Haswell a few months later at 1/3 the cost?

Anyone who games, or does any other general home use, shouldn't touch SB-E with a bargepole; a 2500K or 2600K is more than ample for all gaming and just about anything else.
 
I think it's a thermal and power limitation that the OEMs have come to an agreement with Intel on.
The more power a CPU uses, the better the thermals and power delivery of the cooling, case, PSU and motherboard have to be; this can also cause expensive warranty issues (for the OEM) down the line if the systems are used and abused.
Not to mention the initial implementation costs being higher.
With bean counters everywhere, keeping under 130W allows for a good cost/performance balance that keeps the books happy. Also, we live in an 'OMGCO2EMISSIONS!!!111!' world; keeping the TDP reasonable and dropping it on the mainstream shows you bother to be aware of the situation.

As you say though, SB-E is nearly pointless for even most enthusiasts. Add to that that the X79 is nerfed so much it's now a power-hogging P67 and is going to look rather outdated next to Z77.
What I find interesting is there's almost no point in Ivy-E: 130W TDP, no IGP and a power-hogging PCH. It's clear where SB-E is positioned, but if Ivy's main improvements are the GPU and a good drop in power consumption (one not featured here and the other not a priority), it almost only makes sense to go for mainstream Haswell.
 
Checking a review at Bit-Tech, and factoring in the power difference between ATX and mATX, we see that at idle BD uses approx 40W more than SB (taking the 2600K as an example; if you look at the 2500K the difference is greater), and under load BD's power draw rises to 270W more than the SB chips. So I'm not sure how BD's "impressive" power draw and SB's 200W when overclocked (which that review shows is actually an increase of 130W, compared to BD's increase of 340W) can be an argument in favour of BD (if that was the point you were trying to make?).

As this thread is about SB-E, which offers more performance FOR APPS THAT CAN USE THE EXTRA THREADS, and its power usage is still less than BD's for greater performance levels, your arguments against SB-E are totally invalidated for anyone who can use the performance gains.

Try to remember, SB-E is the EXTREME (i.e. TOP-END) platform; it is not a mid-range gamer platform or a platform for browsing, so it should not be compared on that level.
 
I'm going for the 3930K as soon as they come into stock. Looking at this review, there isn't much difference to the 3960X once overclocked, so I think I'll be pretty happy with it for 3D rendering.

http://www.bit-tech.net/hardware/cpus/2011/11/14/intel-sandy-bridge-e-review/4

and

http://www.bit-tech.net/hardware/cpus/2011/11/14/intel-sandy-bridge-e-review/9

Lightwave being my main app. Of course, whilst it's twice the price of a 2600K and not twice the performance, it's always the case that past a certain point it's diminishing returns, and that's fine; as long as it allows me to work quicker or at higher quality, I'm happy.
 
They were talking about the 3820 with its partial overclocking, mate, not the 3930K, which is an unlocked CPU with up to a 40 multiplier. Then you have the X parts like the 3960X, which has a fully unlocked multiplier.

annoyed

wonders if these people are lying

Here, 3930K:

Code:
http://www.evga.com/forums/tm.aspx?high=&m=1334958&mpage=1#1336053

quoting 45 / 46 multipliers on 3930K
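For reference, assuming the nominal 100 MHz base clock on SB-E (an assumption; this platform also offers 125/166 MHz BCLK straps), those quoted multipliers work out to:

```python
# Multiplier-to-clock arithmetic for the 45/46 multipliers quoted above.
# The 100 MHz BCLK is the SB-E nominal value, assumed here for illustration.

def core_clock_mhz(multiplier, bclk_mhz=100):
    """Core clock is simply multiplier x base clock."""
    return multiplier * bclk_mhz

print(core_clock_mhz(45))  # -> 4500 MHz at the 45x multiplier
print(core_clock_mhz(46))  # -> 4600 MHz at the 46x multiplier
```

So if those reports are right, the 3930K's ceiling is around 4.5-4.6 GHz via multiplier alone, well beyond the 40x some listings claim.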
 
I'm more inclined to think OcUK made a mistake when they listed the 3930K specs. As they are not yet in stock, I suspect they have not had a chance to test one yet, and that once they have, they will update the information. Users in another forum who actually have one to work with are more likely to have the most accurate information.

Why would two separate people have reason to give false info? Surely if they didn't have one they would claim to have the faster chip (for bragging rights or whatever), and if they do have one they would have no reason to give false info to promote it to someone else?

My understanding has always been that the two have the same max multiplier, and that the extra cost of the 3960X is for its higher base clock (which doesn't seem to make much difference in practice when overclocking) and larger cache (and again, that doesn't make a big difference in many apps).

That wouldn't stop me buying the 3960X if I was building an X79 system now and had the money though :)
 
Finally! Thank you ... This is exactly what I have been thinking all along ... it seems to me Intel was just looking for an excuse to fill their $1000+ slot.
 