
This is bad for multi-GPU setups - Intel Core i7-5820K Features Fewer PCI-Express Lanes After All

Percentage of people running more than 2 GPUs is slim, and people who buy this CPU are buying it because they can't afford its bigger brother; and if they can't afford its bigger brothers then they can't afford more than 2 GPUs.

My 2 pence
 
Percentage of people running more than 2 GPUs is slim, and people who buy this CPU are buying it because they can't afford its bigger brother; and if they can't afford its bigger brothers then they can't afford more than 2 GPUs.

My 2 pence

Can they afford AV?
 
Percentage of people running more than 2 GPUs is slim, and people who buy this CPU are buying it because they can't afford its bigger brother; and if they can't afford its bigger brothers then they can't afford more than 2 GPUs.

My 2 pence

Percentage of people running multiple GPUs on X79 = huge.

It's very rare you see an X79 'budget build'.
 
Is PCI-E 3.0 hacked to run, or do you get issues?

You don't need to hack it; X79 with SB-E was the platform that introduced PCI-E 3.0. Because it was a new standard, though, a couple of mobo manufacturers did have issues. As a result, AMD enable it by default and you have to disable it yourself if you have issues, while Nvidia disable it by default and you can enable it if you wish.
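For anyone curious what their card has actually negotiated, here's a minimal sketch of how to check - an assumption that you're on Linux with lspci installed, and you may need root to see the full -vv dump. PCI-E 2.0 trains at 5GT/s and 3.0 at 8GT/s, so the LnkSta speed tells you which you ended up with:

```python
# A minimal sketch, assuming Linux with lspci installed (run as root if the
# capability dump comes back as <access denied>). Prints the negotiated link
# speed for every VGA device: 5GT/s means PCI-E 2.0, 8GT/s means PCI-E 3.0.
import re
import subprocess

out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

device = None
for line in out.splitlines():
    if line and not line[0].isspace():
        # Header lines look like "01:00.0 VGA compatible controller: ..."
        device = line if "VGA compatible controller" in line else None
    elif device and "LnkSta:" in line:
        match = re.search(r"Speed\s+([\d.]+\s?GT/s)", line)
        print(device.split()[0], "->", match.group(1) if match else "unknown")
        device = None
```

If a 3.0-capable card behind SB-E reports 5GT/s here, that's just the 2.0 fallback described above.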


Percentage of people running multiple GPUs on X79 = huge.

It's very rare you see an X79 'budget build'.

I'm running a 4.5GHz 4930K with 4x8GB sticks of DDR3 and a single GTX780, because that's how I roll!
 
PCI-E 3.0 is NOT officially supported on SB-E because it was released before the official PCI-E 3.0 spec was finalised, and there are supposedly timing differences/issues. Whether you will suffer issues relating to it is unknown, but Intel backtracked and stopped advertising them as PCI-E 3.0, and NVidia saw fit to limit their GPUs to 2.0 for stability reasons. AMD didn't act, but that's nothing new and just reaffirms that they're a tinpot outfit.
 
 
PCI-E 3.0 is NOT officially supported on SB-E because it was released before the official PCI-E 3.0 spec was finalised, and there are supposedly timing differences/issues. Whether you will suffer issues relating to it is unknown, but Intel backtracked and stopped advertising them as PCI-E 3.0, and NVidia saw fit to limit their GPUs to 2.0 for stability reasons. AMD didn't act, but that's nothing new and just reaffirms that they're a tinpot outfit.

So why does a GTX 690 run at PCI-E 3.0 by default on SB-E?
 
It is the only one that does; Nvidia must think it will have no issues.

It is well known Nvidia made it so PCI-E 3.0 did not enable without you running the "FIX".
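If you have run the fix and want to confirm it actually stuck, you can ask the driver directly. A minimal sketch, assuming nvidia-smi is on the PATH and is recent enough to expose the pcie.link.gen.* query fields:

```python
# A minimal sketch, assuming nvidia-smi supports the pcie.link.gen.* fields.
# Prints the current vs. maximum PCI-E generation per GPU, so you can see
# whether gen 3 is actually active after applying the fix.
import subprocess

query = "name,pcie.link.gen.current,pcie.link.gen.max"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={query}", "--format=csv,noheader"],
    capture_output=True, text=True,
).stdout

for line in out.strip().splitlines():
    name, current, maximum = [field.strip() for field in line.split(",")]
    note = "full speed" if current == maximum else f"held back at gen {current}"
    print(f"{name}: gen {current} (max gen {maximum}) - {note}")
```

Bear in mind the link can drop to a lower generation at idle to save power, so check while the card is under load.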
 
Does the new mobo's chipset have any PCI-E lanes? The future Z100 (or whatever they call it) is supposed to have them for the first time.

I assume if the Broadwell CPUs have the same or more PCI-E lanes than current Haswell then it's all good for multiple GPUs and PCI-E SSDs/M.2 etc, better than it's been before anyhow.

So surely the enthusiast end doesn't get a boot to the balls?

I'm sure that on these new boards (X99), possibly even X79, you can disable the CPU lanes through the BIOS and use the mainboard's. I'm positive that even with the 28 lanes on a 5820K, you can use workstation boards (not sure which yet) that have lane extenders like they did on some X79s.

Current Haswell (4770k/4790k) only has 16 lanes.
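To put those lane counts into perspective, here's a back-of-the-envelope budget for a two-GPU-plus-M.2 wish list. The x16/x8/x4 split is an assumption about a typical board layout, not any specific motherboard's wiring:

```python
# Back-of-the-envelope CPU lane budget; the slot split below is illustrative.
cpu_lanes = {
    "4790K (Haswell)":   16,
    "5820K (Haswell-E)": 28,
    "5930K (Haswell-E)": 40,
}

wanted = {"GPU 1": 16, "GPU 2": 8, "M.2 SSD": 4}  # 28 lanes in total

for cpu, lanes in cpu_lanes.items():
    need = sum(wanted.values())
    verdict = "fits" if need <= lanes else f"short by {need - lanes} lanes"
    print(f"{cpu}: {lanes} CPU lanes vs {need} wanted -> {verdict}")
```

On 16 lanes something has to give (the GPUs drop to x8/x8 and the M.2 usually hangs off the chipset), 28 covers this exact wish list, and 40 leaves headroom for a third card.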
 
The 5930k is poised to cost quite a chunk more than the 4930k does now.

I seem to recall someone saying around £480.

Wrong, the USD cost between 4930k and 5930k is no different. :)

GBP price will be higher as the pound has been weakening all week, but Intel are not to blame for exchange rates.
 
Wrong, the USD cost between 4930k and 5930k is no different. :)

GBP price will be higher as the pound has been weakening all week, but Intel are not to blame for exchange rates.

Thanks for confirming that gibbo :)
 
It is the only one that does; Nvidia must think it will have no issues.

It is well known Nvidia made it so PCI-E 3.0 did not enable without you running the "FIX".

Q: What is a GTX 690?

A: It is two GTX 680s on the same PCB.

Q: If that is the case, why does the GTX 680 not run at native PCI-E 3.0?

A: Lack of effort and interest by NVidia.


The two GPUs on a GTX 690 also talk to each other using PCI-E 3.0, as well as the card talking to the outside world at native 3.0.

Worse still, why do the Titan and all the other NVidia cards that came after the GTX 690 need the hack to run? lol.

Very sloppy work there, NVidia, or is it just laziness?
 
PCI-E 3.0 is NOT officially supported on SB-E because it was released before the official PCI-E 3.0 spec was finalised, and there are supposedly timing differences/issues. Whether you will suffer issues relating to it is unknown, but Intel backtracked and stopped advertising them as PCI-E 3.0, and NVidia saw fit to limit their GPUs to 2.0 for stability reasons. AMD didn't act, but that's nothing new and just reaffirms that they're a tinpot outfit.

This, basically; they ran out of time and the spec wasn't validated. You can blame big vendors for this, as Intel was working towards a pre-Christmas shipment and had already been delayed due to issues with the native SATA controller. To be honest, I'm amazed X79 has lasted this long considering the state it was released in.

Long live x79 :)
 
This, basically; they ran out of time and the spec wasn't validated. You can blame big vendors for this, as Intel was working towards a pre-Christmas shipment and had already been delayed due to issues with the native SATA controller. To be honest, I'm amazed X79 has lasted this long considering the state it was released in.

Long live x79 :)

I have a strange feeling that X79 is still going to be the system for benching in 3 months' time.

If Haswell-E clocks worse than the average 4770K, it is not going to be good; I hope I am wrong.
 