10 Gigabit networks

Along a slightly different line, but I'm just in the process of adding a 10Gbit fiber link between our comms/server room and main office.

It has not been a cheap endeavour - two 10Gbit-capable 24-port L3 switches at £1300+VAT each. I found a deal that included the SFP+ modules for free, but those are normally ~£400 each, and this is for relatively cheap Netgear kit!

I considered getting our storage servers on 10Gbit at the same time, but at £400 for an Intel SFP+ DA adapter and £60 for a cable, it ends up costing an additional £1k to do two servers (assuming you have 8x PCI-e slots available). 2-4x trunking will have to do for now; the 10Gbit will come a bit later.

£4.5k just to provide a 10Gbit "backbone" to a small business makes me think it's a long way from home use - certainly a lot further than two years.
 
Up until last year I could understand the resistance to move on this, but now that SSDs have come along so quickly it's becoming obvious that network infrastructure is becoming a serious bottleneck, even in the home. Even 10Gb is starting to look slow when you look at some of the stuff available now - an extreme example, I know, but things aren't exactly slowing down. TBH I was beyond the limits of my Gigabit switch 5 years ago running my SCSI setup in RAID 5 - now it's not just the enthusiasts who are affected. "Standard" rigs are going to start to see problems - just look at the move to USB3 and SATA3; even that fruit vendor has seen the light with their new interface! :D

I take your point about the cost of setting this up, even at the small business level, but as was pointed out earlier, you don't need a 24-port switch in the home. I could set up everything I need for around £400 with kit off fleabay (point to point between the server and the main PC) - when that comes down to £300 I'm buying!

I guess 'network' just isn't sexy enough to warrant pushing - funny really considering the lengths people go to over their CPU / GPU / HDD setups, when arguably you could see a more tangible benefit from upgrading your network.

Just out of interest - most of the cards seem to be dual port - would they support teaming as standard?
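For what it's worth, on Linux the two ports can usually be bonded in software regardless of whether the card advertises teaming as a feature. A minimal sketch, assuming modern iproute2 and a switch that supports LACP (interface names and the address are illustrative, not from any specific setup):

```shell
# Create an 802.3ad (LACP) bond from both ports of a dual-port NIC.
# eth0/eth1 and 192.168.1.10 are illustrative; the matching switch
# ports must also be configured as an LACP aggregation group.
ip link add bond0 type bond mode 802.3ad
ip link set eth0 down
ip link set eth0 master bond0
ip link set eth1 down
ip link set eth1 master bond0
ip link set bond0 up
ip addr add 192.168.1.10/24 dev bond0
```

Note that LACP balances per-flow, so this helps with many simultaneous transfers rather than one big copy.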
 
I guess 'network' just isn't sexy enough to warrant pushing - funny really considering the lengths people go to over their CPU / GPU / HDD setups, when arguably you could see a more tangible benefit from upgrading your network.

What are these tangible benefits?

Most people manage quite happily with 100Mbit (or slower) connections.

The only data of any size on a typical home network is HD video. Once it’s copied to the server the network only needs to be fast enough to allow streaming. It would be nice to be able to speed up the initial copying to the server, but it would hardly be life changing.
 
Two big assumptions in this thread are killing the idea.

1. Cavemanoc assumes he's an accurate representation of the "home user" demographic; if you shift 1TB of data in a single stream from time to time, you're definitely not that.

2. Private sector companies don't do things to make consumers happy, they do it to make money. So by nature they'll price things as high as they can possibly get away with.

Also a note on the 10gig cards: the cards themselves might well be under £200, but they almost all require plug-in transceiver modules which usually cost as much as the card again.

Arguably, if you're needing to shift 100s of GBs of data about, you've probably not designed the network very well. Even in SAN scenarios where you need to duplicate 100s of TB of data between datacentres, the actual link speed isn't hugely consequential: you tend to synchronously replicate, which produces a trickle of data that can be spread over multiple aggregated links, rather than dumping huge chunks over in one go, which requires higher link speeds.
Generally you only move up to the next level of link speed when you hit the threshold where the port count on aggregated links is proving more expensive - port cost goes up the closer you get to the core of the network.

tl;dr - The art of network design isn't about shifting as much data as fast as you can, it's about structuring things to avoid shunting data about that you don't need to.
That's why network architects get paid lots more than network engineers :)
 
My point is we are not prepared to wait while things happen in any other area of computing - look at how many threads there are in the CPU section droning on about how fast the various CPUs will render a scene, or unzip a file, or convert media from one format to another - these are all things that can be done for a fraction of the cost on older hardware if people are willing to do it overnight, or go away while the machine works. I've yet to see a thread that goes:

"I'm looking at building a rig, it needs to be able to edit large video files and convert between formats... but I'm a patient guy and don't mind waiting while it does it - what's the cheapest option out there at the moment?"

How much cash are people willing to spend just to get that extra 10% performance out of a rig?

It just always struck me as odd that we are so aggressive about CPU/GPU/HDD performance, but are happy to chug along on slow networks.
 
I think Hard Drives best illustrate the point I'm trying to make. About 5 years ago I was running a set of Seagate Cheetah SCSI drives in a RAID 5 array. It ran off an enterprise controller card that cost anywhere up to £700 new, but I sourced one 2nd hand on the bay for around the £150 mark. The drives and the card were expensive together, but the performance was stunning. At the time there were only a few of us who thought it was worthwhile - again, no-one cared about HDD performance. I saw the benefits every day - I was always 1st into BF2 maps, never had any stuttering in Gothic 3, and video editing was a dream. But no-one else cared about HDD performance beyond buying a Raptor.

Fast forward 5 years and look at Hard Drives now. It has become a massive headline item, now that it's easy, everyone is doing it. Same was true for Water Cooling in the early days, no-one was interested until it became easy, now it's the norm in these sort of communities.
 
That's not really a great comparison, for the point I put forward.

Even in the world of PCs there's ways to avoid buying tip top hardware and still get the result you want.

Taking your example of CPU performance, you could use your non-multithreaded .zip reader/video editor and get the fastest CPU on the market... Or you could get an entry level multicore CPU and change the software to one that's multithreaded and makes better use of the hardware.

That relates pretty directly to the fast link vs multipath scenario. You don't NEED 10GbE, there's nothing a home user wants to do that can't be done by other means more cheaply.

But we are still talking about the top 1 or 2% of the market who gives a crap about this sort of stuff. The other 98% of consumers buy their PCs off the shelf at PC world and have a network consisting of whatever wireless router the ISP gave them for free and are over the moon with it.

If there was an exploitable market for residential 10GbE right now, the guys at intel and broadcom who spend all day looking for just such a thing would have found it already. You and your kindred 2% of the market won't make it a viable project. Unless something happens that suddenly grows that 2% to 20% no one will be interested in creating a new product line. Such growth at present with earnings stagnant and cost of living going up isn't likely to happen quickly at all.
 
10Gb for the home is just mental overkill at the moment.

I work in engineering product development for an ISP and one of my roles is building and load testing solutions, so this involves setting up a network and hammering it until it breaks..

You typically need quite a lot of 'application state' or many many sessions to load up big links, for the home user connecting a few PCs together, 1G Ethernet is normally plenty - 125MBps is a lot of data and even though SSDs will exceed that - application and software layers tend to slow things down a tad.

I reckon it'll be 10 years until we start to see any kind of inroad for 10G network cards and 10G dlink switches from PC world - you have to remember, 1G has been out since 1998 and it's only in the last few years that cheap 1G switches have been readily available..
 
...Taking your example of CPU performance, you could use your non-multithreaded .zip reader/video editor and get the fastest CPU on the market... Or you could get an entry level multicore CPU and change the software to one that's multithreaded and makes better use of the hardware.

That's not the point - people aren't making the choice in isolation like that - they're buying the best hardware available to gain a performance advantage that will be barely noticeable in most cases. Just look at the excitement over the 2700K - it's a great chip, it clocks fantastically well - but do you see material gains? Not on the scale I'm talking about with the network upgrade - an extra 10% gain on BF3 frame rates (if you're lucky) is not in the same league as a 10x increase in throughput!

That relates pretty directly to the fast link vs multipath scenario. You don't NEED 10GbE, there's nothing a home user wants to do that can't be done by other means more cheaply.

Agreed. But by that logic there would be no justification for ever buying more than the minimum spec. "You don't NEED a 580GTX, a 280GTX will still run that game" People justify the price hike from the 6950 to the 6970 based on tiny margins of performance, yet won't spend similar amounts of cash for a 10x increase in speed in a different area? The fact that it can be done more cheaply doesn't mean it isn't worth spending more to gain better performance - this is Overclockers after all, we often spend huge amounts chasing diminishing returns - never mind a 1000% increase!

If there was an exploitable market for residential 10GbE right now, the guys at intel and broadcom who spend all day looking for just such a thing would have found it already. You and your kindred 2% of the market won't make it a viable project. Unless something happens that suddenly grows that 2% to 20% no one will be interested in creating a new product line. Such growth at present with earnings stagnant and cost of living going up isn't likely to happen quickly at all.

Not sure I agree. We don't represent a huge part of the market segment, but we are a very profitable one, with big juicy margins. Why else would we have 'E' and 'K' and 'Black' versions of the CPUs - most people don't overclock, but the big companies still produce the enthusiast hardware to cater to the minority - must be a business case for it or it wouldn't exist.

10Gb for the home is just mental overkill at the moment...

Not sure I can agree with this either. There is no doubt that home networks are starting to become a serious bottleneck. Even the mechanical drives are now edging beyond the capabilities of Gigabit Ethernet, not to mention SSDs. I just can't understand the resistance to this. Why on earth would you be willing to spend hundreds of pounds to squeeze a couple of extra FPS from your latest game, but not be willing to spend anything to be able to transfer a Blu-ray file in 30 seconds instead of 5 minutes? That's a huge, tangible improvement in performance. I would go as far as to say that even at today's prices there is no other upgrade you can do that will have the same % improvement per £.
 
Five minutes, or even ten minutes, to transfer a Blu-ray video is fast enough as far as I’m concerned. Start it copying and then get on with something else while it finishes.

Upgrading to Gigabit is a no-brainer. The cost is low, and you can knock 50 minutes off the time it takes to copy your Blu-ray compared to 100Mbit. Spending even £100 to knock another 5 minutes off doesn’t make sense (to me). Definitely a case of diminishing gains.
 
Surely you'd need to be moving HUGE amounts of data to even make use of it at home. Hell, my network is 100mbit and I don't find it's slow.
 
I hear what you're saying, I just don't see a 1000% increase in speed as a diminishing gain :D

Looking at it in percentage terms is the problem. It’s the sort of statistical abuse normally inflicted by the tabloid press.

Assume a file copy takes 50 minutes on a 100Mbit network. Going from 100Mbit to Gigabit will save you about 45 minutes, a significant amount of time.

Going from Gigabit to 10 Gigabit would cut the time from about 5 minutes to about 30 seconds, a saving, but much less significant one.

If you had the option of going to 100 Gigabit (and could saturate it) you’d cut the time taken to about 3 seconds. Saving you a fairly insignificant 27 seconds.

If that’s not diminishing gains…
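The arithmetic above can be checked quickly. A minimal sketch, assuming a ~37.5GB file (the size that takes about 50 minutes at 100Mbit) and ideal line-rate throughput; real-world overheads only make the gaps smaller:

```python
def transfer_minutes(size_gb, link_mbps):
    """Minutes to move size_gb gigabytes over a link_mbps megabit/s link."""
    size_megabits = size_gb * 8 * 1000  # GB -> megabits (decimal units)
    return size_megabits / link_mbps / 60

size_gb = 37.5  # ~50 minutes at 100 Mbit/s, as in the post above
for mbps in (100, 1_000, 10_000, 100_000):
    mins = transfer_minutes(size_gb, mbps)
    saved = transfer_minutes(size_gb, 100) - mins
    print(f"{mbps:>7} Mbit/s: {mins:6.2f} min (saves {saved:6.2f} min vs 100 Mbit/s)")
```

This reproduces the figures quoted: roughly 50 minutes, 5 minutes, 30 seconds and 3 seconds, with each step up the ladder saving a tenth as much time as the last.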
 
And what if this file copy took 10 hours over 100 Mbps?

If my rough calculations are anywhere near right, that's somewhere between 400GB and 450GB of data.

It’s not the sort of file copy that many people will do very often.

Over Gigabit it would take an hour which seems reasonable to me.
 
Surely cheaper to make a compromise on speed and team some 1GBit NICs between your key networking components (Be that server, NAS or otherwise).

10Gbit to the client, or infact anything more than 1Gbit seems stupid.

If you NEED that to your clients why do you even have servers/NAS? I suggest you re-evaluate your storage and networking design if you need that much to a client.
 
Surely cheaper to make a compromise on speed and team some 1GBit NICs between your key networking components (Be that server, NAS or otherwise).

I’m quite prepared to be proved wrong on this, but…

Teamed NICs won’t double the throughput for a single user. Especially if the transfers involve large individual files.

If the connection is saturated due to multiple simultaneous users then teamed NICs could help.
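The reason is that link aggregation balances per-flow, not per-packet: the NIC or switch hashes each connection's addresses/ports to pick exactly one member link, so a single TCP stream can never exceed one link's speed. A toy illustration (the hash here stands in for a vendor's real algorithm, and the addresses are made up):

```python
def pick_link(src_ip, dst_ip, src_port, dst_port, n_links=2):
    """Illustrative per-flow hashing: every packet of a given flow
    lands on the same member link of the team."""
    return hash((src_ip, dst_ip, src_port, dst_port)) % n_links

# One big SMB file copy is a single flow -> a single link -> 1Gbit ceiling.
flow = ("192.168.1.10", "192.168.1.20", 49152, 445)
print("single copy always uses link", pick_link(*flow))

# Many simultaneous clients produce many flows, which spread across links.
links_used = {pick_link(f"192.168.1.{h}", "192.168.1.20", 50000 + h, 445)
              for h in range(3, 30)}
print("many flows hit links:", sorted(links_used))
```

So teaming raises aggregate capacity for multiple users, but the single-user Blu-ray copy discussed above stays pinned to one link.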
 
Not sure I can agree with this either. There is no doubt that home networks are starting to become a serious bottleneck. Even the mechanical drives are now edging beyond the capabilities of Gigabit Ethernet, not to mention SSDs. I just can't understand the resistance to this. Why on earth would you be willing to spend hundreds of pounds to squeeze a couple of extra FPS from your latest game, but not be willing to spend anything to be able to transfer a Blu-ray file in 30 seconds instead of 5 minutes? That's a huge, tangible improvement in performance. I would go as far as to say that even at today's prices there is no other upgrade you can do that will have the same % improvement per £.

As I've already said - you can have the fastest drives in the world and 10G connecting the two machines together, but most of the time networking protocols and application overhead will never let you reach anything like that sort of speed between two single desktop machines - so it's a waste of time.
There are always big differences between theoretical speed and what you actually get.

I 100% guarantee that if you were to spend what would probably be more than £1k on a pair of 10G NICs, connect two PCs together with new-generation SSDs and then start transferring files, you'd seriously struggle to get more than 1-2Gbps of throughput.
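That claim is easy to test without involving disks at all, using a memory-to-memory benchmark such as iperf to show what the NICs, stack and CPU can actually sustain (the hostname below is illustrative):

```shell
# On the receiving PC, start an iperf server:
iperf -s

# On the sending PC - one TCP stream first, then 4 parallel
# streams to see how much of the gap is per-stream overhead:
iperf -c fileserver -t 30
iperf -c fileserver -t 30 -P 4
```

If iperf can't fill the link, no file copy will either; if it can, the bottleneck is in the storage or application layer rather than the network.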



Regarding your second point - Why would I spend loads of money to squeeze the last FPS out of the latest game then skimp on 10G?

Well for a start - playing games is lots of fun and the main reason I have a PC.. Having the ability to transfer blu-rays in 30 seconds has never really crossed my mind.
 
People thought the same way about Hard Drives up until a couple of years ago - no-one cared apart from a small group of us messing around with SCSI and 2nd hand controller cards.

I think you are muddying the water a bit by using terms like 'client'.

Since (some of) you guys seem to struggle to think of uses for this I'll use my own case as a quick example. I go away (a lot) for extended periods of time. When I travel I like to take movies/music/games with me to help pass the time. Invariably I will forget to transfer them from my server (WHS) until the last minute and then always get the comedy box telling me it will take 3 hrs to transfer my files. At that point (with minutes to wheels) the difference between 3 hours and 18 minutes is a LOT!

Now granted I couldn't saturate a 10Gb pipe with my current rig - I'd be lucky to get anything over 300MB/s onto my creaking Raid 0 SSDs on the laptop, let alone an external enclosure. But the point is that even on my old rig (3 years+ XPS m1730) my network is the limiting factor. I could and would use a faster connection if it were available.

The bit I find surprising is that people on this Forum, of all places, seem willing to accept that limitation. Just look at the excitement over boot times in the HDD section now that SSDs are common place. If 10GbE was available at a reasonable price, would you really turn it down because you seldom use it and are happy to wait?
 
The bit I find surprising is that people on this Forum, of all places, seem willing to accept that limitation.
Overclocking is mostly not about throwing money at a problem. It is and always was about getting something cheap or mid-range and clocking it until the sparks fly out.
Of course there are some people who had access to dry ice and liquid nitrogen, but none of those people ran their day-to-day rigs on those types of cooling.

cavemanoc said:
If 10GbE was available at a reasonable price, would you really turn it down because you seldom use it and are happy to wait?
The fact is that 10GbE is currently expensive. There are no two ways about it.
A happy fact is that when the prices do come down some more, I can see people having network-based devices as iSCSI-attached storage using SSDs in the home. Suddenly spending £500 on 512GB of SSD storage doesn't sound so painful when all of the clients on your LAN can connect to and use it.

Until it is cheaper, you either design your network better so you don't have to move huge amounts of data, or pay the premium.
 