10Gb for the home

I've been sniffing around after helping out with a network install and was wondering when we can expect to start seeing 10Gb Ethernet in the home market. You can already get the Intel dual-port PCIe cards for around £250ish, so all we're really missing is the first mobo manufacturer to add 10Gb Ethernet to the SATA 3 and USB 3 shininess.
 
I'm used to moving pretty big files around from my server (WHS) to other machines (7-15GB files) and often need to pass those on to friends (it's not unusual to fill a 1TB drive) - I'm getting tired of having to crawl under the desk to plug their drives into the server to do the transfer. ATM there's limited scope for it, however...

Given that motherboards have already appeared with SATA 3 (600MB/s) and USB 3.0 (400MB/s) (and add-in cards are dirt cheap), and we're starting to see SSDs push transfer speeds towards these limits, it's not going to be long before we see external SSDs plugging in via eSATA or USB and sustaining transfer rates of nearly 5Gb/s (500MB/s). I for one would be very happy if I could do that across the network from the comfort of my living room instead of having to scrabble around on the server - the whole point of WHS is that you're not supposed to be poking around with the actual server box all the time!
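For anyone wanting to sanity-check those figures, here's a rough sketch of where the MB/s numbers come from (assuming 8b/10b encoding on SATA/USB and ignoring further protocol overhead, so these are optimistic ceilings):

```python
# Rough peak-rate figures for the interfaces mentioned above.
# SATA 3 and USB 3.0 use 8b/10b encoding, so only ~80% of the signalling
# rate carries data; the Ethernet figures are already quoted as data rate.
# Real-world protocol overhead knocks all of these down a bit further.

def peak_mb_per_s(signal_gbps, encoding_efficiency=1.0):
    """Peak payload rate in MB/s for a given signalling rate in Gb/s."""
    return signal_gbps * encoding_efficiency * 1000 / 8

print(f"Gigabit Ethernet: {peak_mb_per_s(1):.0f} MB/s")       # ~125 MB/s
print(f"USB 3.0:          {peak_mb_per_s(5, 0.8):.0f} MB/s")  # ~500 MB/s raw, ~400 MB/s usable
print(f"SATA 3:           {peak_mb_per_s(6, 0.8):.0f} MB/s")  # ~600 MB/s
print(f"10Gb Ethernet:    {peak_mb_per_s(10):.0f} MB/s")      # ~1250 MB/s
```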

To my mind that means that it won't be long before we need networks that can keep up (or I'll have to invest in a helmet to stop the cracked heads under the desk ;))
 
A 15gig file would only take 2 minutes anyway, if you need it in a minute, set up teaming. Although if you're using Windows Home Server, you'll be limited to single disk speeds, which would be about 120MB/s sequential anyway...
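Rough working for that, assuming ~125 MB/s of usable gigabit bandwidth (a quick sketch; the teamed figure is a best case that a single transfer often won't reach):

```python
# How long a 15 GB file takes at various sustained transfer rates.
# The 2x gigabit figure assumes the team balances a single transfer,
# which one TCP stream often won't manage in practice.

def transfer_minutes(size_gb, rate_mb_per_s):
    return size_gb * 1000 / rate_mb_per_s / 60

rates = [
    (120, "single-disk sequential"),
    (125, "gigabit line rate"),
    (250, "2x gigabit team, best case"),
    (1250, "10GbE line rate"),
]
for rate, label in rates:
    print(f"15 GB at {rate} MB/s ({label}): {transfer_minutes(15, rate):.1f} min")
```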

The cost per port is also about 20x higher than gigabit. This doesn't make sense on any level. Unless you really need more than 4gbit (500MB/s!) then you're better served by teaming gigabit ports.

Still, if you really want to waste your money, please do and then report back :)

Edit: Also many of the 8-port 10Gbit switches I've played with fail to switch at full speed to all ports. I think the worst crapped out at about 5Gbit/s per port, highest I've seen for anything under £3,000 is 8Gbit/s. Obviously these kind of prices are nothing compared to high-end kit.
 
We'll see it in the home if the prices get close to gigabit levels, which probably won't happen for years because currently there's very little home demand for those sorts of speeds.
Personally I could make use of it - with an 8-drive NAS server and SSDs in my gaming comp I find myself moving large game installs about fairly frequently, and my gigabit caps out at 100MB/s, well below what the storage systems are capable of. However, saving around 20 minutes a week isn't worth hundreds of pounds to me.
 
Many motherboards have two Ethernet ports; you could consider using them both (you can do various things to load-balance between them). Obviously it then requires that the receiving end does the same.
 
A 15gig file would only take 2 minutes anyway, if you need it in a minute, set up teaming. Although if you're using Windows Home Server, you'll be limited to single disk speeds, which would be about 120MB/s sequential anyway...

The cost per port is also about 20x higher than gigabit. This doesn't make sense on any level. Unless you really need more than 4gbit (500MB/s!) then you're better served by teaming gigabit ports.

Still, if you really want to waste your money, please do and then report back :)

Edit: Also many of the 8-port 10Gbit switches I've played with fail to switch at full speed to all ports. I think the worst crapped out at about 5Gbit/s per port, highest I've seen for anything under £3,000 is 8Gbit/s. Obviously these kind of prices are nothing compared to high-end kit.

Sums it up really. The other thing is that most of the cheap 10Gbit-capable units are actually doing cut-through switching rather than store-and-forward, which is designed for specific datacenter and enterprise applications and is likely to be a headache for home use.
 
Most hard drives and home NICs can't saturate gigabit, let alone need 10GbE. Maybe in a few years; it's not necessary now.
 
What might be interesting is Intel's Light Peak system - this runs at 10Gb/s and prices should be reasonable since they're trying to make it a standard to replace USB.
Cables are good for 100m too, so even if switching gear is expensive you could just run an ad-hoc connection to your file server and route the heavy traffic over that.
 
While all the points about cost/benefit are noted, as alluded to by Zarf, the tech is always pushing on, and as soon as the hardware arrives we find new and exciting ways to use it. That's true of all areas of tech. Many people argued against multi-core CPUs (slightly different case, I know), and the same old arguments were trotted out when SATA 3 and USB 3 were discussed 18 months ago - and look at the market now.

If the current NAND shortage can be dealt with and Intel's new process works out as well as hoped, we could potentially see big reductions in prices over the next 18 months. Once OEM manufacturers start specc'ing mid-level rigs with SSDs, I think we'll see a big change in the way we use our hard drives and, more importantly, what we expect from them.

I was thinking more in terms of the next 18 months, but TBH, I'm surprised we haven't started seeing these in use on the forum already. Given that you can get a switch for around £250, and a card for about the same, you could have end to end 10Gb for around £750... seems like a lot, but people have, and do, pay that same premium for a lot less benefit in the CPU and GPU forums - The i7 975 comes in at £786.99, and certainly doesn't offer 10x the performance of a chip in the <£100 bracket - and yet people willingly pay for it, same with the GPUs - just look at the expense of the top end cards, and again - nowhere near the 10x performance increase.

I guarantee if Fermi came out and was 10x faster than the 5870 people wouldn't bat an eyelid at a £750 price tag (and more I suspect) even if it gave no discernible improvement in games.

The bottom line is that networking isn't 'sexy' the way GPUs and CPUs are, so people tend to be more 'sensible' in their choices - necessary, rather than shiny is the driving factor...

Bring on network overclocking I say - GPUs are sooooo yesterday ;)
 
I'm really curious as to where you think it's possible to get a 10Gbit switch for £250 - the cheapest I've yet seen are rubbish SMC units going for about $3500 for an 8 port unit. The cheapest adaptor is around £300 and both those prices exclude transceivers.

It's not being done because the vast majority of people have absolutely no use for it, and even those that do are generally more likely to wait 10 minutes when they do need to transfer 50Gbit of data rather than pay thousands to be able to do it in one minute (if their storage was capable of it, which it isn't).
 
If you search for "MINT 3Com 4200G Switch 24 Port Gigabit 10Gb 3CR17661-91" you will find one - that's only got 2x10Gb ports - but it's a start - there are Dell ones that are very similar with 8 ports (2x10Gb)

(but thanks for the tone - much appreciated ;) )

are generally more likely to wait 10 minutes when they do need to transfer 50Gbit

Granted, that's true for gigabits - but if you meant 50GB (as in the volumes I often have to transfer), then even that small an amount will take over an hour - longer than I'd like to wait when mates pop round to pick up a few things!
 
Well, I wouldn't have counted switches with just two 10Gb uplink ports - if it's only two ports you need, you'd be better off running a cable directly between the devices...

And at full gigabit speed 50GB will transfer in around 7 minutes, so call it 10 minutes allowing for overheads. If it's taking substantially longer then something's wrong with your setup. You'd have to be on 100Mbit for it to take an hour.
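For the record, the Gbit/GB mix-up worked through (a quick sketch at the ~125 MB/s gigabit line rate, ignoring overheads):

```python
# 50 Gbit and 50 GB differ by a factor of 8; neither takes anywhere
# near an hour at full gigabit speed.

line_rate_mb_per_s = 125  # gigabit Ethernet, ignoring protocol overhead

for size_gb, label in [(50 / 8, "50 Gbit (= 6.25 GB)"), (50, "50 GB")]:
    minutes = size_gb * 1000 / line_rate_mb_per_s / 60
    print(f"{label} at gigabit line rate: ~{minutes:.1f} minutes")
```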
 
Well, I wouldn't have counted switches with just two 10Gb uplink ports - if it's only two ports you need, you'd be better off running a cable directly between the devices...

Is that easy to set up in a Windows 7 environment? Is it still as simple as setting up a direct connection in the network setup wizard? TBH if you can get anywhere near decent speeds out of a direct line I'd be tempted to do that between my server and my media PC - without the cost of a switch I will be seriously tempted when the add in cards drop to around the £100 mark! (Is it possible to daisy-chain devices like we used to do in the old days (when we had those old T-shaped network plugs - name escapes me - prior to RJ45 and switches)?)

And at full gigabit speed 50GB will transfer in around 7 minutes, so call it 10 minutes allowing for overheads. If it's taking substantially longer then something's wrong with your setup. You'd have to be on 100Mbit for it to take an hour.

Yeah - my face is red - bad public maths there - but I have had a couple of mates pitch up with terabyte drives and leave with little free space, so as you can imagine that does take a while (especially as my network isn't configured particularly well and only gives between 50-70MB/s - about half the theoretical bandwidth of my gigabit :( )
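Putting rough numbers on "does take a while" (a sketch; the 10GbE figure is just a hypothetical disk-limited guess, not anything measured):

```python
# Time to move roughly 1 TB at various sustained rates.

size_gb = 1000  # roughly a full 1 TB drive

rates = [
    (60, "current setup, ~50-70 MB/s"),
    (110, "well-tuned gigabit"),
    (500, "hypothetical disk-limited 10GbE"),
]
for rate, label in rates:
    hours = size_gb * 1000 / rate / 3600
    print(f"~1 TB at {rate} MB/s ({label}): ~{hours:.1f} hours")
```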

Granted the above won't be affected until we start seeing large nand based drives at cheap prices... but it will happen!
 
Is that easy to set up in a Windows 7 environment? Is it still as simple as setting up a direct connection in the network setup wizard? TBH if you can get anywhere near decent speeds out of a direct line I'd be tempted to do that between my server and my media PC - without the cost of a switch I will be seriously tempted when the add in cards drop to around the £100 mark!

I don't know; every other Ethernet generation has supported a point-to-point link with copper crossover cables, so I assume a 10Gbit card with RJ45 connections will too (the only one I know of yet is the Intel E10G41AT2, which is around £300 each). I haven't tried it, but it'd be odd if not.

Of course a dual port 1Gig will cost a third of that and still double your speed over 1Gbit (with caveats) - and you'll need a pair of SSDs in RAID0 to exhaust that bandwidth.
 
Even mechanical drives are pushing past the 100MB/s read and write - the next batch of SSDs are pushing the 200MB/s mark - bearing in mind that SSDs are almost doubling in speed every time the capacity doubles (simplistic I know) - it won't be long before point to point transfers between 2 single drives can saturate even a teamed dual gigabit board/card (although I will be trying teaming when I get my new board up and running - could be fun!)
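Roughly where that crossover sits (a sketch that assumes the team scales perfectly to 2x gigabit, which a single transfer often won't achieve):

```python
# When does a single drive saturate a teamed pair of gigabit ports?
# Assumes the team scales perfectly to 2 x ~125 MB/s, which one
# transfer stream often won't achieve in practice.

teamed_gigabit_mb_per_s = 2 * 125  # ~250 MB/s best case

drives = [
    ("fast mechanical drive", 120),
    ("current SSD", 200),
    ("next-generation SSD (guess)", 300),
]
for name, rate in drives:
    verdict = "would saturate" if rate >= teamed_gigabit_mb_per_s else "won't saturate"
    print(f"{name} at ~{rate} MB/s {verdict} a 2x gigabit team (~{teamed_gigabit_mb_per_s} MB/s)")
```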

You could use the same argument against USB 3 - with a theoretical throughput of nearly 400MB/s you could argue that there is nothing out there in home consumer land that needs the headroom. But remember these jumps are designed to cover large periods of time (it's been 10 years since USB 2 was introduced) - it's only logical that each 'standard' hardware jump is large enough to both warrant the upgrade and protect for the foreseeable future, especially in infrastructure equipment - exactly why I'd be laying CAT7 when I get round to ripping up the floorboards!
 
Even mechanical drives are pushing past the 100MB/s read and write - the next batch of SSDs are pushing the 200MB/s mark - bearing in mind that SSDs are almost doubling in speed every time the capacity doubles (simplistic I know) - it won't be long before point to point transfers between 2 single drives can saturate even a teamed dual gigabit board/card (although I will be trying teaming when I get my new board up and running - could be fun!)

I think it'll be far longer than that personally; enterprise gear is generally a generation ahead of home users in storage, and it isn't at those kinds of speeds yet.

Anyway, the switching is the issue: there aren't cheap multi-port 10Gbit units available yet, and those at the bottom end tend to be cut-through units, which aren't really suitable for home use either.
 