1.4Tbps internet!

So BT have successfully tested a 1.4Tbps connection over 260 miles

http://www.bbc.co.uk/news/technology-25840502

To me, that is just a pointless speed. What do you think is the maximum speed that will ever be useful in your lifetime?

The only possible use for such a large connection would maybe be introducing extremely cheap internet, whereby one connection serves 10 houses.

Huge tower blocks / hotels having every apartment or room with Gb connections? lol
 
I remember not that long ago when ADSL1 was "way more than you'd ever need", when people had sub-50k connections. As speeds increase, so will requirements. YouTube at 240p seems like a joke now, but it would not have been possible on 56k dial-up and a struggle on the original 512k ADSL. Now it's 1080p and beyond.
 
I don't think any resources were diverted away from digging trenches to work in a lab with a supplier on some new tech.
 
There's no such thing as a pointless speed, any advances that can be used to get more data down the same fibres is incredibly useful, our reliance on connectivity is only going to keep increasing.

Not if the quality of the line is compromised. There's no guarantee that, even years after such speeds become common, the technology will be efficient to a high standard.

Anyways, isn't the idea that in the future, 'light' will be the medium and everything will be available instantaneously?
 
This isn't "broadband" rofl...

It's just a further advancement of existing technology (CWDM/DWDM etc.) already found in service-provider core networks. The "alien super channel" is basically a mix of 40Gbit and 100Gbit channels combined to provide an aggregate total bandwidth of 1.4Tbps.

It's designed to allow service providers more flexibility in adding capacity in their networks, not for connecting end users or businesses... well, not for another 10-20 years at least.
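Quick back-of-envelope on the aggregation above. The exact channel mix in the trial isn't stated in this thread, so the counts below are purely a hypothetical combination that happens to hit 1.4Tbps:

```python
# Hypothetical mix of 40Gbit and 100Gbit DWDM channels summing to an
# aggregate 1.4Tbps "super channel". Counts are illustrative only.
channels = {"100G": 10, "40G": 10}   # hypothetical channel counts
rates_gbps = {"100G": 100, "40G": 40}  # per-channel line rate, Gbit/s

aggregate_gbps = sum(count * rates_gbps[name] for name, count in channels.items())
print(aggregate_gbps)  # 1400 Gbit/s, i.e. 1.4 Tbps
```

Any mix of channel counts that sums to 1400 Gbit/s gives the same headline figure; the novelty is in how tightly the channels are packed, not the total.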
 

Quite, it's got sod all to do with home broadband; the BBC's usual low standard of technology reporting strikes again...
 
Very true! :)

Hopefully they can utilise this in the future to increase both speed and capacity for much larger areas of the country than they currently do.

Not going to happen! The point of this is to provide a path forward from the current 10/40/100Gig backbone circuits in their core network in 5 years' time. It's got nothing to do with increasing speed or capacity for users, and everything to do with providing more scalable and economical backhaul links...

If BT need a 1.4Tb/s link today they bundle 14 x 100Gig circuits and job done. The speed is nothing special (LINX did 1.97Tb/s at peak yesterday); it's the way it's delivered...
 
Pointless for consumers...

Ideal for Corporate level backbone

44 uncompressed HD films in one second! *drool*

Pointless for corporate backbone too really - I'm responsible for the design of one of the busiest content hosting and delivery networks around (there aren't definitive figures but we have to be top-20, probably top-10) and I can't see any use for this in the next 5-10 years at least...

The biggest networks around are still only really in the infancy of 100Gig waves today, but despite that today's equipment can, and on transatlantic links regularly does, transit 1.6Tb/s over a single fibre pair (160 x 10Gig waves is normal these days).

The Emerald Express cable system goes live this year and has an initial capacity of 100 x 100Gig waves on each of 4 fibre pairs, for 40Tb/s total between Dublin and New York.

Most corporates don't even deploy regular WDM, most have no requirement beyond 10/40Gig...the purpose of this technology is so divorced from end users, business or consumer, it's practically irrelevant.
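For anyone wanting to check the arithmetic in those figures (all units Gbit/s):

```python
# Sanity-checking the capacity figures quoted above.
transatlantic_gbps = 160 * 10          # 160 x 10Gig waves per fibre pair
emerald_per_pair_gbps = 100 * 100      # 100 x 100Gig waves per pair
emerald_total_gbps = emerald_per_pair_gbps * 4  # across 4 fibre pairs

print(transatlantic_gbps)   # 1600 Gbit/s = 1.6 Tb/s
print(emerald_total_gbps)   # 40000 Gbit/s = 40 Tb/s
```

Both numbers comfortably exceed the 1.4Tb/s headline, which is the poster's point: the trial's total speed is not the news.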
 

Isn't that the wrong mentality though? "Oh, who needs it now, what's the point of it, so let's just build something that's adequate for what we need at the moment."

I think that's the wrong approach. Obviously it might make financial sense for a company paying out of its own pocket, but in terms of vision and mentality it's wrong on all levels.

South Korea has started work on 5G, to be commercially available in 2020. They don't need it (full HD downloads on mobile phones take 40 seconds at the moment, 5G will bring it down to 1 second), but they have calculated that it will enable business worth $billions to their national economy if they get it first.
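Out of curiosity, here's what those download times imply for throughput. The film size isn't given in the post, so the 5 GB figure below is an assumption:

```python
# Implied speeds from the "40 seconds now, 1 second on 5G" claim,
# assuming a hypothetical 5 GB (decimal gigabytes) full-HD download.
film_bits = 5 * 8e9               # 5 GB expressed in bits

current_gbps = film_bits / 40 / 1e9   # today's 40-second download
fiveg_gbps = film_bits / 1 / 1e9      # the claimed 1-second 5G download

print(current_gbps, fiveg_gbps)  # 1.0 Gbit/s vs 40.0 Gbit/s
```

So the claim implies roughly a 40x jump in effective per-user throughput, whatever the real file size turns out to be.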

It opens up possibilities that people cannot conceptualise right now, the 'unknown unknowns'.

I'm sure 30 years ago 48kbps was adequate for business needs and there was 'no point' in anything faster, yet here we are today with speeds millions of times faster. No one needs to watch full 1080p on YouTube, and certainly nobody even thought about it 10 years ago, but now people are willing to pay money for it.

Likewise, businesses did not need all this speed a decade ago because the Cloud didn't even exist as a concept; now it's all becoming so common that it forces people to rethink what speeds they need, etc.


EDIT: forgot to say, it's actually the same mentality as in any big public project. Look at Crossrail: platforms are designed to existing tube platform lengths because they expect a certain demand level never to be exceeded. Sounds reasonable, until you learn that the tube upgrade's projected 'new capacity' wasn't expected to be saturated until 2017 or later (or something like that), yet it has already been saturated since the end of 2012. It's short-sightedness to a massive degree.
 
lol I can't believe how some people don't think it's worth doing.

It's like saying no point researching faster space travel as we'll get there eventually.

As I said on the other page, faster is always better as long as you don't sacrifice quality of the line.
 

I'm not saying it's not going to be useful in future, I'm just pointing out that it's lab research which achieves something we can do today, just slightly differently. People have been transmitting 1.4Tb/s over a fibre pair for a couple of years now, over 10 times the distance and in production.

What this research is doing is increasing the optical density, i.e. decreasing the optical spectrum required per bit per second. The speed is actually unimportant here; they could have transmitted 100Mb/s at the same density and it would have meant the same thing.

This development is technically important (and it's actually far more to do with optical physics than 'networking'), but it doesn't change anything today. What it means is that, when it's standardised down the road, it will be possible to increase optical density in backhaul. That's nice in theory, but we can do 10Tb/s per fibre pair today, and if you're laying 1 pair then laying 12 isn't significantly more work, and is considerably cheaper and less trouble than bleeding-edge optical fiddling. As a result, the density increase won't be economically useful for 10+ years.
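The metric that actually improved is spectral efficiency: bits per second per Hz of optical spectrum. The trial's actual GHz figures aren't quoted in this thread, so both spectrum widths below are illustrative only:

```python
# Spectral efficiency = data rate / optical spectrum occupied.
# Both bandwidth figures are made-up examples, NOT the trial's numbers.
def spectral_efficiency(bps, hz):
    """Return bits per second per Hz of spectrum."""
    return bps / hz

# Same 1.4Tb/s payload, squeezed into less spectrum by tighter packing:
fixed_grid = spectral_efficiency(1.4e12, 350e9)  # e.g. conventional grid
flex_grid = spectral_efficiency(1.4e12, 250e9)   # e.g. denser "flexgrid"

print(round(fixed_grid, 1), round(flex_grid, 1))  # 4.0 vs 5.6 bits/s/Hz
```

That's why the poster says the speed is unimportant: the same density gain would show up at 100Mb/s, since only the ratio of rate to spectrum matters.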

This isn't the news you think it is and that's because the BBC regurgitated a press release they were given and didn't understand properly so they added the words 'faster broadband' at random.

This doesn't enable new services, it's *only* 1.4Tb/s, there will be a 40Tb/s cable system crossing the atlantic on a sub 60ms latency path this year - that's news right now and might change things in the next year or two.

My point, overall, is mainly that it's rubbish journalism which is completely misleading.
 

It's *NOT* faster. We can do several times that speed today, the BBC just doesn't understand technology properly...

It's important, but only for very long-term technical reasons. What it does is slightly reduce the optical bandwidth required for a given amount of data, but optical bandwidth isn't something we're actually short of currently. In 10 years we *might* be, and then it'll be useful. But this will absolutely not give you faster broadband...
 
What do we really need over a single connection at a consumer level?

Right now, enough for 6 students sharing a house to all stream/download an HD film off Netflix, Sky Store etc while their PCs all download updates and while playing a video game, i.e. 6 of each, while allowing a little leeway for when someone wants to check IMDB or leaves a game downloading on Steam etc. Of course, all 6 (and a couple of friends) will be scanning Facebook every few minutes, along with needing bandwidth for emails/Kik/iMessages. There should also be enough for at least two of them to be on Skype. They will also have files syncing back and forth between various cloud storage systems and Spotify playing some music. One of them may even be doing some work...

IMO an 80Mb fibre line will just about cover this, as in reality they're not likely to be doing all of the above simultaneously - but I've known student houses run into slowdowns with 80Mb fibre.

In the near future, the equivalent but accounting for 6 Ultra-HD streams, more cloud storage, potentially applications running directly from the web, higher bandwidth Spotify etc services and so on will all be needed. More of the above will also become "always on" so peak bandwidth will be more constant.

On a single-user basis, I've had 1Mbps (****-poor ADSL at my student gaff), 8Mbps (typical ADSL at my mum's), 40Mbps (fibre at my current flat), and 100 and 1000Mbps (both in halls) connections in the last 2 years. 1 is clearly useless. 8 is useable but not a lot of fun for multi-tasking. 40 is adequate for a small family with relatively high use; for me and my girlfriend it does the job just fine. 100 is where you start to really appreciate the throughput for larger downloads. 1000 (1Gbps) is definitely overkill, but god do you appreciate it when it's there.

I found that while with a slow connection I considered an hour to download a film to be acceptable, and even with 100Mbps a few minutes was okay, once I'd tried a gigabit connection (part of an experiment in the uni computing department) I really realised how good a near-instant connection is. EVERYTHING is available right now, and while it seems impatient for someone to say they don't want to wait 5 minutes for a film to start streaming, I think you really do realise that the ideal world would have no streaming/buffering times whatsoever. If it's possible to do it, why shouldn't we have download speeds higher than we can fully utilise?
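To put those speeds side by side, here's the film-download time at each tier, assuming a hypothetical 4 GB file (decimal gigabytes; real films vary):

```python
# Download time for a hypothetical 4 GB film at each connection speed
# mentioned above. 1 Mbit/s = 1e6 bits per second.
file_bits = 4 * 8e9  # 4 GB in bits

for mbps in (1, 8, 40, 100, 1000):
    seconds = file_bits / (mbps * 1e6)
    print(f"{mbps:>4} Mbps: {seconds / 60:6.1f} min")
```

At 1Mbps that's roughly 9 hours; at gigabit it's about half a minute, which is where the "near-instant" feeling comes from.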
 