We’ve long been promised the “best superfast broadband in Europe”, “the fastest broadband of any major European country” and more recently just “fast broadband” for every home.
Whatever you want to call it, communications regulator Ofcom defines high-speed broadband as offering download speeds of 30Mbps. The government, meanwhile, sees things a little differently. It most recently pledged 10Mbps speeds for all by 2020, with the caveat that people in rural areas currently have to apply for this “fast connection”.
The future of faster internet is often expected to come from someone laying fibre optic cable. It’s certainly reliable once it’s there, but laying it involves digging up roads and can be expensive to get in the ground.
Some, however, see the next frontier of internet connectivity coming in the form of gigabit Wi-Fi – and one company that’s hoping to crack the technology is, of course, Facebook.
Gigabit Wi-Fi’s promise is that it makes use of higher frequencies to send data through the air at rates as fast as 7Gbps. Once someone can properly commercialise it, that’ll mean ridiculously fast Wi-Fi for lots of people.
But the problem today is that these smaller waves are easily knocked out on their journey – even by a drop of water – so the software needed to ensure a steady path needs to be damn smart.
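The physics behind that fragility can be sketched with the free-space path loss formula: signal loss grows with the square of the frequency, so the kind of millimetre-wave bands typically used for multi-gigabit wireless (60GHz is a common choice, though the exact band is an assumption here, not something the article specifies) shed far more signal over the same distance than ordinary Wi-Fi frequencies – before rain or obstacles even enter the picture.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Compare ordinary 5 GHz Wi-Fi with a 60 GHz millimetre-wave link over 100 m.
print(f"5 GHz:  {fspl_db(100, 5e9):.1f} dB")   # ~86.4 dB
print(f"60 GHz: {fspl_db(100, 60e9):.1f} dB")  # ~108.0 dB
# The 60 GHz link loses roughly 21.6 dB more (20*log10(60/5)) over the same
# distance -- which is why the routing software has to work so hard.
```

That gap is pure geometry; real-world absorption by rain and oxygen at 60GHz makes the picture worse still.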
Facebook’s new distributed network routing technology Open/R seeks to solve that exact problem. It has been created by the company’s Connectivity Lab, which was launched back in 2014 to help Facebook meet its ambitious Internet.org vision of bringing good internet to everyone. Yes, that’s the same lab that has touted the idea of internet-beaming drones.
This rather-more-practical-sounding system has been developed to power the company’s multi-node wireless internet hardware, called Terragraph, which it hopes to roll out into densely populated areas.
Facebook’s Petr Lapukhov explains: “We build our networks segmented into multiple partitions, such that a failure in one partition does not affect the others. Furthermore, we work on rapid failure detection and mitigation, and we view the ability to roll new updates quickly as a factor that enables us to fix things more efficiently.”
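The core idea Lapukhov describes – detect a failure quickly, then route traffic around it – can be illustrated with a toy sketch. This is illustrative Python, not Open/R’s actual code; the topology and node names are invented for the example.

```python
from collections import deque

def shortest_path(links, src, dst):
    """BFS over an undirected set of (node, node) links; returns a path or None."""
    adj = {}
    for a, b in links:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in adj.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# A small mesh with two redundant routes from A to D.
links = {("A", "B"), ("B", "D"), ("A", "C"), ("C", "D")}
print(shortest_path(links, "A", "D"))  # one of the two 3-hop routes

# Simulate a detected failure on the B-D link and reroute around it.
links.discard(("B", "D"))
print(shortest_path(links, "A", "D"))  # ['A', 'C', 'D']
```

In a real mesh like Terragraph the hard part is doing this detection and recomputation fast enough, across many nodes at once, that users never notice a dropped link.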
Open/R will soon be open-source so that others can rapidly prototype new ways of delivering high-speed internet in unreliable conditions without having to head underground.