Reducing Internet Latency: The Long-term Challenge of Making the Internet Faster

Editor’s Note: This is a guest blog post by Andreas Petlund from Simula Research and the RITE Project. You can read more about the Internet Society’s work related to Internet Latency at

The world is waking up to the need for consistently low latency on the Internet. Some people, like Stuart Cheshire of Apple, have tried for decades to make the technical world think about latency when designing systems and standards. Recently, efforts like the Bufferbloat and RITE (Reducing Internet Transport Latency) projects have been working on some of the problems that increase Internet delays. We’ve also seen great initiatives like the Internet Society Workshop on Reducing Internet Latency. Now papers are being written urging the network community to increase its efforts to realise a near-lightspeed Internet, in order to unlock the potential that such stable low-latency communication will give the apps of the future.

I recently got a question from a journalist when interviewed about our new video explaining sources of Internet latency: when can we have near-zero delay on the Internet? I could tell from his tone that he was hoping for something like “next year”. I could hear his disappointment over the phone when I answered that we can make good progress within the next few years, but that the big changes that would give us the Internet of our dreams could take decades.

So why is the progress towards that goal so slow?

The root lies in the distributed structure of the Internet. If we could deploy a “new” Internet tomorrow, with every component under the same all-powerful control, we would have our low-latency net immediately. The technology is there and has been for a long time.

If we discount this clean-slate utopia, however, the road to low-latency happiness has obstacles in the political, economic and technical domains.

ISPs have businesses to run and customers to keep. They’re very nervous about making changes that may scare their customers away. Since providing network services is a low-margin business, very often _any_ proposed change will meet strong resistance. Such fear helps maintain a status quo that keeps all the actors at the same level. If we can educate and encourage decision makers within such organisations, the chances of deployment will increase.

As for the technical challenges, any solution that will drastically improve the situation has to be widely embraced by the community in order to succeed. Not only that, but it has to support incremental deployment. Some legacy equipment will lurk in the shadows waiting to break your beautifully designed algorithm for low-latency communication. So to increase the chance of success, solutions should be standardised and designed with implicit incentives for people to adopt them, even though there may be a phase of sub-maximal benefits due to the lack of widespread deployment. In the meantime, there are many ways to make smaller, less disruptive changes that can still cut some milliseconds from Internet response times.

An important element in increasing the chance of success for Internet latency reduction is raising public awareness. When the benefits are known, there should be growing pressure on the influential players to help with the low-latency efforts. My feeling is that we’re about to reach a point where Internet latency is no longer a topic only for small groups of people with a special interest. We’re witnessing a growing interest, in large part due to the Bufferbloat project’s work. In RITE, we have just released an informational video aimed at raising public awareness of the topic. We’ve included educational material so that it can easily be used in IT 101 courses, allowing a new generation of technicians to be conscious of latency.

I’m an optimist about this. My hope is that the raised awareness will motivate a collective effort, so that we’ll reach agreement on changes that will transform the Internet without having to wait until 2040.


Why ‘Megafast’ Internet Often Isn’t (Video)

What’s the most important thing determining your satisfaction with your Internet connection? I’m sure a lot of people would say speed, and that’s not surprising, as headline bandwidth figures have long been how many commercial Internet service providers choose to compete for subscribers. 50Mbps must be better than 10Mbps, right? Superfast sounds really, erm, fast, right?

Well, in many cases, increased bandwidth won’t result in significant improvements to user experience and one of the reasons for that is related to latency, or delay. Measurements from Google show that upgrading your connection from 1Mbps to 2Mbps halves web page load times, but quickly thereafter we are into diminishing returns: upgrading from 5Mbps to 10Mbps results in a mere 5% improvement in page load times.
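A toy model makes the diminishing returns easy to see (the parameter values below are illustrative assumptions, not Google’s measurements): page load time is a fixed latency-bound term (round trips that no amount of bandwidth can shrink) plus a bandwidth-bound transfer term.

```python
def load_time_ms(bandwidth_mbps, rtt_ms=50, round_trips=20, page_kb=1000):
    """Toy model: load time = round-trip delays + raw transfer time.
    All default values are hypothetical, chosen only to illustrate the shape."""
    latency_ms = rtt_ms * round_trips          # fixed: bandwidth can't reduce this
    transfer_ms = page_kb * 8 / bandwidth_mbps  # KB -> kilobits, over kbit/ms
    return latency_ms + transfer_ms
```

Each doubling of bandwidth halves only the transfer term, so the absolute and relative gains shrink every time while the latency term stays put; that is why the jump from 1Mbps to 2Mbps feels dramatic and the jump from 5Mbps to 10Mbps barely registers.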

To help more Internet users understand this seeming conundrum, I’ve been working with the lovely folks over at the RITE project to develop a new video, released today, that seeks to explain the difference between bandwidth and delay, and the different ways that latency affects Internet performance.

Why not take a look at the video and then see how you do in the quiz? There are even more resources for teaching and other activities available at the RITE project website.

It’s great to see these educational efforts being launched, as they start to address some of the important actions identified during the Reducing Internet Latency workshop we held last year. There’s still lots more to be done, though, and “getting smarter queue management techniques more widely deployed” remains a priority.
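To see why queue management matters, here is a toy simulation (all numbers are made up for illustration, not taken from any real network) of a plain drop-tail FIFO under overload: making the buffer bigger does not deliver any more packets, it just makes every delivered packet wait longer, which is the essence of bufferbloat.

```python
from collections import deque

def mean_queueing_delay(buffer_pkts, arrivals=12, departures=10, ticks=1000):
    """Drop-tail FIFO under overload (12 packets in, 10 out per tick).
    Returns the mean time, in ticks, that a delivered packet spent queued."""
    queue, delays = deque(), []
    for now in range(ticks):
        for _ in range(arrivals):          # enqueue until the buffer is full,
            if len(queue) < buffer_pkts:   # then silently drop the rest
                queue.append(now)          # remember each packet's arrival tick
        for _ in range(departures):        # drain at the (slower) link rate
            if queue:
                delays.append(now - queue.popleft())
    return sum(delays) / len(delays)
```

With a 10-packet buffer the mean queueing delay stays near zero; with a 1000-packet buffer it climbs towards the buffer size divided by the drain rate. Smarter schemes such as CoDel or PIE attack exactly this by dropping or marking packets early, keeping the standing queue short.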

So check out the video today and share it with your contacts to help us all get educated about this challenge. You can also keep watching our ITM blog for more posts about tackling the scourge of network latency!

Enjoy the video!





Get Happy! – Reducing Internet Latency

Latency is an increasingly important topic for networking researchers and Internet users alike. Whether trying to provide platforms for Web applications, high frequency stock trading, multi-player online gaming or ‘cloud’ services of any kind, latency is a critical factor in determining end-user satisfaction and the success of products in the marketplace. Data from Google, Microsoft, Amazon and others indicate that latency increases for interactive Web applications result in less usage and less revenue from sales or advertising income. Consequently, latency and variation in latency are key performance metrics for services these days.
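Since it is latency and its variation that get tracked, services typically summarise request latencies with percentiles and jitter rather than averages. A minimal sketch (a hypothetical helper, not any particular company’s tooling) might look like this:

```python
import statistics

def latency_metrics(samples_ms):
    """Summarise a list of request latencies (in ms) the way services
    commonly report them: median, tail percentiles, and jitter."""
    ordered = sorted(samples_ms)

    def pct(p):
        # nearest-rank percentile, clamped to valid indices
        idx = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
        return ordered[idx]

    return {
        "p50": pct(50),                       # typical experience
        "p95": pct(95),                       # tail latency users notice
        "p99": pct(99),                       # worst-case-ish tail
        "jitter": statistics.pstdev(ordered), # variation in latency
    }
```

Tail percentiles matter because a user’s page load touches many requests, so even a rare slow response is seen often; that is why p95/p99, not the mean, drive the revenue figures mentioned above.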

But latency reduction is not just about increasing revenues for big business. Matt Mullenweg of WordPress motivates work on latency reduction well when he says, “My theory here is when an interface is faster, you feel good. And ultimately what that comes down to is you feel in control. The [application] isn’t controlling me, I’m controlling it. Ultimately that feeling of control translates to happiness in everyone. In order to increase the happiness in the world, we all have to keep working on this.”

Latency has tended to be sacrificed in favour of headline bandwidth in the way the Internet has been built. Later this year, together with the RITE Project, Simula Research Labs and the TimeIn Project, we are sponsoring a two-day invitation-only workshop that aims to galvanise action to fix that. All layers of the stack are in scope.

More details about the workshop and how to submit a position paper are available here. The deadline for receipt of position papers is June 23.


It's (still) the latency, stupid*