Latency is a delay in the transmission of information over the internet. It affects how responsive your connection feels: pages take longer to start loading, updates and notifications arrive late, and real-time activities like calls and games suffer. There are a few things you can do to help reduce latency on your internet connection:
- Use a proxy service: A proxy hides your IP address by sending your traffic through another server instead of directly to its destination. Be aware that this extra hop usually adds latency rather than reducing it, so it only helps if the proxy offers a shorter or less congested route to the sites you use.
- Use a VPN: A VPN uses encryption to keep your data private and secure while it travels over the internet and hides the websites you visit from prying eyes. Like a proxy, it routes your traffic through an extra server, which typically adds some latency, so pick a VPN server that is geographically close to you.
- Use a low-latency broadband connection: If possible, choose a connection type with inherently low latency, such as cable or fiber rather than satellite, and prefer a wired connection over Wi-Fi. This reduces latency for every device on your network and improves overall internet performance.
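A quick way to check whether any of these changes actually helps is to compare ping times before and after. For example, on Linux or macOS (the host name here is just a placeholder; any reliable site or your ISP's gateway works):

ping -c 10 example.com

(On Windows, use "ping -n 10" instead.) Run it once with the VPN or proxy off and once with it on, or once on Wi-Fi and once plugged in, and compare the average round-trip time reported at the end; the lower the average, the lower the latency.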
There is more to an Internet connection’s speed than just its bandwidth. This is especially true with satellite Internet connections, which can offer speeds of up to 15 Mbps – but will still feel slow.
Latency can be an issue with all Internet connections and networks. Wired network connections tend to have the lowest latency, while wireless connections generally have higher latency.
Image Credit: Timo Newton-Syms on Flickr
Latency vs. Bandwidth
Internet connections, including satellite Internet connections, are advertised with speeds like “up to 15 Mbps.” You may look at a satellite Internet connection offering this speed and assume the experience of using it would be comparable to the experience of using a 15 Mbps cable Internet connection, but you would be wrong.
- Bandwidth: Bandwidth is the amount of data that can be transferred per second; it determines how fast data flows once it is moving.
- Latency: Latency is delay. It is how long it takes data to travel between its source and destination, measured in milliseconds.
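A rough rule of thumb: the time it takes to fetch something over the network is approximately the latency (multiplied by however many round trips the request needs) plus the amount of data divided by the bandwidth. Extra bandwidth shrinks the second part, but no amount of it shrinks the first.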
Latency in the Real World
Let’s say you are browsing the web on different types of connections. Here’s how latency would “feel”:
- Satellite Internet Connection (High Bandwidth, High Latency): You would click a link on a web page and, after a noticeable delay, the web page would start downloading and show up almost all at once.
- Theoretical Connection (Low Bandwidth, Low Latency): You would click a link on a web page and the web page would start loading immediately. However, it would take a while to load completely, and you would see images load one by one.
- Cable Internet Connection (High Bandwidth, Low Latency): You would click a link on a web page and the web page would appear almost immediately, downloading all at once.
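To put purely illustrative numbers on those three cases: suppose a page needs about 1 MB (8 megabits) of data and roughly four round trips to load. On a 15 Mbps satellite link with 600 ms latency, the transfer itself takes only about half a second, but the round trips add around 2.4 seconds of waiting. On a 2 Mbps link with 20 ms latency, the waiting is negligible but the transfer alone takes about 4 seconds. On a 15 Mbps cable link with 20 ms latency, both the waiting and the transfer finish in well under a second.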
Latency always manifests as a delay. For example, if you are having a Skype chat with someone on a high-latency Internet connection, you would be out of sync with each other. You would have to pause in between sentences or you would end up talking over each other thanks to the delay.
If you were playing an online game, your actions would be delayed and events happening in the game would have a noticeable delay before they reached your computer, rather than feeling near-instantaneous. For example, if you were playing a first-person shooter game on a high-latency connection, you would shoot at someone on your screen, but the delay means they would be long gone by the time your projectile got there.
Image Credit: MLibrary on Flickr
What Causes Latency
Both bandwidth and latency depend on more than your Internet connection – they are affected by your network hardware, the remote server’s location and connection, and the Internet routers between your computer and the server.
Packets don’t travel through routers instantly. Each router a packet has to travel through introduces a delay of a few milliseconds, which can add up if the packet has to travel through many routers to reach the other side of the world.
However, some types of connections – like satellite Internet connections – have high latency even in the best conditions. It generally takes between 500 and 700ms for a packet to reach an Internet service provider over a satellite Internet connection.
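Most of that delay is physics rather than congestion: traditional satellite Internet uses satellites in geostationary orbit, about 35,786 km above the equator. A request has to travel up to the satellite and back down, and the reply has to make the same trip, roughly 143,000 km in total. At the speed of light (about 300,000 km/s), that alone takes about 480 ms before any terrestrial routing or processing is counted.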
Latency isn’t just a problem for satellite Internet connections, however. You can probably browse a website hosted on another continent without noticing latency very much, but if you are in California and playing an online game with servers located in Europe, the latency may be more perceptible.
Measuring Latency
You can measure the latency between your computer and a remote server with the ping command, which reports the round-trip time of a small packet. In our example, it takes 11 milliseconds for traffic to make the round trip between our computer and Google’s servers. If we had a satellite Internet connection, this could be as high as 700ms.
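For example, on Linux or macOS you could run:

ping -c 4 google.com

(On Windows, plain "ping google.com" sends four packets by default.) The "time=" value on each reply line, and the average in the summary at the end, are the round-trip latency in milliseconds.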
To show the impact of distance on latency, we can ping Baidu – a Chinese search engine. Baidu doesn’t have any servers in North America, so our computer has to communicate with its servers in China. The latency between our computer and Baidu’s servers is 228ms.
When we ping our local router, we see a latency of 1ms. Our router is close and we can connect directly without going through other routers.
You can see how much latency each router – or “hop” – is adding with the traceroute command.
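On Linux or macOS, run:

traceroute google.com

(On Windows, the equivalent command is tracert.) Each numbered line of output is one hop, showing the router's address and the round-trip times measured to it; a sudden jump in those times tells you where along the path the latency is being added.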
Latency is always with us; it’s just a matter of how significant it is. At low latencies, data should transfer almost instantaneously and we shouldn’t be able to notice a delay. As latencies increase, we begin to notice more of a delay.