Network performance: Bandwidth and latency
1. Network performance: Bandwidth and latency
Now that you’ve been introduced to some of the fundamentals of networking, let’s explore how network performance is measured. Two important terms in networking are bandwidth and latency. Let’s define them both. Bandwidth is a measure of how much data a network can transfer in a given amount of time. This rate of data transfer is typically measured in megabits per second (Mbps) or gigabits per second (Gbps). Generally speaking, higher bandwidth allows a computer to download information from the internet more quickly.

One way to think of bandwidth is to picture water flowing through a pipe. The bandwidth is the volume of water the pipe can carry per second; a wider pipe can handle more water. An internet service provider may offer a home connection of 100 megabits per second to over 1 gigabit per second, while a data center may have bandwidth from 10 to 100 gigabits per second.

High bandwidth is useful when sending a large amount of data per second, such as streaming high-definition video, but it’s not the only important measure of network performance. For users playing real-time multiplayer games online, latency matters much more. Network latency is the amount of time it takes for data to travel from one point to another. Often measured in milliseconds, latency—sometimes called lag—describes delays in communication over a network. Going back to our flowing water analogy, latency is the delay from the moment the pipe is opened until water starts flowing out the other end.

Ideally, latency should be as close to zero as possible. However, because it results from the physical distance that data must travel—through wires, fiber optics, routers, and more—to reach its destination, each hop along the way adds a small amount of latency to the communication. No matter how much data you can send and receive at once, it can only travel as fast as network latency allows.
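To make the bandwidth definition concrete, here is a minimal sketch of the underlying arithmetic: transfer time is simply data size divided by bandwidth. The function name and the example figures (a 40-megabit file, 100 Mbps vs. 1 Gbps links) are illustrative choices, not values from the text.

```python
def transfer_time_ms(size_megabits: float, bandwidth_mbps: float) -> float:
    """Time in milliseconds to move `size_megabits` of data over a link,
    ignoring latency and protocol overhead."""
    return size_megabits / bandwidth_mbps * 1000

# A 40-megabit (5 MB) file on a 100 Mbps home connection vs. a 1 Gbps link:
print(transfer_time_ms(40, 100))   # 400.0 ms
print(transfer_time_ms(40, 1000))  # 40.0 ms
```

Doubling bandwidth halves the transfer time for the same payload, which is why the wider-pipe analogy works: more volume moves per second.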
Imagine an image file took just 10 milliseconds to download with a high-bandwidth connection, but a user had to wait 100 milliseconds before receiving the first byte of data. In this case, the latency—how much time it took for data packets to travel from one point to another in the network—accounted for most of the time. Cloud computing and mobile technologies have made it easier for developers to reach global audiences, but high latency can drag down an application's performance. Websites run slower for some users depending on their physical location, even if both the user and the server have excellent bandwidth. So the farther a user is from a server, or the more fragmented the network is, the greater the latency. Reducing latency is essential to reaching users faster.

2. Let's practice!
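As a quick check of the image-download scenario above, the total time to receive the file is the latency plus the transfer time. The 10 ms and 100 ms figures come straight from the example in the text:

```python
# Total time to receive the file = latency + transfer time.
latency_ms = 100   # wait before the first byte arrives
transfer_ms = 10   # download time once bytes start flowing

total_ms = latency_ms + transfer_ms
latency_share = latency_ms / total_ms

print(total_ms)                 # 110
print(round(latency_share, 2))  # 0.91 -> latency accounts for ~91% of the wait
```

Even with excellent bandwidth, the user waits 110 ms in total, and roughly nine-tenths of that wait is latency—exactly why distance to the server can matter more than raw connection speed.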