What Really Makes Your Internet Fast?
When internet service providers advertise their services, they often boast about blazing-fast network speeds, usually measured in megabits or gigabits per second. This figure, known as bandwidth, is a key selling point and is commonly equated with how “fast” your internet connection is. But there’s another, less publicized factor that can have an even greater impact on your online experience: latency.
Whether you’re a home user streaming movies or a business relying on cloud-based tools and video conferencing, understanding the difference between bandwidth and latency is essential to knowing what actually determines internet speed and to getting the most from your network’s performance.
What Is Bandwidth?
Bandwidth is the maximum amount of data that can be transferred over a network connection or internet connection in a given amount of time. It’s typically measured in megabits per second (Mbps) or gigabits per second (Gbps). The higher the bandwidth, the more data can flow through your connection at any one time.
I typically use the analogy of a road or highway: bandwidth is the number of lanes on the highway. The more lanes a highway has, the more cars can travel alongside each other at the same time.
What about the terms “Shared Bandwidth” and “Broadband”?
Many internet services sold to the masses are broad, shared networks, or “broadband networks”. In Australia this normally refers to the National Broadband Network (NBN) or to mobile 5G and 4G networks. These networks operate by connecting many computers or sites to a single network. Although they may have a backbone (the underlying network infrastructure, typically fibre) measured in gigabits, that medium and its bandwidth are shared by all users. This can create issues on many broadband products during “peak times”. At these times of day, typically 6pm-11pm for residential services, the shared bandwidth becomes so saturated that latency increases significantly, creating lag or slowness across all services.
What is Latency?
Latency is the time it takes for data to travel from your device to its destination (like a website or server) and back again. It’s typically measured in milliseconds (ms) and is often referred to as a ping or round-trip time in networking contexts.
In simple terms, latency is the delay between when you request something online—like clicking a link or starting a video call—and when you start to receive a response or data from the other side. This is what causes lag on a phone call, a video call or while gaming.
Using our road analogy, network latency is the speed limit on the road: it is how fast traffic, or in this case data, can travel along the road or medium.
The delay for data travel is influenced by several key factors:
First, every network device that your data passes through—such as your home router, your internet service provider’s equipment, and various network switches and servers—adds a small amount of processing time. These are known as hop delays, and while each one may only add a few milliseconds, they can accumulate quickly.
The most noticeable contributor to latency, however, is physical distance. Data must physically travel across the globe, and while it moves at nearly the speed of light through fibre-optic cables, even light takes time to cover vast distances. For example, sending data from Australia to the United States and back typically results in a latency of over 150 milliseconds. If you’re using a satellite connection, where data must travel to space and back, latency can exceed 500 milliseconds (0.5 seconds)—enough to cause noticeable delays in video calls or online gaming.
In Australia, another factor is our geographic spread. Our capital cities are separated by large distances, which can introduce significant latency even within the country. For example, a connection from Sydney to Perth may experience higher latency than one from Sydney to Melbourne due to the sheer distance involved.
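To see how far distance alone goes towards those numbers, here is a rough back-of-the-envelope sketch. It assumes light travels at roughly 200,000 km/s in fibre (about two thirds of its speed in a vacuum) and uses approximate route distances, so treat the results as illustrative only.

```python
# Back-of-the-envelope propagation delay: distance alone, no equipment delays.
# The speed and route distances below are rough assumptions, not exact figures.
SPEED_IN_FIBRE_KM_PER_S = 200_000   # roughly two thirds of the speed of light

routes_km = {
    "Sydney to Melbourne": 900,
    "Sydney to Perth": 4_000,
    "Sydney to Los Angeles": 12_000,
}

for route, distance_km in routes_km.items():
    # Round trip = there and back, converted from seconds to milliseconds
    round_trip_ms = (2 * distance_km / SPEED_IN_FIBRE_KM_PER_S) * 1000
    print(f"{route}: about {round_trip_ms:.0f} ms of pure propagation delay")
```

Add the hop delays from the routers and switches along the way and you quickly arrive at the 150-plus milliseconds typically seen between Australia and the United States.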
To better understand how this affects your experience, check out our Latency to Australian Cities Chart, which illustrates the typical delays between our cities.
Is Higher Bandwidth or Lower Latency Better?
Well, that depends…
It primarily depends on what you are using the connection for. For real-time applications, latency is extremely important, and these applications often run just fine on low-bandwidth connections. For moving large amounts of data, such as file transfers, high-resolution video and data backups, we need high network bandwidth, and latency is of little concern.
When do I need low latency?
Latency is a critical factor in real-time applications. For services like:
- Phone Calls – VoIP Calls
- Video Calls – Video Conferencing
- Online Gaming
- Working From Home
- Cloud Computing
We aim to keep latency as low as possible—ideally under 30 to 50 milliseconds. Staying within this latency range helps ensure smooth, responsive interactions without noticeable lag or delay.
To support this, most real-time applications are designed with efficiency in mind. They use specialized protocols and techniques to minimize the amount of data transmitted, keeping network traffic low and requiring very little throughput. This is why bandwidth, while important, is often not the limiting factor in these scenarios.
For example, a typical VoIP (Voice over IP) call may only require 8 kbps for acceptable quality, and up to 64 kbps for high-quality audio comparable to traditional PSTN (Public Switched Telephone Network) calls. These low data requirements allow real-time applications to perform well even on modest internet connections—provided latency is kept in check.
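As a quick illustration, the rough calculation below (which ignores codec and packet overhead) shows how little data a one-hour 64 kbps call moves and how small a slice of a 100 Mbps connection it occupies:

```python
# Rough, illustrative arithmetic only: how much data a VoIP call moves.
# 64 kbps is the high-quality figure mentioned above; real codecs add
# varying amounts of overhead, so treat these as ballpark numbers.
call_bitrate_kbps = 64            # high-quality voice codec
call_length_s = 60 * 60           # a one-hour call

data_megabytes = call_bitrate_kbps * call_length_s / 8 / 1000
print(f"One hour of 64 kbps voice is roughly {data_megabytes:.0f} MB")   # ~29 MB

# Compared with a 100 Mbps connection, the call uses a tiny fraction of the pipe
share_of_link = call_bitrate_kbps / (100 * 1000) * 100
print(f"Share of a 100 Mbps link: {share_of_link:.3f}%")                 # ~0.064%
```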
When do I need high bandwidth?
High bandwidth is required for transferring large amounts of data or for many simultaneous transfers. For example:
- Data transfers from one location to another
- Backups of significant data (many gigabytes to terabytes)
- Streaming of 4K video
- Streaming to multiple devices
In these environments, when planning for a reliable internet connection, it’s important to calculate how much bandwidth is actually needed.
First, remember that bandwidth is measured in bits per second (bps), while most of us are more familiar with bytes when dealing with file sizes. Since 1 byte = 8 bits, you can estimate how many bytes you can transfer per second by dividing the bandwidth by 8.
Bandwidth Example:
A 100 Mbps (megabits per second) connection can theoretically have a data transfer rate of:
- 100 ÷ 8 = 12.5 megabytes per second (MB/s)
However, in real-world conditions, you won’t get the full 12.5 MB/s due to network overhead, protocol headers, packet loss and signal noise. A more realistic estimate might be around 10 MB/s.
From there, you can scale up:
- 10 MB/s × 60 seconds = 600 MB per minute
- 600 MB × 60 minutes = 36,000 MB (or 36 GB) per hour
This gives you a rough idea of how much data can be moved over your connection in a given time frame.
Now, if you have multiple devices or applications running simultaneously—like video calls, streaming, cloud backups, or large file transfers—you’ll need to add up the bandwidth each one uses at the same time to determine the total bandwidth required. This helps you size your “data pipe” appropriately so performance doesn’t suffer during peak usage.
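As a rough sketch of that sizing exercise, the snippet below converts an advertised speed into usable throughput and adds up the needs of a few concurrent applications. The 100 Mbps link, the 80% efficiency allowance and the per-application figures are illustrative assumptions only; substitute your own values, for example from the table in the next section.

```python
# Rough capacity-planning sketch; all figures are illustrative assumptions.
LINK_MBPS = 100          # advertised connection speed
EFFICIENCY = 0.8         # allow roughly 20% for protocol overhead and noise

# Approximate per-application needs in Mbps while running simultaneously
simultaneous_loads_mbps = {
    "4K video stream": 20,
    "Video conference": 3,
    "Cloud backup": 10,
    "General browsing": 1,
}

usable_mbps = LINK_MBPS * EFFICIENCY
usable_mb_per_s = usable_mbps / 8                       # bits -> bytes
print(f"Usable throughput: about {usable_mb_per_s:.1f} MB/s, "
      f"or about {usable_mb_per_s * 3600 / 1000:.1f} GB per hour")

total_demand_mbps = sum(simultaneous_loads_mbps.values())
print(f"Combined demand: {total_demand_mbps} Mbps of {usable_mbps:.0f} Mbps usable")
if total_demand_mbps > usable_mbps:
    print("Demand exceeds capacity: expect congestion, queuing and higher latency.")
```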
Typical Bandwidth Requirements for Common Internet Activities
Here’s a table showing the typical bandwidth requirements for various internet activities. These values are approximate and can vary depending on the specific application, quality settings, and network conditions:
| Activity | Bandwidth Requirements |
|---|---|
| Phone Calls and VoIP | 6 Kbps – 64 Kbps |
| Video Calls and Video Conferencing | 500 Kbps – 3 Mbps, depending on video quality |
| Teams Calls | 30 Kbps – 1.5 Mbps, depending on video quality |
| Zoom Calls | 60 Kbps – 1.8 Mbps, depending on video quality |
| Remote Desktop – RDP – Windows 365 | 500 Kbps – 2 Mbps |
| Standard Video Streaming – Live Streaming | 1.5 Mbps – 4 Mbps |
| 4K Video Streaming | 15 Mbps – 25 Mbps |
| YouTube | 500 Kbps – 4 Mbps |
| Cloud Computing | 1 Mbps – 10 Mbps |
| Internet Browsing – Web Pages | 200 Kbps – 1 Mbps |
| | 20 Kbps – 50 Kbps |
What is Lag? Why is my Internet Connection Slow?
Under normal circumstances, if you’re using less network bandwidth than your maximum available bandwidth, latency remains low and stable. This means your connection has sufficient bandwidth and can handle the flow of data efficiently, and real-time applications like video calls, gaming, or streaming will perform smoothly.
However, problems arise when the demand for data exceeds the available bandwidth—whether that’s on your home network, your broadband provider’s network infrastructure, or somewhere on the Internet along the path to the destination server. When a network experiences high throughput, a bottleneck forms and we get network congestion. Data is delayed and latency increases, and while latency remains high the user experience is degraded, with noticeable performance issues for real-time communication.
What is a bottleneck?
A bottleneck occurs when more data is trying to pass through a connection than it can handle. Just like cars piling up at a toll booth, data packets begin to queue up, waiting for their turn to be transmitted. This queuing introduces delays, which increases latency—the time it takes for data to reach its destination and return.
The result? You may notice:
- Lag in video or voice calls
- Delayed responses in online games
- Buffering during streaming
This delay is commonly referred to as lag, and it’s a direct symptom of congestion caused by insufficient bandwidth.
Our Traffic – Road Analogy
Imagine a highway with four lanes (representing your bandwidth) and a speed limit of 100 km/h (representing your latency). Under normal traffic conditions, cars move freely at full speed. But during a holiday weekend, the highway becomes congested. Even though the speed limit hasn’t changed, the sheer volume of cars slows everything down. This is exactly what happens to your data during a bandwidth bottleneck.
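The same idea can be shown with a toy calculation. This is only an illustrative model (a fixed arrival rate and a fixed link capacity, nothing like a real traffic simulator), but it captures why exceeding capacity makes delay grow rather than just level off:

```python
# Toy bottleneck model: packets arrive at a fixed rate, the link can only send
# a fixed number per second, and anything above capacity queues up as backlog.
LINK_CAPACITY_PPS = 100        # packets the link can transmit per second

def queue_growth(arrival_pps: int, seconds: int = 5) -> None:
    backlog = 0
    print(f"\nArrival rate: {arrival_pps} packets/s")
    for second in range(1, seconds + 1):
        backlog = max(0, backlog + arrival_pps - LINK_CAPACITY_PPS)
        # Added delay = time needed to drain the packets already queued
        delay_ms = backlog / LINK_CAPACITY_PPS * 1000
        print(f"t={second}s  backlog={backlog:4d} packets  added delay about {delay_ms:.0f} ms")

queue_growth(arrival_pps=90)    # below capacity: no queue, no extra latency
queue_growth(arrival_pps=120)   # 20% over capacity: the delay climbs every second
```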
How Can I Reduce Latency?
- Use wired connections (Ethernet) instead of Wi-Fi.
- Keep cable runs short – make sure the cable connecting your router to the internet is as short as possible.
- Choose servers closer to your location (many apps let you select a region; see the sketch after this list).
- Upgrade your router or firmware.
- Avoid peak usage times when networks are congested.
- Use content delivery networks (CDNs) that cache content closer to users.
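For the server-selection tip above, one rough way to compare regions is to time a TCP connection to each candidate endpoint. The host names below are placeholders you would swap for the regional servers your application actually offers, and a proper ping or the application’s own network test will give a more precise figure.

```python
# Compare rough round-trip latency to a few candidate regions by timing how
# long a TCP connection takes to open (a stand-in for ping, which usually
# needs elevated privileges). The hosts below are placeholders.
import socket
import time

CANDIDATES = {
    "Sydney": "sydney.example.com",
    "Singapore": "singapore.example.com",
    "US West": "uswest.example.com",
}

def connect_time_ms(host: str, port: int = 443) -> float:
    """Time a TCP handshake to host:port in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass   # connection opened; we only care about the elapsed time
    return (time.perf_counter() - start) * 1000

results = {}
for region, host in CANDIDATES.items():
    try:
        results[region] = connect_time_ms(host)
    except OSError:
        results[region] = float("inf")   # placeholder host unreachable

# Lowest connect time first: usually the best region to pick
for region, rtt in sorted(results.items(), key=lambda item: item[1]):
    print(f"{region:10s} {rtt:8.1f} ms")
```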
What will make my internet faster?
By understanding both bandwidth and latency, you should now be able to determine if and how you can improve your online experience. If a particular experience is slow, is it because:
- The service is on the other side of the world and therefore the latency will be high?
- Do I have a household full of 4K streaming TVs causing network congestion at home?
- Is everything connected to my wireless network, and is my Wi-Fi bandwidth saturated or relying on old technology?
- Am I connected to a primarily home-based ISP and therefore experiencing slowness in the evenings while trying to work?
- Are my backups running in the middle of the day, saturating my bandwidth and increasing latency and lag?
From this point, you can enhance your internet experience by either reducing latency or increasing your available bandwidth, depending on your specific needs. If you’re unsure where to start, our team at PIP has been delivering expert advice on key network performance metrics and providing in-depth network analysis for over 30 years. We’re here to help—feel free to reach out anytime.
In addition to consulting, PIP operates its own internet service, offering business-grade NBN solutions tailored for both homes and enterprises. We also provide dedicated fibre NBN and Business Ethernet connections across Australia, ensuring high-performance connectivity wherever you are.