What Is Latency (Ping)?
Latency — often called ping — is the time it takes for a data packet to travel from your device to a server and back. It's measured in milliseconds (ms) and represents the fundamental speed limit of your internet connection for interactive tasks.
Bandwidth (Mbps) tells you how much data your connection can carry. Latency tells you how quickly it responds. You can have a 1 Gbps connection with 200ms latency — plenty of bandwidth to download files quickly, but painfully slow for gaming or video calls because every action takes a fifth of a second to register.
When gamers talk about "ping," they're talking about latency. When someone says their VoIP call has "delay," they're talking about latency. It's the most intuitively felt network metric — you literally experience it as the gap between doing something and seeing the result.
What's a Good Latency (Ping)?
| Latency | Quality | Experience |
|---|---|---|
| < 20ms | Excellent | Near-instant response. Ideal for competitive gaming, real-time trading, live production. |
| 20–50ms | Good | Responsive. Great for all gaming, video calls, VoIP. Most users won't notice any delay. |
| 50–100ms | Fair | Playable for gaming, fine for video calls. Fast-twitch competitive games (Valorant, CS2) become harder. |
| 100–150ms | Noticeable | Visible input lag in games. VoIP calls have a slight but noticeable conversation delay. |
| 150ms+ | High | Obvious delay. Gaming is sluggish. Voice calls have awkward pauses. Real-time collaboration suffers. |
Latency by Application
- Competitive gaming (FPS, fighting games): Under 30ms ideal, under 60ms acceptable
- Casual gaming (MMO, strategy): Under 100ms is fine
- Video calls: Under 150ms for natural conversation
- VoIP: Under 150ms (ITU recommendation)
- Web browsing: Under 100ms feels instant, but latency has minimal impact on page loads compared to bandwidth
What Causes High Latency?
Physical Distance
Light in a fiber optic cable travels at about 200,000 km/s. A round trip from New York to London (roughly 5,500 km each way, 11,000 km there and back) takes at minimum 55ms — and that's pure propagation delay, before any processing. Connecting to a server on another continent will always have higher latency than a local one. This is why game servers are regional and why PacketProbe offers multiple test server locations.
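That physical floor is easy to sanity-check yourself. A minimal sketch, using the article's 200,000 km/s figure for light in fiber (the distance and function name here are just for illustration):

```python
# Back-of-the-envelope check on the speed-of-light floor for round-trip time.
FIBER_SPEED_KM_PER_S = 200_000  # light in fiber: roughly 2/3 of c in vacuum

def min_rtt_ms(one_way_km: float) -> float:
    """Lowest possible round-trip time over fiber, ignoring routing and processing."""
    return (2 * one_way_km / FIBER_SPEED_KM_PER_S) * 1000

# New York -> London is roughly 5,500 km each way
print(f"{min_rtt_ms(5_500):.0f} ms")  # -> 55 ms
```

Any real-world measurement will sit above this floor, since routers, non-straight-line fiber paths, and last-mile links all add to it.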
Network Hops
Every router between you and the destination adds processing time — typically 1-5ms per hop. A typical internet connection traverses 10-20 hops, each adding a small delay. Inefficient routing (where packets take a longer path than necessary) compounds this.
Network Congestion
When routers are busy, packets queue up and wait to be processed. During peak hours, this queuing delay can add tens or even hundreds of milliseconds. This is why your ping might be great at 3 AM but terrible at 8 PM.
Wi-Fi Overhead
Wireless connections add latency through the Wi-Fi protocol itself — channel contention, retransmissions, and processing. A typical Wi-Fi connection adds 2-10ms of latency compared to Ethernet, and more under congestion or interference.
Your ISP's Network
Some ISPs route traffic inefficiently, use overloaded peering points, or add latency through traffic management systems. The "first hop" latency from your modem to your ISP's network is often the most impactful and the hardest to control.
VPN or Proxy
VPNs route your traffic through an intermediate server, adding at least one extra hop and potentially sending your data on a longer physical path. This typically adds 10-50ms of latency depending on the VPN server location.
How to Test Latency
The most common way to test latency is the ping command in your terminal, which sends ICMP echo packets and measures round-trip time. However, ICMP-based tests have limitations:
- Some routers deprioritize ICMP traffic, giving artificially high results
- ICMP doesn't reflect how your actual application traffic is handled
- Its summary gives latency and a basic loss percentage, but doesn't characterize jitter the way real-time applications experience it
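One common workaround for ICMP's limitations is to time TCP handshakes instead, which needs no special privileges and isn't deprioritized the way ICMP often is. A minimal sketch (the host and port are examples; this measures the TCP connect path, which can differ slightly from what ping reports):

```python
import socket
import statistics
import time

def tcp_rtt_ms(host: str, port: int = 443, samples: int = 5) -> list[float]:
    """Approximate round-trip latency by timing TCP handshakes.

    Avoids ICMP (which some routers deprioritize) and needs no raw
    sockets, but measures the TCP connect path rather than echo packets.
    """
    rtts = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            pass  # the handshake completing is the measurement
        rtts.append((time.perf_counter() - start) * 1000)
    return rtts

# Example usage (any reachable HTTPS host works):
#   rtts = tcp_rtt_ms("example.com")
#   print(f"min {min(rtts):.1f} ms, avg {statistics.mean(rtts):.1f} ms")
```

This still only captures round-trip time; it says nothing about how your traffic behaves under sustained load, which is where dedicated tests help.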
PacketProbe measures latency using WebRTC data channels — the same protocol that actual web applications use for real-time communication. This gives you latency measurements that accurately reflect what your browser-based games, video calls, and VoIP apps actually experience.
The test also simultaneously measures packet loss and jitter, giving you the complete picture of your connection quality in a single test.
How to Reduce Latency
1. Connect via Ethernet
Eliminate the 2-10ms Wi-Fi overhead and the variability that comes with it. This is the easiest and most reliable improvement.
2. Choose Closer Servers
In games, select the server region closest to you. For VoIP, choose a provider with local points of presence. For PacketProbe, use the Auto (nearest) option to test against the closest server.
3. Reduce Network Congestion
Pause downloads, limit streaming on other devices, and close bandwidth-heavy applications. Every bit of congestion on your local network adds queuing delay.
4. Enable QoS or SQM
Quality of Service (QoS) settings prioritize latency-sensitive traffic. Smart Queue Management (SQM) — available on modern routers and OpenWrt firmware — is even better, actively managing queues (via algorithms like fq_codel or CAKE) to keep delay low even when the link is fully loaded.
5. Restart Your Router
Consumer routers can develop memory leaks and bloated connection tables over time that add latency. A restart clears these. Consider scheduling weekly reboots.
6. Disable VPN for Gaming
If you're using a VPN and experiencing high ping, try disconnecting it. The extra routing hop adds latency. Some gaming VPNs (like ExitLag or WTFast) claim to reduce latency by optimizing routing, but results vary.
7. Switch ISPs or Plans
If your first-hop latency (to your ISP's gateway) is consistently high, the problem is in their network. Fiber connections typically have the lowest latency, followed by cable, then DSL. Traditional geostationary satellite internet has inherently high latency (500ms+) due to the distance to orbit; low-Earth-orbit services fare considerably better.
8. Use a Gaming Router
Gaming routers with features like geo-filtering (connecting only to nearby servers) and traffic prioritization can meaningfully reduce latency for gaming. Look for routers with SQM or fq_codel support.
Latency vs. Bandwidth
These are the two most commonly confused networking concepts:
- Bandwidth (measured in Mbps) = how much data can flow at once. Think of it as the width of a pipe.
- Latency (measured in ms) = how quickly data starts flowing. Think of it as the length of the pipe.
A wide, long pipe (high bandwidth, high latency) downloads large files quickly but feels sluggish for interactive tasks. A narrow, short pipe (low bandwidth, low latency) feels responsive but can't handle heavy streaming.
For real-time applications like gaming and VoIP, latency matters far more than bandwidth. A 10 Mbps connection with 20ms ping will feel better for gaming than a 1 Gbps connection with 150ms ping. Speed tests focus on bandwidth; PacketProbe focuses on what actually determines your real-time experience.
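You can see why latency dominates small interactive exchanges with a little arithmetic. A simplified model, using the two hypothetical connections from the paragraph above (the round-trip count and payload size are illustrative assumptions; real protocols add slow start, parallelism, and so on):

```python
def exchange_time_ms(payload_kb: float, bandwidth_mbps: float,
                     rtt_ms: float, round_trips: int = 4) -> float:
    """Rough time for a small interactive exchange.

    Models a few protocol round trips (handshakes, request/response)
    plus the raw transfer time for the payload itself.
    """
    transfer_ms = (payload_kb * 8 / (bandwidth_mbps * 1000)) * 1000
    return round_trips * rtt_ms + transfer_ms

# A 50 KB exchange on the two connections from the text:
print(exchange_time_ms(50, 10, 20))     # 10 Mbps, 20ms ping  -> 120.0 ms
print(exchange_time_ms(50, 1000, 150))  # 1 Gbps, 150ms ping -> ~600.4 ms
```

The gigabit connection moves the payload 100× faster, yet the exchange takes five times longer, because round trips, not transfer time, dominate small payloads.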
Latency vs. Jitter vs. Packet Loss
- Latency — How long a round trip takes (the baseline delay)
- Jitter — How much that delay varies (consistency of the connection)
- Packet loss — When data never arrives at all (reliability of the connection)
All three contribute to what users perceive as "lag." High latency feels like playing in slow motion. High jitter feels like stuttering. Packet loss feels like things skipping or disappearing. A truly good connection has low values in all three — and the only way to know is to test them.
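The three metrics fall out of the same set of probe results. A minimal sketch, treating jitter as the standard deviation of round-trip times (one common convention; RTP-style tools instead use differences between consecutive packets) and a lost probe as `None`:

```python
import statistics

def summarize(rtts_ms: list) -> dict:
    """Summarize probe results: each entry is an RTT in ms, or None if lost."""
    received = [r for r in rtts_ms if r is not None]
    return {
        "latency_ms": statistics.mean(received),   # baseline delay
        "jitter_ms": statistics.stdev(received),   # variation in delay
        "loss_pct": 100 * (len(rtts_ms) - len(received)) / len(rtts_ms),
    }

samples = [21.0, 23.0, 22.0, None, 24.0]  # one lost probe
print(summarize(samples))  # latency 22.5 ms, jitter ~1.29 ms, loss 20%
```

A connection can look healthy on any one of these numbers and still feel bad; it's the combination that predicts real-time experience.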