Low latency is vital to live video streaming use cases such as second-screen experiences, live reporting, and online video games, where it ensures the best possible user experience.
Here's a big secret: when it comes to media, "live" rarely really means "live". Let's say you're at home watching a live show and seeing an audience member jump onstage. The audience at the venue saw it happen at least 30 seconds before you did.
This is because it takes time to move chunks of data (the pieces into which video is split in many multimedia formats) from one place to another. The delay between a camera capturing video and that video being displayed is called latency.
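End-to-end ("glass-to-glass") latency is simply the sum of the delays at each stage of the pipeline. Here is a minimal sketch of that budget; the stage names and durations are illustrative assumptions, not measurements of any real system:

```python
# Illustrative glass-to-glass latency budget (all values are assumptions).
# Each stage of a live streaming pipeline adds delay, in seconds.
pipeline = {
    "capture_and_encode": 1.0,   # camera plus encoder
    "first_mile_upload": 0.5,    # contribution feed to the ingest server
    "packaging": 6.0,            # e.g., waiting for a full HLS segment
    "cdn_delivery": 0.5,         # edge distribution
    "player_buffer": 18.0,       # e.g., three 6-second segments buffered
    "decode_and_render": 0.5,
}

glass_to_glass = sum(pipeline.values())
print(f"Estimated glass-to-glass latency: {glass_to_glass:.1f} s")
```

Note how the packaging and player-buffer stages dominate: that is why protocol and chunk-size choices, discussed below, matter more than raw network speed.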
So, what is low latency if several seconds of latency is considered normal? It's a subjective term. By default, Apple's popular HLS streaming protocol has a latency of 30-45 seconds.
When people talk about low latency, they usually mean getting it down to single-digit seconds. However, the term also encompasses what is often called real-time streaming (we're talking milliseconds here).
Also read: What are streaming protocols, and how do they work?
No one wants noticeably high latency, of course, but in what contexts does low latency really matter?
The typical 30-45 second delay is manageable for most streaming scenarios. Returning to our concert example, it's irrelevant if the guitarist broke a string 36 seconds ago and you just found out.
But for some streaming use cases, latency is a critical business consideration. For instance, Amazon found that users' purchases dropped by 1% for every additional 100 milliseconds of waiting.
Similarly, according to Google's calculations, they could lose 8 million daily searches if they slowed down their search results by just four-tenths of a second.
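Those two findings translate directly into back-of-the-envelope arithmetic. A hedged sketch: the baseline revenue figure below is a made-up example, used only to show how the quoted rates compound:

```python
# Back-of-the-envelope cost of latency, using the figures quoted above.

# Amazon's finding: 1% of purchases lost per additional 100 ms of waiting.
baseline_daily_revenue = 1_000_000  # assumption: $1M/day baseline
added_latency_ms = 300
lost_revenue = baseline_daily_revenue * 0.01 * (added_latency_ms / 100)
print(f"Estimated lost revenue: ${lost_revenue:,.0f}/day")

# Google's estimate: ~8 million daily searches lost per 0.4 s of slowdown.
added_latency_s = 0.4
lost_searches = 8_000_000 * (added_latency_s / 0.4)
print(f"Estimated lost searches: {lost_searches:,.0f}/day")
```

Even a few hundred milliseconds, in other words, has a price tag measured in tens of thousands of dollars per day at that scale.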
Let's look at some streaming use cases where low latency is undeniably essential.
The second screen refers to the simultaneous consumption of television and the Internet: for example, watching TV programs or commercials while using smartphone or tablet apps to interact with the content (opinions, polls, etc.).
If you're watching an event on both a TV and a second-screen app, a latency mismatch is obvious at a glance, and it quickly becomes irritating.
Imagine that a sports channel offers a second-screen application so that you can see alternate camera angles and exchange comments with other users. The game's winning score is shown on the TV but isn't transmitted to the app until a minute later. By then, the moment for exchanging reactions in the app has passed.
However, the sweet spot here isn't the ultra-low "real-time" latency we'll discuss next. This is because there is also latency for the television broadcast. If you're watching on digital cable, as most families do, the transmission latency can be up to six seconds. Your second screen app only needs to match this level of latency to deliver a fantastic experience in sync with your TV content.
This is where ultra-low latency live streaming comes into play. We've all seen televised interviews where the reporter is talking to someone at a remote location. The latency in the exchange of messages results in long pauses, sometimes with the two parties talking over each other.
This is because latency cuts both ways: it may take a second for the reporter's question to reach the respondent and another second for the respondent's answer to return to the reporter.
This conversation can quickly become uncomfortable. When prompt responses are important, the acceptable limit is about 150 milliseconds of latency in each direction. This time frame is short enough for smooth conversation without awkward pauses.
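The 150 ms figure is a per-direction budget, so the delay a viewer actually perceives is the round trip: question out, answer back. A quick sketch of the arithmetic (the satellite-link figure is an illustrative assumption):

```python
# Conversational latency budget: about 150 ms per direction is the comfort limit.
ONE_WAY_LIMIT_MS = 150

def round_trip_ms(one_way_ms: float) -> float:
    """Delay from the start of a question to the start of the reply."""
    return 2 * one_way_ms

def is_comfortable(one_way_ms: float) -> bool:
    return one_way_ms <= ONE_WAY_LIMIT_MS

print(round_trip_ms(150), is_comfortable(150))    # within budget: natural conversation
print(round_trip_ms(1000), is_comfortable(1000))  # satellite-style link: long pauses
```

A 150 ms one-way delay still yields a 300 ms round trip, which is roughly the edge of what feels like a natural exchange.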
Activities like auctions and sports betting are exciting because of their fast pace, and that speed requires real-time streaming. For example, horse races have traditionally been broadcast via satellite to other tracks around the world, whose viewers place bets online.
Satellite delays can be costly. Ultra-low latency streaming eliminates these troublesome delays and reduces dropouts. Likewise, online auctions are big business, and any delay could mean that bids fail to register in time. Fractions of a second make all the difference.
Anyone who has ever screamed, "This game is lagging!" at a screen knows that timing is critical for players. Latency below 100 milliseconds is mandatory. No one wants to discover that their streaming service has them shooting at enemies that aren't there anymore.
Now that you know what low latency is and when it matters, you're probably wondering: how do you provide low latency streaming? As with most things in life, low-latency streaming involves trade-offs.
You will have to balance three factors to find the right mix:
The streaming protocol you choose makes a big difference. Let's analyze this:
Apple HLS is among the most widely used streaming protocols thanks to its reliability, but it is unsuitable for true low-latency streaming. This is because HLS is an HTTP-based protocol that transmits video in chunks: every video file is split into smaller video "chunks" (segments) so the player can buffer enough video for smooth playback.
With 6-second chunks, at least 6 seconds of video must be generated, encoded, transmitted, decoded, and buffered on the viewer's player before playback can begin, so latency in this case is at least 6 seconds. Because each chunk must be completed before it can be delivered, chunk size plays a vital role in latency.
Apple HLS's default chunk duration is 10 seconds, which can lead to latency of up to 45 seconds. Tuning the configuration can reduce this significantly, but not enough for an ultra-low latency scenario. Worse, the smaller you make the chunks, the more buffering your viewers will experience (because less video gets buffered on the device).
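The relationship between chunk duration and minimum latency can be sketched as a simple estimate. This is a rough model under stated assumptions: players commonly buffer around three segments before starting playback, and real-world figures also depend on encoder and network delays not modeled here:

```python
# Rough HLS latency estimate. Assumption: the player buffers ~3 segments
# before starting playback (actual player behavior varies).
def estimated_hls_latency(segment_seconds: float, buffered_segments: int = 3) -> float:
    # One full segment must be produced before it can be served,
    # then the player buffers several more segments before playback starts.
    return segment_seconds * (1 + buffered_segments)

print(estimated_hls_latency(10))  # default 10 s segments -> 40 s minimum
print(estimated_hls_latency(2))   # shorter segments -> lower latency, more rebuffer risk
```

The model makes the trade-off explicit: halving the segment duration roughly halves the floor on latency, at the cost of a thinner buffer against network hiccups.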
RTMP and WebRTC are the standards for low-latency streaming.
Another important consideration is your streaming server. You'll need a streaming technology that gives you fine-grained control over latency and video quality, along with as much flexibility as possible.
ESPN is a prime example of a business that has embraced low-latency streaming to improve customer engagement. With the rise of online streaming services, traditional cable TV providers like ESPN have had to adapt to keep up with changing consumer habits.
To stay competitive, ESPN introduced low latency streaming for their live sports events, allowing viewers to watch games in real-time with minimal delay. This provided a more immersive and engaging experience for viewers, who could interact with the game and each other in real-time.
The result was a significant increase in engagement and viewership for ESPN. By providing a seamless streaming experience, ESPN was able to keep its customers engaged and interested in its content, ultimately driving revenue and growth for the business.
Zoom is another business that has embraced low-latency streaming to enhance its customer support services. With the rise of remote work, video conferencing has become essential for businesses to communicate with their employees and customers.
However, traditional video conferencing technologies can have significant latency, resulting in delays and poor-quality calls. To address this issue, Zoom introduced low latency streaming for their video conferencing services, providing users with a more seamless and immersive experience.
The result was a significant increase in user adoption and satisfaction with Zoom. By providing a real-time and high-quality video conferencing experience, Zoom increased productivity and reduced the need for in-person meetings, ultimately driving growth for the business.
Amazon Prime Video is a business that integrated low latency streaming to improve customer engagement and retention. With the rise of streaming services like Netflix and Hulu, Amazon Prime Video has had to compete to retain its customers and attract new ones.
Amazon Prime Video introduced low latency streaming for their live streaming services to stay competitive, allowing viewers to watch events in real-time with minimal delay. This provided a more immersive and engaging experience for viewers, who could interact with the event and each other in real-time.
The result was a significant increase in engagement and retention for Amazon Prime Video. By providing a seamless and immersive streaming experience, Amazon Prime Video kept its customers engaged and interested in its content, ultimately driving revenue and growth for the business.
Even in the cloud computing era, server latency remains a problem for many companies. To keep the issue from growing and harming the business, it is worth paying attention to a few tips.
To send data — be it images, music, videos, or documents — you need a good communication infrastructure: in this case, the internet.
This network infrastructure comprises devices and constraints such as routers, cables, and available bandwidth. Since auditing it can be a complicated task, it is always advisable to enlist a qualified IT professional.
In addition to finding out where latency occurs in the system, it is essential to identify its type, as the problem can have several causes.
Some issues can be resolved quickly; in other scenarios, a specialist is essential for diagnosing the connection problem.
A server typically slows down when many users make requests simultaneously. A common example is an e-commerce company during events like Black Friday.
To prevent slowdowns or crashes as simultaneous accesses spike, the ideal solution is a service that offers automatic scalability, increasing bandwidth and performance when demand suddenly rises.
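The idea behind automatic scalability can be illustrated as a simple policy: add capacity before the current servers saturate. This is an illustrative toy, not any particular cloud provider's API; the capacity and utilization numbers are assumptions:

```python
import math

# Toy autoscaling policy (illustrative; real platforms expose this as managed config).
REQUESTS_PER_SERVER = 500     # assumed capacity of one server
TARGET_UTILIZATION = 0.7      # scale out before servers are saturated

def servers_needed(concurrent_requests: int) -> int:
    """Number of servers so that each stays below the target utilization."""
    capacity_per_server = REQUESTS_PER_SERVER * TARGET_UTILIZATION
    return max(1, math.ceil(concurrent_requests / capacity_per_server))

print(servers_needed(300))     # a normal day
print(servers_needed(20000))   # a Black Friday spike
```

The headroom factor (scaling at 70% utilization rather than 100%) is what absorbs the sudden ramp-up before new capacity comes online.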
CDN technology is an excellent ally in reducing the latency of web applications.
A CDN stores copies of the data at geographically distributed locations, connecting users to the closest possible server and increasing the speed of data transfer.
Latency, then, is inevitable. With the correct techniques and tools, however, it can be reduced in online services, raising the quality of customer service.
In practical terms, latency is the time between a user accessing a file or service and the server's response. This latency can be very long when the server is far from the user or the network is congested.
A Content Delivery Network reduces the distance by routing requests to the closest server and provides greater network bandwidth. Both factors combine to reduce latency significantly.
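The "closest server" idea can be sketched as picking the edge with the smallest great-circle distance to the user. Real CDNs use anycast routing and live network measurements rather than pure geography, so this is a simplification, and the edge locations below are made-up examples:

```python
import math

# Simplified CDN edge selection: nearest server by great-circle distance.
# Edge locations are illustrative (latitude, longitude) pairs.
EDGES = {
    "frankfurt": (50.11, 8.68),
    "virginia": (38.95, -77.45),
    "singapore": (1.35, 103.82),
}

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def closest_edge(user_location):
    """Pick the edge server geographically nearest to the user."""
    return min(EDGES, key=lambda name: haversine_km(user_location, EDGES[name]))

print(closest_edge((48.85, 2.35)))  # a user in Paris -> frankfurt
```

Shorter physical distance means fewer network hops and less propagation delay, which is exactly the latency reduction the CDN provides.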
Also read: What is a Video CDN?
Overall, a CDN yields a better viewing experience for the user, with snappy interactions, fast loading, and little delay between clicking and getting results.
Teyuto provides a seamless low-latency streaming solution with HLS encryption, signed URLs, and DRM delivery.
Teyuto is an excellent streaming service that provides powerful and cutting-edge features. Achieving ultra-low latency and satisfying viewers is simple with Teyuto's rich platform, which includes industry-leading HLS for robust and efficient streaming.
Live OTT delivery is gaining in popularity and usage, and an increasing number of major media companies are adding live streaming to their service menus to differentiate themselves from competitors in the OTT field. Media distributors can stand out by offering low-latency, high-quality video.
With so many low-latency streaming options, there is no one-size-fits-all solution for low-latency video delivery in every workflow. The best solution depends on the type of content you stream and the demands of your video workflow. To know more, book a consultation with our experts and get a personalized demo today.