Low Latency Streaming For Business: 3 Case Studies

Low latency streaming is the process of transmitting live audio, video, or other data over the internet with minimal delay. It's crucial for interactive applications, such as gaming, video conferencing, and live broadcasting, where real-time interaction is key.
October 15, 2023

Low latency is vital to live video streaming use cases such as second-screen experiences, live reporting, and online video games, where it ensures the best possible user experience.

Here's a big secret: when it comes to media, "live" rarely really means "live". Say you're at home watching a live concert and you see an audience member jump onstage. The audience at the venue saw it happen at least 30 seconds before you did.

This is because it takes time to move chunks of data (the pieces of information used in many multimedia formats) from one place to another. The delay between the camera capturing video and your screen displaying it is called latency.

What is low latency?

So, what is low latency if several seconds of latency is considered normal? It's a subjective term. By default, the widely used Apple HLS streaming protocol has a latency of 30-45 seconds.

When people talk about low latency, they usually mean getting the delay down to single-digit seconds. However, the term also encompasses what is often called real-time streaming (we're talking milliseconds here).

Also read: What are streaming protocols and how do they work?

When is low latency important?

No one wants noticeably high latency, of course, but in what contexts does low latency really matter?

The typical 30-45 second delay is manageable for most streaming scenarios. Returning to our concert example, it's irrelevant if the guitarist broke a string 36 seconds ago and you just found out.

But for some streaming use cases, latency is a critical business consideration. For instance, Amazon found that users' purchases dropped by 1% for every additional 100 milliseconds of waiting.  

Similarly, according to Google's calculations, slowing down search results by just four-tenths of a second could cost them 8 million searches per day.

Let's look at some streaming use cases where low latency is undeniably essential.

Second-Screen Experiences

A second-screen experience is the simultaneous consumption of television and the internet: watching a TV program or commercial while using a smartphone or tablet app to interact with the content (opinions, polls, and so on).

If you're watching an event on TV while following it in a second-screen app, any latency mismatch between the two is immediately obvious, and it's uncomfortable.

Imagine that a sports channel offers a second-screen application so that you can see alternate camera angles and exchange comments with other users. The game's winning score is shown on the TV but doesn't reach the app until a minute later. By then, the moment for trading reactions in the app has passed.

However, the sweet spot here isn't the ultra-low "real-time" latency we'll discuss next, because the television broadcast has latency of its own. If you're watching on digital cable, as most households do, the transmission latency can be up to six seconds. Your second-screen app only needs to match that level of latency to deliver a fantastic experience in sync with your TV content.

Video Chat

This is where ultra-low latency live streaming comes into play. We've all seen televised interviews where the reporter is talking to someone at a remote location. The latency in the exchange of messages results in long pauses, sometimes with the two parties talking over each other.


This is because latency cuts both ways: it may take a second for the reporter's question to reach the respondent and another second for the respondent's answer to travel back to the reporter.

This conversation can quickly become uncomfortable. When prompt responses are important, the acceptable limit is about 150 milliseconds of latency in each direction. This time frame is short enough for smooth conversation without awkward pauses.
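To make that budget concrete, here is a minimal browser-side sketch in TypeScript that measures round-trip time to an echo endpoint and checks it against a 150-milliseconds-per-direction budget. The endpoint URL is a hypothetical placeholder, and the sketch assumes the server simply echoes messages back.

```typescript
// Minimal sketch: measure round-trip time over a WebSocket and compare it to
// a ~300 ms round-trip budget (150 ms in each direction).
// "wss://echo.example.com/ws" is a hypothetical echo endpoint, not a real service.

const ROUND_TRIP_BUDGET_MS = 300; // 150 ms per direction

function measureRoundTrip(url: string): Promise<number> {
  return new Promise((resolve, reject) => {
    const socket = new WebSocket(url);
    socket.onerror = () => reject(new Error("WebSocket connection failed"));
    socket.onopen = () => {
      const sentAt = performance.now();
      socket.onmessage = () => {
        resolve(performance.now() - sentAt); // echo received: one full round trip
        socket.close();
      };
      socket.send("ping");
    };
  });
}

measureRoundTrip("wss://echo.example.com/ws").then((rtt) => {
  const verdict = rtt <= ROUND_TRIP_BUDGET_MS ? "within budget" : "too slow for smooth conversation";
  console.log(`Round trip: ${rtt.toFixed(1)} ms (${verdict})`);
});
```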

Bets and Bids

Activities like auctions and sports betting are exciting because of their fast pace, and that speed requires real-time streaming. For example, horse racing tracks have traditionally shared their feeds via satellite with tracks around the world while letting viewers place bets online.

Satellite delays can be costly. Ultra-low latency streaming eliminates these troublesome delays and reduces dropouts. Likewise, online auctions are big business, and any delay could mean bids aren't recorded in time. Fractions of a second make all the difference.

Online Video Games


Anyone who has ever screamed, “This game is cheating!” at a screen knows that timing is critical for players. A latency of less than 100 milliseconds is mandatory. No one wants to use a streaming service only to discover they're shooting at enemies that aren't there anymore.

How does low-latency streaming work?

Now that you know what low latency is and when it matters, you're probably wondering: how do you provide low latency streaming? As with most things in life, low-latency streaming involves trade-offs.

You will have to balance three factors to find the right mix:

  • Encoding protocol and compatibility between device and player
  • Audience size and geographic distribution
  • Video resolution and complexity

The streaming protocol you choose makes a big difference. Let's analyze this:

Apple HLS is among the most widely used streaming protocols thanks to its reliability, but it is unsuitable for true low-latency streaming. This is because HLS is an HTTP-based protocol that transmits chunks of data: the video is split into small “chunks” that the player buffers to ensure smooth playback.


This means that, with a 6-second chunk, at least 6 seconds of video must be generated, encoded, transmitted, decoded, and buffered on the viewer’s video player before it can play, so there will be a latency of “at least 6 seconds” in this case. Since each chunk must be completed before it can be viewed, chunk size plays a vital role in latency.

Apple HLS's original default chunk duration is 10 seconds, which can lead to latency of up to 45 seconds. Customization can reduce this significantly, but not enough for an ultra-low latency scenario. Exacerbating the problem, the smaller you make these chunks, the more buffering your viewers will experience (since less video is held in the player's buffer).
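To see how chunk size drives latency, here is a back-of-the-envelope sketch in TypeScript. The assumption that players buffer roughly three chunks before starting playback, and the few seconds of encode-and-delivery overhead, are illustrative figures rather than Apple's specification.

```typescript
// Back-of-the-envelope sketch: estimate HLS latency from chunk duration and
// how many chunks the player buffers before starting playback.
// The three-chunk buffer and the overhead figure are illustrative assumptions.

function estimateHlsLatencySeconds(
  chunkDurationSec: number,
  bufferedChunks: number,
  encodeAndDeliveryOverheadSec: number,
): number {
  // Each buffered chunk must be fully produced before playback can begin,
  // so latency grows roughly linearly with chunk duration.
  return chunkDurationSec * bufferedChunks + encodeAndDeliveryOverheadSec;
}

console.log(estimateHlsLatencySeconds(10, 3, 5)); // ~35 s with 10-second chunks
console.log(estimateHlsLatencySeconds(2, 3, 5));  // ~11 s with 2-second chunks
```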

RTMP and WebRTC are the standard choices for low-latency streaming:

  • RTMP offers good low-latency delivery but requires a Flash-based player for playback, and Flash is no longer supported by web browsers (today RTMP survives mainly as an ingest protocol).
  • WebRTC is the standard deployed across browsers and platforms, allowing low-latency delivery in an HTML5-based, non-Flash environment. WebRTC, however, prioritizes speed over reliability and can drop data under poor network conditions, which impacts quality (a minimal publishing sketch follows this list).
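
For reference, here is a minimal sketch of how a browser-side WebRTC publish typically starts, using the standard browser APIs. The STUN server shown is a well-known public one, and the signaling exchange that every real deployment needs (sending the offer to the remote peer and applying its answer) is deliberately left out.

```typescript
// Minimal sketch: publish a camera feed over WebRTC from a browser.
// Signaling (exchanging the offer/answer and ICE candidates with the remote
// peer) is application-specific and is only described in comments here.

async function startWebRtcPublish(): Promise<RTCPeerConnection> {
  const peer = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // public STUN server
  });

  // Capture camera and microphone, then attach each track to the connection.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  stream.getTracks().forEach((track) => peer.addTrack(track, stream));

  // Create and apply the local offer; in a real app you would send this offer
  // to the remote peer over your own signaling channel and apply its answer.
  const offer = await peer.createOffer();
  await peer.setLocalDescription(offer);

  return peer;
}

startWebRtcPublish().catch((err) => console.error("WebRTC setup failed:", err));
```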

Another important consideration is your streaming server. You'll need streaming technology that gives you fine-grained control over latency and video quality, along with as much flexibility as possible.

Case Studies on Low Latency Streaming for Business

Case Study 1: ESPN

ESPN is a prime example of a business that has embraced low-latency streaming to improve customer engagement. With the rise of online streaming services, traditional cable TV networks like ESPN have had to adapt to keep up with changing consumer habits.

To stay competitive, ESPN introduced low latency streaming for their live sports events, allowing viewers to watch games in real-time with minimal delay. This provided a more immersive and engaging experience for viewers, who could interact with the game and each other in real-time.

The result was a significant increase in engagement and viewership for ESPN. By providing a seamless streaming experience, ESPN was able to keep its customers engaged and interested in its content, ultimately driving revenue and growth for the business.

Case Study 2: Zoom

Zoom is another business that has embraced low-latency streaming, in its case to enhance its video conferencing services. With the rise of remote work, video conferencing has become essential for businesses to communicate with their employees and customers.

However, traditional video conferencing technologies can suffer from significant latency, resulting in delays and poor-quality calls. To address this issue, Zoom introduced low latency streaming for its video conferencing services, providing users with a more seamless and immersive experience.

The result was a significant increase in user adoption and satisfaction with Zoom. By providing a real-time and high-quality video conferencing experience, Zoom increased productivity and reduced the need for in-person meetings, ultimately driving growth for the business.

Case Study 3: Amazon Prime Video

Amazon Prime Video is a business that integrated low latency streaming to improve customer engagement and retention. With the rise of streaming services like Netflix and Hulu, Amazon Prime Video has had to compete to retain its customers and attract new ones.

Amazon Prime Video introduced low latency streaming for their live streaming services to stay competitive, allowing viewers to watch events in real-time with minimal delay. This provided a more immersive and engaging experience for viewers, who could interact with the event and each other in real-time.

The result was a significant increase in engagement and retention for Amazon Prime Video. By providing a seamless and immersive streaming experience, Amazon Prime Video kept its customers engaged and interested in its content, ultimately driving revenue and growth for the business.

How to reduce latency? 

Even in the cloud computing era, server latency remains a problem for many companies. To keep the issue from growing and hurting the business, it's worth paying attention to a few tips.


1. Review the communication infrastructure

To send data, whether images, music, videos, or documents, you need a good communication infrastructure; in this case, the internet.

This infrastructure comprises devices and constraints such as routers, cables, and available bandwidth. Since auditing it can be a complicated task, it's usually best to bring in a qualified IT professional.

2. Know the type of latency that is disturbing the connection

In addition to finding out where the latency occurs, it is essential to identify its type, as the problem can have several causes.

In some cases the issue can be resolved quickly; in others, a specialist is essential to pinpoint the problem the connection is facing.

3. Count on automatic scaling

A server typically slows down when many users make requests simultaneously; a classic example is an e-commerce site during events like Black Friday.

To avoid slowdowns or crashes when simultaneous accesses spike, the ideal solution is a service that scales automatically, increasing bandwidth and capacity in response to a sudden rise in demand.

4. Implement distribution networks

CDN technology is an excellent ally in reducing the latency of web applications.

A CDN stores copies of the data in geographically distributed locations, connecting each user to the closest possible server and speeding up data transfer.

Some latency is inevitable, but with the right techniques and tools you can reduce it in your online services and improve the quality of the customer experience.

Using a CDN and Reducing Latency

In practical terms, latency is the time between a user accessing a file or service and the server's response. This latency can be very long when the server is far from the user or the network is congested.

A Content Delivery Network shortens that distance by routing each request to the closest server and providing greater network bandwidth. Both factors combine to reduce latency significantly.
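
As a quick way to see the effect for yourself, the sketch below times the same request against an origin server and a CDN edge using the browser's fetch API. The hostnames and the playlist path are placeholders for your own endpoints, not real URLs.

```typescript
// Minimal sketch: compare response latency for the same asset served from a
// distant origin and from a nearby CDN edge. Both URLs are hypothetical
// placeholders for your own origin and CDN hostnames.

async function timeRequest(url: string): Promise<number> {
  const start = performance.now();
  await fetch(url, { cache: "no-store" }); // bypass the browser cache
  return performance.now() - start;
}

async function compareOriginVsCdn(): Promise<void> {
  const originMs = await timeRequest("https://origin.example.com/playlist.m3u8");
  const edgeMs = await timeRequest("https://cdn.example.com/playlist.m3u8");
  console.log(`Origin: ${originMs.toFixed(0)} ms, CDN edge: ${edgeMs.toFixed(0)} ms`);
}

compareOriginVsCdn().catch((err) => console.error("Latency check failed:", err));
```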

Also read: What is a Video CDN?

Overall, a CDN yields a better viewing experience for the user: fast loading, snappy interactions, and little delay between clicking and getting results.

Teyuto Offers Live Streaming Solutions Without Delay

Teyuto provides a seamless, low latency streaming solution with HLS encryption, signed URLs, and DRM delivery. It offers the following features:

  • White-label Streaming
  • Monetization
  • Optimum Security Features
  • Video API and Multi Bitrate Streaming 

Teyuto is an excellent streaming service that provides powerful and cutting-edge features. Achieving ultra-low latency and satisfying viewers is simple with Teyuto's rich platform, which includes industry-leading HLS for robust and efficient streaming.

Summary

Live OTT delivery is gaining in popularity and usage, and an increasing number of major media companies are adding live streaming to their service menus to stand out in the OTT field. Offering low-latency, high-quality video is a key way for media distributors to differentiate their platforms.

With so many low-latency streaming options, there is no one-size-fits-all solution for low-latency video delivery in every workflow. The best solution depends on the type of content you stream and the demands of your video workflow. To know more, book a consultation with our experts and get a personalized demo today.
