HLS Latency and Its Impact on Live Broadcasting

Latency has always been a critical factor in live streaming. It’s the invisible but impactful delay between a live event and its playback for viewers. With the rise of HTTP Live Streaming (HLS) as a widely adopted protocol, broadcasters have had to contend with its inherent latency challenges. This article delves into the issue of HLS latency, its implications for broadcasting, and how it can be managed effectively without compromising the audience’s viewing experience.

Why Latency Matters in Broadcasting

Latency isn’t just a technical detail; it affects how audiences perceive and engage with content. For instance:

  • Live TV Broadcasts: A 10-second delay might not matter much when watching a morning news segment.
  • Sports Broadcasting: Even a few seconds of delay can frustrate fans following live scores.
  • Interactive Content: Real-time interaction, such as Q&A sessions or live auctions, requires ultra-low latency for smooth experiences.

One broadcaster shared an experience of covering a marathon event. While the live stream had a noticeable delay due to HLS, the focus on high-quality visuals and compatibility across devices outweighed the latency issue. The audience valued the clarity and accessibility of the stream more than its timing.

Understanding HLS Latency

HLS operates by breaking video streams into small segments, typically 6 to 10 seconds long, which are then buffered and played back. This segmentation, combined with the need to download and decode multiple chunks, introduces latency. A standard HLS stream can have a delay ranging from 10 to 30 seconds.

This delay is further compounded by the following factors; a rough estimate of their combined effect follows the list:

  1. Network Conditions: Slow or unstable connections can exacerbate latency.
  2. Buffering Requirements: Longer buffer times ensure smooth playback but increase delays.
  3. Encoding and Transcoding: The time taken to process video at the server end adds to the overall delay.
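To make the combined effect concrete, here is a minimal back-of-the-envelope sketch in Python. The encode and delivery figures are illustrative assumptions, and the three-segment buffer reflects the common rule of thumb that players wait for several full segments before starting playback:

```python
def estimate_hls_latency(segment_seconds, buffered_segments=3,
                         encode_seconds=2.0, delivery_seconds=1.0):
    """Rough glass-to-glass latency estimate for a standard HLS stream.

    The player typically waits for several full segments before it starts
    playing, so segment duration dominates the total delay.
    """
    return encode_seconds + segment_seconds * buffered_segments + delivery_seconds

# With typical 6-second segments: roughly 2 + 18 + 1 = 21 seconds behind live.
print(estimate_hls_latency(6))   # 21.0
# With 10-second segments: roughly 2 + 30 + 1 = 33 seconds behind live.
print(estimate_hls_latency(10))  # 33.0
```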

The Role of RTMP in Mitigating Latency

Real-Time Messaging Protocol (RTMP) complements HLS by providing a low-latency path for ingesting live video feeds. Servers such as NGINX (with the RTMP module) or Wowza Streaming Engine can accept live streams over RTMP, which are then repackaged into HLS for distribution.
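As a rough sketch of that ingest-and-repackage step, the snippet below shells out to ffmpeg (one common choice, not prescribed here) to pull an RTMP feed and write HLS segments. The ingest URL, output path, and segment length are placeholder assumptions:

```python
import subprocess

# Hypothetical RTMP ingest URL and HLS output path; adjust for your own server.
RTMP_INGEST = "rtmp://localhost/live/stream"
HLS_PLAYLIST = "/var/www/hls/stream.m3u8"

# Copy the incoming audio/video as-is and repackage it into 4-second HLS
# segments, keeping a short sliding window of segments on disk.
subprocess.run([
    "ffmpeg",
    "-i", RTMP_INGEST,
    "-c", "copy",
    "-f", "hls",
    "-hls_time", "4",
    "-hls_list_size", "6",
    "-hls_flags", "delete_segments",
    HLS_PLAYLIST,
], check=True)
```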

For example, a TV station in Europe uses RTMP to collect live feeds from correspondents in the field. These feeds are processed into HLS, enabling broad device compatibility without sacrificing the immediacy required during production.

Comparing HLS and RTMP

Feature         | HLS                  | RTMP
----------------|----------------------|---------------------------
Latency         | High (10-30 seconds) | Low (1-5 seconds)
Browser Support | Universal            | Limited (no Flash support)
Use Cases       | Playback             | Ingestion
Scalability     | High                 | Moderate

Strategies to Reduce HLS Latency

Reducing HLS latency requires a combination of technical adjustments and workflow optimization:

  1. Low-Latency HLS (LL-HLS): This enhanced version of HLS reduces latency by publishing partial segments before a full segment has finished encoding.
  2. Smaller Segment Sizes: Reducing segment length from 6-10 seconds to 2-4 seconds can lower latency significantly; the sketch after this list shows the effect.
  3. Optimized Encoding: Faster encoding reduces the time required to prepare video for streaming.
  4. CDN Configuration: Using edge servers closer to the audience minimizes delivery delays.

A broadcaster once reduced their HLS latency from 15 seconds to 5 seconds by adopting LL-HLS and fine-tuning their encoding process, improving viewer satisfaction without compromising video quality.

When HLS Latency is Acceptable

Despite its latency, HLS remains the go-to protocol for many broadcasters because of its reliability and compatibility. Situations where latency is less critical include:

  • Television Broadcasts: Where synchronization with live events is not essential.
  • Corporate Events: Where the focus is on accessibility and quality rather than real-time interaction.
  • Live Performances: Concerts and shows benefit from HLS’s adaptive streaming capabilities without being hindered by slight delays.

Future of HLS and Low-Latency Streaming

Emerging technologies like low-latency HLS and WebRTC aim to close the gap between live ingestion and playback. While WebRTC is gaining traction for ultra-low latency applications, HLS remains relevant for its scalability and device compatibility. Innovations in hybrid workflows combining RTMP, HLS, and WebRTC promise to deliver the best of all worlds.

Conclusion

Latency in HLS can be a challenge, but it doesn’t have to be a dealbreaker. By understanding its limitations and implementing strategies to reduce delays, broadcasters can continue to deliver high-quality streams that meet the needs of diverse audiences. RTMP’s role in enabling low-latency ingestion ensures that HLS remains a practical solution for modern live streaming workflows.


Hashtags: #HLSLatency #LiveStreamingSolutions #RTMPToHLS #BroadcastTech #StreamingChallenges

RTMP Server in the era of HTTP video streaming

Given the growing popularity and support of HTTP video streaming, it may be tempting to consider Real Time Messaging Protocol (RTMP) streaming obsolete. But in many cases, working with an RTMP server still makes sense. When Macromedia first introduced RTMP with Flash Player 6 in 2002, the brand-new Macromedia Flash Communication Server MX (FCS) was required to stream the emerging Flash Video (FLV) format, whether the stream was live or VOD.

However, FCS licensing costs were high (up to $5,000 per server), and as a result CDN delivery costs were higher than for other proprietary streaming formats. Some in the industry have called RTMP streaming costs a “Flash tax”. Still, Flash Player enjoyed near-ubiquitous penetration in desktop browsers for well over a decade, far greater than any other plug-in. Even so, streaming server technology has traditionally been harder for web developers to implement, and plain HTTP delivery has usually been easier and more cost-effective. In 2003, Macromedia enabled Flash Player 7 to support HTTP delivery of FLV files, allowing integrators to use standard web server technology to deploy online video.

So in many ways, RTMP as a VOD delivery transport has not been a requirement for web video for more than a decade. As a video solutions architect, I treat business requirements as the driver of most audio/video decisions. And still, in 2014, RTMP is the de facto standard for the following use cases:

Publishing live streams from software/hardware encoders: Almost all streaming devices support RTMP to publish to CDN providers and streaming servers. Some native mobile applications also use RTMP libraries to publish live video from their mobile camera.

Near-instant seeking/playback: One of the benefits of RTMP streaming is its enhanced seeking capability. With real-time streaming, the player can seek to any point in the video with less buffering than HTTP delivery. However, to enjoy this feature you need Flash Player on the desktop, so it is not available in mobile browsers. HTML5 browsers use HTTP range requests to enable faster seeking within VOD files.

Content Protection: An RTMP server can facilitate different levels of content protection, from obfuscation to true DRM. RTMP streams are not cached to disk when played back in a desktop browser.

Adaptive Streaming: One of the preferred uses of RTMP is adaptive streaming playback, where the video player is offered multiple bitrates and resolutions and plays the best rendition for the current network speed. Some HTTP adaptive streaming technologies, such as HLS and MPEG-DASH, allow similar delivery, but RTMP can be more responsive when switching from one bitrate to another. Unfortunately, there is no standard for HTTP adaptive streaming across HTML5 video-enabled browsers, so Flash-based playback is still a requirement for adaptive streaming in the browser.
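For readers who have not seen how HTTP adaptive streaming advertises multiple renditions, the sketch below assembles a minimal HLS master playlist in Python; the bandwidths, resolutions, and variant playlist paths are purely illustrative:

```python
# Illustrative renditions: (bandwidth in bits/s, resolution, variant playlist path).
RENDITIONS = [
    (800_000,   "640x360",   "low/index.m3u8"),
    (2_000_000, "1280x720",  "mid/index.m3u8"),
    (5_000_000, "1920x1080", "high/index.m3u8"),
]

lines = ["#EXTM3U"]
for bandwidth, resolution, uri in RENDITIONS:
    lines.append(f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={resolution}")
    lines.append(uri)

# The player reads this master playlist once, then switches between the listed
# variant playlists as the measured network throughput changes.
print("\n".join(lines))
```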

Live Streaming Playback: While live streams do not have to be adaptive, the same playback principles apply to all live streaming. For desktop playback within the browser, virtually every live streaming event requires a video player that supports Flash-based rendering and RTMP playback. Apple Safari on iOS natively supports HLS, and luckily all modern streaming servers, including the Wowza Streaming Engine, support both RTMP and HLS packaging.

The requirement for a Flash-based live streaming player will only change if MPEG-DASH becomes as ubiquitous as Flash is today. If Internet Explorer 8, 9, or 10 must be supported for live streaming, then RTMP streaming is still required. Also, if you need near-zero delay in a live stream, HTTP streaming will almost certainly not meet your needs while RTMP can: HTTP mechanisms require that multiple segments be collected on the server before being sent to the video player.

In summary, if your video workflow involves live streaming or any kind of packaged video deployment, RTMP is the key to a successful video experience. HTTP delivery has already replaced most Flash VOD deployments, but HTTP video can't handle all RTMP use cases.

Python Hosting

Another important offering alongside a Red5 server is Python hosting, available to all hosting customers. With Python cPanel hosting, the advantage is that it is cheaper and easier to install and run Python scripts on ordinary hosting. Combining Python hosting with an RTMP server, a customer can really get started as a webmaster and cover a wide range of uses, from hosting Python scripts to running video chat or live streaming.

As my university professor used to say, the computing language for the next 50 years will be Python: fast, easy, and with enough modules that any developer can have an application up and running in a few short hours.

Which is better for live streaming, RTMP vs HLS vs WebRTC?

It’s hard to say which one is better, as we’re not comparing apples to apples. Let’s break it down into the strengths and weaknesses of each method.

RTMP used to be the de facto standard for live streaming. Many CDNs offer delivery that scales to the masses. RTMP, however, does not take into consideration that the broadcaster and viewers might have internet connections that aren’t always up to the task of transferring the stream at full speed. Smartphones and web browsers are also unable to play back RTMP natively; browsers used to depend on a Flash plugin for playback, which has been phased out over the last couple of years.

HLS was created to use existing HTTP CDNs for delivering live streams. It scales very well, but latency can suffer in many implementations. Bitrate can adapt to the viewer’s needs. Most devices can play HLS natively, or through a JavaScript player. This is probably the most cost-effective way of delivering video.

WebRTC is more focused on one-to-one streaming. Bitrate is adaptable, but not many CDNs support edge delivery over WebRTC, and those that do cost quite a bit more than other solutions. WebRTC can be played back in most browsers and smartphones today, using JavaScript players. Open source solutions for native playback and broadcasting are also available.