Video latency is a tricky thing, and it can really make or break your viewing experience. Take, for example, the Super Bowl. This year, if you opt to livestream the game as opposed to watching on broadcast television, take our advice and stay off social media. As with any livestreamed event, you’ll likely encounter a delay, anywhere from 20 seconds to a full minute, and you don’t want your Twitter feed spoiling the action.
It’s a scenario that’s all too common for livestream viewers, and one that video compression and delivery engineers work hard to avoid. Here’s the thing: from an engineer’s point of view, the goal of livestreamed video is to strike a healthy balance between quality and delay. Push latency below a certain threshold, though, and video quality can suffer, ultimately impairing the viewing experience.
This, then, forces broadcasters to decide between two options: do I want to stream video at the lowest latency possible, or do I want to focus on delivering high-quality video? Such a dilemma puts intense pressure on broadcasters to find a suitable balance between these two elements and position themselves as leaders in today’s competitive market.
But with our latest product innovation, Wave, broadcasters don’t have to make that decision at all. Now, they can complement their standard HTTP offering with Wave and create a layered approach to low-delay HTTP streaming, resulting in video that’s not just objectively high quality, but delivered with ultra-low latency as well.
In this blog, we’ll discuss:
- Technical aspects of Wave and how it works
- Challenges Wave helps alleviate
- Why WebRTC data channels offer a highly viable alternative for low-latency video delivery
But first, let’s start with some background.
Ultra-Low Latency: An Industry Focus
Recently, the streaming industry has put a great deal of emphasis on low-latency protocols, targeting a sub-five-second, end-to-end delay comparable to a live television broadcast. Such low delays are critical, especially for broadcasters who want to provide the best possible viewing experience for streaming audiences. Achieving delays even lower than standard television also lets broadcasters tap new monetization avenues like spot betting, and it helps mitigate those pesky social media spoilers.
So, whether it’s a large-scale sporting event, online learning module, or interactive game application, low latency is key to keeping viewers tuned in and engaged.
However, current industry-standard, HTTP-based adaptive streaming protocols such as HLS and DASH make it challenging to reduce latency further while maintaining a healthy balance between video quality and delay. There are two good reasons for this:
- Segment-based delivery model
HTTP-based livestream technologies use a segment-based delivery model, meaning each media segment must be delivered in full before playback can begin. Since segment duration directly affects video delay, engineers reduce latency by splitting each segment into smaller, independently transmitted chunks, effectively creating a new format for packaging and delivering HTTP-based media: the Common Media Application Format (CMAF) with low-latency chunking (see the sketch after this list).
- Potential for head-of-line blocking
With CMAF, delivery still relies on the Transmission Control Protocol (TCP), whose guaranteed in-order delivery creates potential for head-of-line blocking: a performance-limiting phenomenon in which a single lost or delayed packet holds up everything queued behind it, whether in buffered network switches, out-of-order deliveries, or pipelined HTTP requests.
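To make chunked delivery a little more concrete, here’s a minimal sketch, in TypeScript, of a player reading CMAF chunks progressively as the server flushes them rather than waiting for the full segment. The segment URL and callback are hypothetical placeholders, not part of any specific player.

```typescript
// Minimal sketch: consume a low-latency CMAF segment chunk by chunk as it arrives.
// The segment URL and the onChunk callback are hypothetical placeholders.
async function readChunkedSegment(
  segmentUrl: string,
  onChunk: (chunk: Uint8Array) => void
): Promise<void> {
  const response = await fetch(segmentUrl);
  const reader = response.body!.getReader();

  // Each read() resolves as soon as the server flushes another CMAF chunk,
  // so media can be appended to the player before the whole segment
  // finishes downloading.
  while (true) {
    const { done, value } = await reader.read();
    if (done || !value) break;
    onChunk(value);
  }
}
```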
Considering these facts, it’s clear that using HTTP-based streaming protocols to achieve sub-two-second, end-to-end delivery while maintaining top-notch quality is an ambitious effort, and one that doesn’t always deliver the results broadcasters aim for.
The Shift from HTTP to QUIC Protocol
Because traditional HTTP-based streaming protocols are prone to head-of-line blocking, and therefore need relatively large buffers to smooth over the resulting latency, engineers have switched gears and begun experimenting with new low-latency transport protocols, most notably QUIC.
Originally developed by Google and standardized by the Internet Engineering Task Force (IETF), QUIC is a general-purpose transport-layer network protocol, built on UDP, that improves the performance of connection-oriented web applications currently relying on TCP. With QUIC, engineers can eliminate head-of-line blocking and facilitate improved congestion control, prioritized delivery, and multiplexing, all critical components for driving ultra-low latency.
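As a rough illustration of the stream multiplexing QUIC enables, here’s a minimal browser-side sketch using the WebTransport API, which runs on top of HTTP/3 and QUIC. The endpoint URL is hypothetical, and this isn’t part of Wave’s delivery path; it simply shows that each incoming stream can be read independently, so a lost packet on one stream doesn’t stall the others.

```typescript
// Minimal sketch of QUIC stream multiplexing via the browser WebTransport API.
// The endpoint is hypothetical; error handling and media parsing are omitted.
const transport = new WebTransport("https://edge.example.com:4433/media");
await transport.ready;

// Each incoming unidirectional stream is delivered independently: a lost packet
// on one stream does not block the others, unlike a single TCP connection.
const incoming = transport.incomingUnidirectionalStreams.getReader();
while (true) {
  const { done, value: stream } = await incoming.read();
  if (done || !stream) break;
  const reader = stream.getReader();
  // ...read and process this stream's data independently of the others
}
```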
Following the standardization of QUIC, the IETF created a working group to study its use for large-scale media transmission, including one-to-one, one-to-many, and many-to-one applications. The group reached an important conclusion: while QUIC outperformed its predecessors in most cases, the same could not be verified against existing adaptive streaming methods that have been highly tuned for HTTP/1.1 and HTTP/2 running on top of TCP. In other words, QUIC still needs a mature, custom application-layer protocol to deliver the benefits that matter most for large-scale media transmission: ultra-low latency and high video quality. That missing application layer, combined with limited network compatibility, is what makes practical deployments a tricky task for video engineers.
WebRTC Data Channels and the Innovation of Wave
Realizing the challenges of driving low latency via HTTP-based and QUIC protocols, our expert MediaKind engineers knew they had to find a new approach: a way to deliver high-quality video at a sub-two-second delay.
As a free, open-source project, WebRTC (short for Web Real-Time Communication) enables web applications and browsers to capture and stream audio and video, as well as exchange arbitrary data between browsers without the need for an intermediary. Since it supports audio and video streaming at ultra-low delay, WebRTC has become a popular tool for video conferencing applications, where low latency is an absolute must for a fluid user experience.
However, for big events like sports broadcasts, WebRTC doesn’t pack the same punch, as it only supports basic compression and offers zero content protection. That’s when our engineers knew they had to find an alternative way to leverage WebRTC in order to drive ultra-low latency and quality video for large-scale livestream broadcasts. They found that answer in WebRTC data channels.
WebRTC data channels can carry arbitrary binary data, which makes it possible to transport video encoded with broadcast-grade encoders. Because the channel is agnostic to what it carries, any codec at any profile can be used, and valuable content can still be protected with industry-standard DRM methods.
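To illustrate the idea (this is a minimal sketch, not MediaKind’s Wave implementation), here’s how a browser could receive fragmented MP4 over a WebRTC data channel and feed it to a Media Source Extensions buffer for playback. The channel label and codec string are hypothetical, and the offer/answer signaling is assumed to happen elsewhere.

```typescript
// Minimal sketch: play broadcast-encoded video received over a WebRTC data channel.
// The offer/answer exchange with the server is not shown here.
const pc = new RTCPeerConnection();
const channel = pc.createDataChannel("media", { ordered: true });
channel.binaryType = "arraybuffer";

const video = document.querySelector("video") as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  // Codec string is an example; the data channel itself is codec-agnostic.
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f, mp4a.40.2"');
  const queue: ArrayBuffer[] = [];

  // Append queued media whenever the SourceBuffer is free.
  const pump = () => {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift()!);
    }
  };
  sourceBuffer.addEventListener("updateend", pump);

  // Each message is a fragment of CMAF/fMP4 media produced by a broadcast-grade encoder.
  channel.onmessage = (event: MessageEvent<ArrayBuffer>) => {
    queue.push(event.data);
    pump();
  };
});
```

In practice, encrypted content would be handled through the browser’s Encrypted Media Extensions alongside MSE, which is how industry-standard DRM typically plugs into web playback.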
In making this discovery, our engineers were able to develop the essential foundation for MediaKind’s newest streaming solution, Wave.
Wave establishes a new benchmark in ultra-low-latency, large-scale streaming and offers a highly viable solution for time-critical content delivery applications. With Wave, engineers can encode and package multiple streams at a single point, distribute those packaged streams to edge servers, and complete a unicast delivery to each viewer through WebRTC data channels. What really makes Wave innovative, though, is its ability to deliver ultra-low latency at scale: by combining multicast technology with WebRTC data channels, it achieves low-latency transmission over UDP and reduces player buffering. It’s a significant upgrade from legacy HTTP-based streaming, which is still retained for time shift and backward compatibility.
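For a sense of how a player might attach to such an edge server, here’s a minimal, hypothetical signaling sketch in which the browser exchanges SDP with the edge over plain HTTPS and then receives media on a data channel. The endpoint, channel label, and exchange format are assumptions for illustration only, not Wave’s actual API.

```typescript
// Hypothetical sketch: set up a WebRTC data channel to an edge server by
// exchanging SDP over plain HTTPS. Not MediaKind's actual signaling API.
// (ICE candidate handling and error handling are omitted for brevity.)
async function connectToEdge(edgeUrl: string): Promise<RTCDataChannel> {
  const pc = new RTCPeerConnection();
  const channel = pc.createDataChannel("wave-media", { ordered: true });
  channel.binaryType = "arraybuffer";

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Post the offer to the (hypothetical) edge endpoint and apply its answer.
  const response = await fetch(edgeUrl, {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: pc.localDescription!.sdp,
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });

  // Resolve once the data channel is open and media can start flowing.
  return new Promise((resolve) => {
    channel.onopen = () => resolve(channel);
  });
}
```

Keeping the session setup on ordinary HTTPS is one way such an approach can stay compatible with existing web infrastructure, in line with the compatibility point below.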
“What’s really innovative is we offer Wave through WebRTC data channels, making it highly compatible with existing systems,” says Nelson Francisco, Principal Video Compression Engineer here at MediaKind. “Using traditional encryption methods, content can be delivered with high quality and low delay, while still being protected and secure.”
Take a look at some of the additional benefits Wave technology brings to the table:
- New immersive experiences and monetization models for large-scale events, including sports
- Expanded adaptive bitrate (ABR) support for ultra-low-latency streams
- Instant channel change for seamless viewing
- Multicast technology to stream targeted ads to millions of simultaneous live event viewers
- Use of existing DRM and efficient delivery protocols, enabling secure, large-scale live distribution
Also, MediaKind’s Wave technology isn’t codec-dependent, so it allows broadcasters to deliver any kind of video with ultra-low delay.
But most importantly, Wave has the ability to achieve broadcast-quality viewing with industry-standard encoders and encryption, resulting in latency as low as two seconds. And according to Francisco, “That’s where the big innovation lies.”
Catch the Wave at ACM Mile-High Video Conference
Want to learn more about the WebRTC protocol and our ultra-low-latency streaming solution, Wave? Check us out at this year’s ACM Mile-High Video Conference, taking place February 11th through the 14th in Denver, CO. MediaKind’s own Nelson Francisco will be on hand to talk all things Wave and the innovation of ultra-low latency delivery.
Planning on attending the event? Schedule a meet-up with one of our MediaKind experts to discuss your video streaming needs.