The broadcast experience for live content, especially sports, has changed only subtly since the first TV transmissions in the late 1930s. The live linear spectacle with fixed camera positions has progressed from black and white, through color, to ultra-high definition (UHD) and high dynamic range (HDR). Innovations such as video assistant referees (VAR), sky-cams and even drones have added to the sports production arsenal in recent years, but the way a viewer watches a live match or event has traditionally been pre-ordained by the director’s gallery.
Unique, immersive and personalized media experiences are central to today’s consumer expectations. New technologies are being developed to augment and supplement regular broadcast and VOD offerings, many of them targeting second screens such as mobiles and tablets. This is starting to change the way we offer content to viewers. With applications at live events and delivery in synchronization with the standard live broadcast, 360-degree video is one way of meeting these demands. Extensive testing has already taken place, and commercial trials for paying customers are underway in some markets.
Yet delivering high-quality 360-degree video to second screens presents challenges in optimizing the use of last-mile bandwidth and working within the constraints of device processing capabilities. Last December, we made a pivotal breakthrough, working alongside several key partners to deliver the world’s first live multi-channel 6K tiled 360-degree stream of a basketball match in Germany directly to consumers. The project used viewport-adaptive 360-degree video streaming to deliver 6K x 3K video at 10-15 Mbps, a data rate that makes it suitable for a wide range of devices, from VR headsets to smart TVs. This achievement received widespread recognition at this year’s NAB Show, winning both the 2019 NAB Technology Innovation Award and the IABM BaM Award for best ‘Project, collaboration or event’.
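To make the idea behind viewport-adaptive tiled streaming more concrete, here is a minimal sketch of how a player might decide which tiles to fetch at high quality for the viewer’s current gaze direction. The tile grid, per-tile bitrates and field of view below are illustrative assumptions, not figures from our deployment.

```python
# Minimal sketch of viewport-adaptive tile selection for a tiled 360-degree stream.
# All numbers (tile grid, bitrates, field of view) are illustrative assumptions.

TILE_COLS, TILE_ROWS = 8, 4          # assumed tiling of the equirectangular frame
HIGH_KBPS, LOW_KBPS = 1500, 200      # assumed per-tile bitrates (high/low quality)
FOV_H, FOV_V = 110, 90               # assumed headset field of view in degrees


def tile_center(col, row):
    """Yaw/pitch of a tile's center on the equirectangular frame, in degrees."""
    yaw = (col + 0.5) / TILE_COLS * 360.0 - 180.0
    pitch = 90.0 - (row + 0.5) / TILE_ROWS * 180.0
    return yaw, pitch


def in_viewport(yaw, pitch, view_yaw, view_pitch, pad=15.0):
    """True if a tile center falls inside the viewport, with a safety margin."""
    d_yaw = (yaw - view_yaw + 180.0) % 360.0 - 180.0   # wrap-around yaw distance
    d_pitch = pitch - view_pitch
    return abs(d_yaw) <= FOV_H / 2 + pad and abs(d_pitch) <= FOV_V / 2 + pad


def select_tiles(view_yaw, view_pitch):
    """Request high-quality tiles inside the viewport, low-quality elsewhere."""
    plan, total_kbps = [], 0
    for row in range(TILE_ROWS):
        for col in range(TILE_COLS):
            yaw, pitch = tile_center(col, row)
            quality = "high" if in_viewport(yaw, pitch, view_yaw, view_pitch) else "low"
            plan.append((col, row, quality))
            total_kbps += HIGH_KBPS if quality == "high" else LOW_KBPS
    return plan, total_kbps


if __name__ == "__main__":
    # Viewer looking slightly left of center, level with the horizon.
    plan, kbps = select_tiles(view_yaw=-20.0, view_pitch=0.0)
    high = sum(1 for _, _, q in plan if q == "high")
    print(f"{high} high-quality tiles of {len(plan)}, ~{kbps / 1000:.1f} Mbps total")
```

The point of the technique is visible in the output: only the handful of tiles the viewer can actually see are delivered at full quality, which is how a 6K-class panorama can fit into a last-mile budget of a few tens of megabits per second.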
This proof of concept has rapidly evolved to form the foundation of our ‘solution for live events.’ This innovative new offering, which we launched at NAB Show 2019, provides a cloud-based workflow for live 360-degree video processing and multi-platform publishing, enabling consumers to become fully immersed and engaged in their favorite live content. This can be achieved through enhanced 360-degree video coverage, viewed either directly on a head-mounted device or on a multiscreen device as a companion to existing HD or UHD broadcast or streamed services.
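As a rough illustration of what such a cloud workflow involves, the sketch below lays out the stages from contribution feed to multi-platform publishing. The stage names, endpoints and formats are hypothetical and do not describe the actual product’s API.

```python
# Purely illustrative sketch of a live 360-degree workflow configuration;
# field names, endpoints and targets are hypothetical examples.
from dataclasses import dataclass, field


@dataclass
class WorkflowConfig:
    event_name: str
    ingest_url: str                      # contribution feed from the stitched 360 camera rig
    tile_grid: tuple = (8, 4)            # assumed tiling of the equirectangular frame
    renditions: list = field(default_factory=lambda: ["6Kx3K", "4Kx2K"])
    publish_targets: list = field(default_factory=lambda: [
        "hmd",             # head-mounted displays via viewport-adaptive streaming
        "mobile",          # companion second-screen apps
        "smart-tv",        # smart-TV apps alongside the HD/UHD broadcast
    ])


def describe(cfg: WorkflowConfig) -> str:
    """Summarize the processing and publishing chain for an operator view."""
    cols, rows = cfg.tile_grid
    return (
        f"{cfg.event_name}: ingest {cfg.ingest_url} -> "
        f"encode {cols}x{rows} tiles in {', '.join(cfg.renditions)} -> "
        f"publish to {', '.join(cfg.publish_targets)}"
    )


if __name__ == "__main__":
    cfg = WorkflowConfig(
        event_name="Basketball final",
        ingest_url="srt://ingest.example.com:9000",
    )
    print(describe(cfg))
```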
Live sport is a natural home for 360-degree video technology, with the added advantage that it can leverage existing camera systems and workflow technologies that are already widely deployed. In addition, the core platform is designed to serve the next generation of VR headsets, which promise a more comfortable, untethered experience, especially in combination with the arrival of 5G services.
On Wednesday June 19, I will be delivering two sessions at ConnecTechAsia covering the various methods of 360-degree video delivery, including ‘viewport adaptive’ streaming technology, and how these methods scale with the number of users and the capabilities of second-screen devices. I will also share our experiences of deploying an end-to-end live tile-based 360-degree video workflow in the cloud for live sports events and provide insights into future applications. If you’re attending the event, please join me at Suntec Singapore: in the Innovation Hub on Level 4 (13:00) and at the IABM Future Trends Theatre, TV Exchange, on the 6th floor (16:40). I look forward to seeing you in Singapore!