
By Raul Aldrey, Chief Product Officer, MediaKind

We at MediaKind have been driving the digital evolution of media for over 25 years. It’s amazing to see the pace at which this is accelerating, especially over the last 12 months. For Sports, the digital era is ushering in a new way for casual and super fans alike to access and deepen their engagement with their favorite sport.

The opportunities for creating value and sustainable growth for leagues, fans, and rights-holders are huge. Digital is disrupting the traditional model at every level: enabling more diverse access to core digital services, boosting sponsorship and content reach, strengthening the fan community, and enhancing the value of sports rights.

But there are challenges ahead in tackling this disruption; foundationally, there is a need to connect the components. We need to develop a robust workflow that can get content to the cloud and ensure that streams are delivered reliably, in the highest possible quality, and at scale.

Handling high volumes of sports streaming traffic

I was delighted to explore some of these themes during the recent Sports Pro Media OTT Summit USA event, alongside Chris Xiques, VP, Digital Video Infrastructure, ViacomCBS; Brandon Farley, SVP & Chief Revenue Officer, Streaming Global; and Jeff Gilbert, Consultant.

One of the early topics of conversation was the ability of CDNs to handle high volumes of live sports traffic. As I pointed out, it's important to differentiate between a large-scale event such as the Super Bowl, which is typically built to manage enormous capacity levels, and live pop-up events, whose moderate and predictable traffic patterns broadcasters can offset through capacity planning. In either instance, the key is to ensure all the necessary resources are in place to orchestrate and absorb spikes in traffic – from payment gateways and client applications to the surge of concurrent requests that arrives on demand.
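The capacity-planning side of this can be reduced to back-of-the-envelope arithmetic: peak egress is roughly concurrent viewers times average bitrate, plus a safety margin. A minimal sketch, with purely illustrative figures not drawn from any real event:

```python
# Back-of-the-envelope CDN capacity planning. The viewer count, bitrate,
# and headroom factor below are illustrative assumptions, not figures
# from any actual broadcast.

def peak_egress_gbps(viewers: int, avg_mbps: float, headroom: float = 1.5) -> float:
    """Estimate peak CDN egress in Gbps, with a safety headroom factor."""
    return viewers * avg_mbps * headroom / 1000

# A hypothetical 5M-viewer event at an average 5 Mbps per stream.
print(peak_egress_gbps(5_000_000, 5.0))  # -> 37500.0 Gbps of provisioned egress
```

For a predictable pop-up event, numbers like these can be provisioned ahead of time; for a Super Bowl-scale outlier, they explain why the platform is purpose-built around enormous capacity from the start.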

Of course, these resources occasionally fail. And as Jeff Gilbert put it so well – "is there a way to fail gracefully? And if so, how do you plan for that?" For me, it's about triaging the right elements when you reach the choking point. Constant decisions need to be made – do you maintain or degrade an experience? There are times when it's easier to reset the entire system, and others where you need to make snap decisions based on requests from the web player or the mobile experience. The best decisions are made when the broadcaster has full visibility of every live element in the chain. That way, it's possible to plan for the transfer of individual components, mitigate risk, and ensure the viewer receives a good quality of experience.
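One way to make that "maintain or degrade" decision concrete is a simple triage policy keyed to capacity headroom. A minimal sketch, assuming a hypothetical service that tracks utilization; the thresholds and action names are illustrative, not a production design:

```python
# A minimal "fail gracefully" triage sketch. The 0.8/0.95 thresholds and
# the three actions are assumptions for illustration only.

def triage(load: float, capacity: float) -> str:
    """Decide how to treat playback sessions as load nears the choking point."""
    utilization = load / capacity
    if utilization < 0.8:
        return "maintain"   # full ABR ladder, all features enabled
    if utilization < 0.95:
        return "degrade"    # cap the top bitrate, disable extras
    return "shed"           # queue or reject new sessions

# At 90% utilization we degrade the experience rather than fail outright.
print(triage(load=9_000, capacity=10_000))  # -> "degrade"
```

The point of the sketch is the panel's: with full visibility of the chain, degradation becomes a deliberate, per-component choice rather than an uncontrolled outage.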

What does the quality of experience mean today?

The first word that springs to mind is 'consistency' – something every platform or service must start with to ensure the experience is maintained. When we speak of 'quality of experience,' it extends beyond the fundamentals of picture and video quality to include audio, overlays, and metadata, which also hold enormous importance. It's a cultural shift. When we think of live events, we need to start thinking fan-first; the experience must be built around fan engagement, whether it's at the venue itself or in terms of a richer multiscreen offering for viewers at home or on the move.

One example is the way we deliver multi-camera angles. It's one thing to provide multiple views of a game, but quite another to synchronize the feeds with high frame accuracy. Fans need confidence that switching between these camera angles won't cause them to miss any of the action.

We could reach a stage where multiple ABR streams are delivered to the end-user while the original feed is buffering. If our industry can implement and execute that level of experience, it will have enormous ramifications for monetization platforms such as sports betting.

It will also open the door for several sports leagues to monetize their content; in MediaKind's 2021 Sports D2C Forecast, we discovered that only 8% of the 40 rights-holders analyzed had engaged with some form of advertising on their OTT service. While ad insertion and product placement are ubiquitous in the broadcast space, the streaming market's next big opportunity is in micro-transactions. For that, we need the right technologies in place so that when content is delivered, frames stay in sync with the various metadata and the viewer sees it in real time.

Latency: Streaming at scale

Real-time viewing is the utopia for sports streaming, but in terms of what is achievable today – looking at features such as multiple camera angles – I think we have to ask: "can we attain a predictable and controllable latency?"

The secret sauce is signal acquisition at the venue. From the moment the signal is encoded, it's about finding ways to manipulate it, creating an inbound channel, and timestamping the data. When the client ultimately receives different streams from multiple camera angles, it is therefore possible to synchronize them. The question is around intelligence within the workflow – and ultimately, we can probably only reduce the lag by a few milliseconds. But lining up the various frames with the timestamps associated with them is essential to creating and maintaining consistent and accurate live sports viewing experiences.
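The alignment step described above can be sketched in a few lines: if every feed's frames carry a shared clock stamped at acquisition, the client can snap a switch to the nearest frame in the target feed. A minimal sketch, assuming such venue-side timestamping; the feed structures and names are illustrative:

```python
# Client-side alignment across camera feeds, assuming frames carry a
# common wall-clock timestamp stamped at acquisition. Illustrative only.

from bisect import bisect_left

def nearest_frame(timestamps: list[float], target: float) -> float:
    """Return the frame timestamp in a feed closest to the target time."""
    i = bisect_left(timestamps, target)
    candidates = timestamps[max(0, i - 1):i + 1]
    return min(candidates, key=lambda t: abs(t - target))

# Two ~30 fps feeds whose frame clocks are offset by 10 ms.
feed_a = [round(0.0333 * n, 4) for n in range(300)]
feed_b = [round(0.0333 * n + 0.010, 4) for n in range(300)]

# Switching away from feed A at t = 5.0 s lands on feed B's closest
# frame, within half a frame interval either way.
t = 5.0
print(nearest_frame(feed_a, t), nearest_frame(feed_b, t))
```

This is the milliseconds-level intelligence in question: the workflow cannot remove the lag, but it can make every feed land on the same moment.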

I enjoyed listening to Chris Xiques describe ViacomCBS's experiences of streaming this year's Super Bowl. Unlike a traditional production process – which would involve a physical 'war room' of 100 people – this year's edition relied on 100 people working from their living rooms: "and if they had a bad internet connection that day, we were going to have a bad Super Bowl!" He explained how CBS adopted two-second transport segments, which meant the streaming latency was reduced from 35-40 seconds in 2019 to just 12 seconds last month. The cable broadcast was delivered at a 10-second latency – and for me, this demonstrates how quickly we are evolving in this space.
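The arithmetic behind shrinking segments is worth spelling out: players typically buffer a few segments ahead, so segment duration multiplies directly into glass-to-glass delay. A rough sketch, using a common three-segment buffer rule of thumb and an assumed fixed overhead – these are not CBS's actual figures:

```python
# Back-of-the-envelope streaming latency. The 3-segment buffer depth and
# 6-second encode/CDN overhead are illustrative rules of thumb.

def stream_latency(segment_s: float, buffered_segments: int = 3,
                   overhead_s: float = 6.0) -> float:
    """Rough glass-to-glass latency: player buffer + encode/CDN overhead."""
    return segment_s * buffered_segments + overhead_s

print(stream_latency(6.0))  # -> 24.0 s with classic 6-second segments
print(stream_latency(2.0))  # -> 12.0 s with 2-second segments
```

Under these assumptions, moving from six-second to two-second segments alone cuts a dozen seconds off the delay, which is consistent with the kind of improvement Chris described.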

Although the Super Bowl is an outlier for large-scale streaming, the expertise and technologies are now available to implement this quality of delivery across the ecosystem. Many lessons can be extrapolated from the broadcaster and operator worlds and be applied directly to the streaming space. To take live streaming to the next level, it’s now about achieving operational readiness and fine-tuning current architecture environments.