Next steps in streaming: The drive to reduce latency

By Tony Jones, Principal Technologist, MediaKind | March 25, 2020 | 4 min read
5G, ABR, All-streaming, Bandwidth efficiency

Last week I had the pleasure of taking part in a virtual ‘fireside chat’ alongside John Moulding, Editor-in-Chief, Videonet, as part of the Connected TV World Summit. Together, we discussed a number of topics from the streaming world, including developments in compression technology; unicast vs multicast delivery; methods of reducing latency; and the possibility of interoperable standards. Below, I provide a summary of some of the main points from our discussion.

Where are we with compression in the era of 5G and smart Wi-Fi?

The development of 5G and smart Wi-Fi presents interesting use cases. The availability of increased capacity means we can apply that additional capacity differently depending on whether the connection is fixed, wireless or mobile. Of course, demand for capacity significantly outstrips the increase in provision, and that means the need for more efficient delivery isn’t going away any time soon. Looking at the growth of 4K content, it’s clear that compression development will continue for a long time to come.

Each generation of coding standards has offered a theoretical improvement of nearly 40% over its predecessor’s reference code, and that’s true of the next generation of codecs as well. Yet, within a given encoding standard, the encoder implementation is almost as important as the standard itself. As I referenced in a blog post last year, it’s still possible to reduce the bitrate using the same encoding standards – even with MPEG-2!

Today of course, the vast majority of streamed content uses MPEG-4 AVC, and through ongoing improvements in video compression formats, we can expect further improvements to the viewing experience. HEVC is well established and, indeed, supported in most client devices. The standardisation of EVC and, in particular, VVC will be especially interesting, as VVC takes another 40% step down in bitrate requirements from HEVC.
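To put that compounding effect into rough numbers, here is a small back-of-the-envelope sketch; the 15 Mbit/s MPEG-2 starting point and the flat 40% saving per generation are illustrative assumptions rather than measured figures for any particular encoder or content type.

```python
# Back-of-the-envelope sketch: how a ~40% per-generation bitrate saving
# compounds across codec generations. The starting bitrate and the flat
# 40% figure are illustrative assumptions, not measured encoder results.

GENERATIONS = ["MPEG-2", "MPEG-4 AVC", "HEVC", "VVC"]
REDUCTION_PER_GENERATION = 0.40  # assumed theoretical saving per step

bitrate_mbps = 15.0  # hypothetical MPEG-2 bitrate for an HD channel
for codec in GENERATIONS:
    print(f"{codec:10s} ~{bitrate_mbps:5.1f} Mbit/s")
    bitrate_mbps *= 1.0 - REDUCTION_PER_GENERATION
```

On those assumptions, the same HD channel drops from roughly 15 Mbit/s in MPEG-2 to around 3 Mbit/s in VVC, which is why each new standard keeps the efficiency story moving.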

Multicast vs Unicast: Where is the media industry heading for live/linear streaming?

This was an interesting question! Of course, in a wired environment, where you have the ability to manage the network, multicast will play an increasing role for specific cases that demand high concurrency. It’s a really interesting area for the media industry in the coming years, but it won’t be something that can be adapted to all environments. Unicast will remain prevalent for handheld devices and in situations where managing the network isn’t an option.

The streaming market is a complex one, and we may well see a split in the technologies used. There are technical and operational reasons why a full switch to multicast would be problematic. First, the bandwidth efficiency of multicast over 5G is simply not as good as unicast delivery if you’re going from point to single point. Multicast over Wi-Fi has some limitations too, although in the home it can work as successfully as unicast. However, in a live environment, the ability to deliver multicast UDP and then pass it onwards to the client is clearly beneficial.
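To make the concurrency argument concrete, here is a minimal sketch using illustrative numbers of my own (the 8 Mbit/s profile bitrate and the viewer counts are assumptions, not figures from the discussion):

```python
# Minimal sketch of why high concurrency favours multicast on a managed
# network: unicast traffic grows with the number of viewers, while an
# idealised multicast stream is sent once on the shared network segment
# regardless of how many receivers have joined. Bitrate and viewer
# counts are illustrative assumptions.

STREAM_MBPS = 8.0  # hypothetical bitrate of one live ABR profile

def unicast_aggregate_mbps(viewers: int) -> float:
    """Every viewer receives their own copy of the stream."""
    return viewers * STREAM_MBPS

def multicast_aggregate_mbps(viewers: int) -> float:
    """Idealised multicast: one copy, however many receivers join."""
    return STREAM_MBPS if viewers > 0 else 0.0

for viewers in (1, 1_000, 1_000_000):
    print(f"{viewers:>9,} viewers: "
          f"unicast ~{unicast_aggregate_mbps(viewers):>12,.0f} Mbit/s, "
          f"multicast ~{multicast_aggregate_mbps(viewers):.0f} Mbit/s")
```

The point is not that multicast is free, but that its cost barely changes as concurrency climbs, whereas unicast cost scales linearly with the size of the audience.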

Broadcast quality streaming: the drive towards low latency

Last year at IBC, we launched our optimized AV solution to enable low latency for ABR delivered live content. Key to the solution is its ability to reduce glass-to-glass latency from 45-60 seconds to 3-7 seconds – the same level as broadcast-type delivery. While there is always demand for lower latency, 3-7 seconds ensures the content passing through the streaming path won’t be spoiled by the likes of very keen social media users. There is very little chance a live, real-time tweet could be written and processed faster than the live streamed data reaches your screen – so rest assured, the result of the next Olympic 100m final should remain for your eyes only in a fully optimized end-to-end chain.

There are two aspects at play with our solution. The first is the use of Common Media Application Format (CMAF) low-latency encoding with HTTP/1.1 chunked transfer encoding, which eliminates some of the unnecessary delays between, for example, a packager and a CDN. With the original ABR formats, the entire segment had to be completed before its length was known and could be signalled for transfer to the subsequent stage. Given each individual segment equates to 6-10 seconds of latency (perhaps 4 seconds for a smaller segment length), it’s a significant issue and one that accumulates at each stage. The combination of CMAF low latency and chunked transfer encoding enables you to move data along the delivery chain at an earlier stage, as you are only waiting for a piece of that data to be ready.
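As a simple illustration of why this matters, the sketch below compares how much waiting a segment-based handoff adds against a chunk-based one; the segment length, chunk length and number of handoff stages are illustrative assumptions rather than figures from our product.

```python
# Sketch of the latency contribution of segment- vs chunk-based handoff.
# With classic ABR delivery, a stage can only forward a segment once it
# is complete; with CMAF low-latency chunks over HTTP/1.1 chunked
# transfer encoding, each chunk can be forwarded as soon as it is ready.
# Durations and stage count below are illustrative assumptions.

SEGMENT_SECONDS = 6.0   # typical ABR segment duration
CHUNK_SECONDS = 0.5     # hypothetical CMAF chunk duration
STAGES = 3              # e.g. packager -> origin -> CDN edge (assumed)

segment_wait = SEGMENT_SECONDS * STAGES  # wait for a full segment per hop
chunk_wait = CHUNK_SECONDS * STAGES      # wait for a single chunk per hop

print(f"Segment-based handoff adds ~{segment_wait:.1f} s over {STAGES} stages")
print(f"Chunk-based handoff adds   ~{chunk_wait:.1f} s over {STAGES} stages")
```

With those assumptions, chunked handoff takes the per-chain contribution from roughly 18 seconds down to well under 2, which is what brings a 3-7 second glass-to-glass figure within reach.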

The second key aspect of our solution is our ‘direct path technology.’ This radically changes the way data is passed from the encoder to the packager, removing buffering delays. Traditionally, we would have applied buffering and turned the encoder’s output into a multi-rate, constant-bitrate connection delivered in real time. Now, however, we can take the data for each picture as soon as it is available – thus eliminating a significant amount of latency.
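As a conceptual sketch of that idea (this is my own illustration of per-picture handoff, not a description of our actual implementation), each encoded picture is pushed to the packager the moment it exists, rather than being smoothed through a constant-bitrate buffer first:

```python
# Conceptual sketch of per-picture encoder-to-packager handoff: each
# encoded picture is handed over as soon as it is available, instead of
# being smoothed through a constant-bitrate output buffer. Frame rate
# and frame count are illustrative assumptions.

import queue
import threading
import time

FRAME_INTERVAL = 1 / 50  # assumed 50 fps source

def encoder(out: queue.Queue, frames: int) -> None:
    for n in range(frames):
        time.sleep(FRAME_INTERVAL)      # pretend to encode one picture
        out.put((n, time.monotonic()))  # hand it over immediately
    out.put(None)                       # end of stream marker

def packager(inp: queue.Queue) -> None:
    while (item := inp.get()) is not None:
        n, encoded_at = item
        handoff_ms = (time.monotonic() - encoded_at) * 1000
        print(f"picture {n}: packaged {handoff_ms:.2f} ms after encode")

q: queue.Queue = queue.Queue()
threading.Thread(target=encoder, args=(q, 5), daemon=True).start()
packager(q)
```

The handoff delay here is effectively zero; in a traditional buffered, real-time CBR connection the packager would instead see each picture only after it had been paced out over the wire.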

Is there a burning need for interoperable standards in the streaming market?

There is no argument that agreed standards across the industry are a desirable outcome – particularly for us at MediaKind. However, a scalable, low-latency streaming solution is needed across every stage of the delivery chain – including the operator stage – and this appears to be missing from projects such as the DVB working group. If we are to migrate towards a more standards-based approach and fully ubiquitous standards, we need to avoid a toolbox approach within the whole chain to ensure that it is fully open.

Another issue to consider is the contradiction between multicast ABR and low latency; multicast ABR inherently adds latency that you don’t want. The issue of latency matters most in exactly the environment where scalability is at its peak – i.e. an enormous sporting event with many millions of people watching simultaneously. With high-value content also come high expectations for quality of delivery.

That is why, when John Moulding closed our discussion with a question about the possibility of an all-streamed World Cup or Olympics within the next five years, I suggested a 10-year time frame as a more tangible possibility. While all-streaming may be available to all viewers within five years, it seems highly unlikely that it will be the only format within this time frame. The ability to deliver at scale to a very large audience must be balanced against expectations of high quality and low latency – so all-streaming such an event seems improbable in the short term.

