Internet Video Streaming — ABR part 2

Eyevinn Technology
7 min read · Mar 14, 2018
Written by: Boris Asadanin, Streaming Media Consultant at Eyevinn Technology

Background

In the first part of this series on OTT video streaming over the internet we covered the basics of ABR streaming. We saw that in ABR streaming a video file is encoded into several quality levels, or bitrates, more commonly referred to as profiles.

Live ABR content streaming adds a little more complexity, since the manifest file content is not fixed: new segments are continuously produced. This article explains live ABR streaming together with recording and storing ABR content. The final chapters describe the main ABR formats in use today, together with some general disadvantages of ABR streaming.

This publication is part of a series of articles describing the principles of the technology behind video streaming. It can be read without any prior knowledge of the subject.

Glossary

  • ABR streaming — Adaptive bitrate streaming
  • Segments — parts of a video file
  • Manifest file — file containing references to each file in the collection of files that are parts of the same video asset
  • Profile — one quality/bitrate level among several for a video file

Live ABR video streaming

So far we have only mentioned videos which have a fixed length and therefore a finite number of segments that can easily be referenced in a manifest file. But what about live content?

Live content may be defined as video of indefinite length without a formal start or stop time. Think of true live content such as a traditional broadcast TV channel. New segments are produced continuously. What about the manifest file then?

In live ABR streaming, new manifest files are produced continuously together with new segments. Each manifest file contains references to a given number of segments (depending on the setup, segment length, etc.). As new segments are produced, new manifest files are produced too, with references to the new segments.

Fig 1: Manifest files referencing subsections of a live ABR stream.

Note that the sets of referenced segments overlap, ensuring that the client always knows which segment comes next in order and that no segment is dropped between manifests.
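To make this concrete, below is a minimal sketch of how such a sliding-window live manifest could be generated. It assumes an HLS-style media playlist, 6-second segments and a 5-segment window; the function and file names are illustrative and not taken from any particular packager.

```python
# Minimal sketch of a sliding-window live manifest (HLS-style media playlist).
# Segment duration, window size and naming are illustrative assumptions.

SEGMENT_DURATION = 6   # seconds per segment
WINDOW_SIZE = 5        # number of segments referenced by each manifest

def make_live_manifest(newest_segment: int) -> str:
    """Build the manifest published right after `newest_segment` was produced."""
    first = max(0, newest_segment - WINDOW_SIZE + 1)
    lines = [
        "#EXTM3U",
        f"#EXT-X-TARGETDURATION:{SEGMENT_DURATION}",
        f"#EXT-X-MEDIA-SEQUENCE:{first}",  # lets the client line up overlapping windows
    ]
    for seq in range(first, newest_segment + 1):
        lines.append(f"#EXTINF:{SEGMENT_DURATION:.1f},")
        lines.append(f"segment_{seq}.ts")
    return "\n".join(lines)

# Two consecutive manifests overlap in the segments they reference:
print(make_live_manifest(10))  # references segments 6..10
print(make_live_manifest(11))  # references segments 7..11
```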

Fig 2: Live stream example, from left to right. Downloaded files are coloured green.

As depicted in figure 2, the client joins the channel by downloading the current manifest file, which references only a few segments. After downloading a few segments, the client downloads the next manifest file, which references new segments, and so on.
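On the client side, the basic live playback loop amounts to repeatedly fetching the latest manifest and downloading whatever segments it has not seen before. The sketch below illustrates this under the same assumptions as above; the URL, file names and helper functions are hypothetical, and real players additionally measure bandwidth and switch between profiles, which is left out here.

```python
# Minimal sketch of a live ABR client loop; URL and names are illustrative.
# Bandwidth measurement and profile switching are deliberately omitted.
import time
import urllib.request

BASE_URL = "https://example.com/live/"  # hypothetical origin/CDN endpoint
POLL_INTERVAL = 6                       # roughly one segment duration, in seconds

def fetch(path: str) -> bytes:
    with urllib.request.urlopen(BASE_URL + path) as response:
        return response.read()

def buffer_segment(data: bytes) -> None:
    """Hand the segment to the decoder/playback buffer (omitted in this sketch)."""

def play_live() -> None:
    downloaded = set()  # names of segments already fetched
    while True:
        manifest = fetch("live.m3u8").decode()
        segment_names = [line for line in manifest.splitlines() if line.endswith(".ts")]
        for name in segment_names:
            if name not in downloaded:      # overlapping manifests: skip what we already have
                buffer_segment(fetch(name))
                downloaded.add(name)
        time.sleep(POLL_INTERVAL)           # wait for the next manifest to be published
```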

Recording and storing ABR streams

Recording an ABR stream is actually quite simple with our newly acquired knowledge. The recorder works exactly like a client player, but with one important difference: the recorder must download and store all segments from all profiles.

Fig 3: Recording session starting with Manifest n. All segments downloaded.

All segments are downloaded to enable streaming sessions at any quality level.
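A minimal sketch of such a recorder is shown below, assuming the same manifest layout as in the earlier examples. The URL, directory layout and profile manifest names are illustrative assumptions, and details such as retries and long-term archiving are left out.

```python
# Minimal sketch of an ABR recorder; URLs, names and layout are illustrative.
# Unlike a player, it downloads every segment from every profile.
import os
import urllib.request

BASE_URL = "https://example.com/live/"
ARCHIVE_DIR = "recording"

def fetch(path: str) -> bytes:
    with urllib.request.urlopen(BASE_URL + path) as response:
        return response.read()

def store(profile: str, name: str, data: bytes) -> None:
    # In a real recorder each profile usually gets its own directory to avoid name clashes.
    directory = os.path.join(ARCHIVE_DIR, profile)
    os.makedirs(directory, exist_ok=True)
    with open(os.path.join(directory, name), "wb") as f:
        f.write(data)

def record_once(profile_manifests: dict[str, str]) -> None:
    """Download one round of manifests and every new segment they reference, for all profiles."""
    for profile, manifest_name in profile_manifests.items():
        manifest = fetch(manifest_name).decode()
        store(profile, manifest_name, manifest.encode())
        for line in manifest.splitlines():
            if line.endswith(".ts") and not os.path.exists(os.path.join(ARCHIVE_DIR, profile, line)):
                store(profile, line, fetch(line))

# Called repeatedly while the recording is active, e.g. once per segment duration:
# record_once({"1080p": "profile_1080p.m3u8", "720p": "profile_720p.m3u8", "360p": "profile_360p.m3u8"})
```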

Storing ABR Recordings

ABR content is stored on disk differently between formats and format versions. In the early days it was common to simply store each video segment and manifest file separately in flat folder structures. This method soon proved unsuitable, because each video asset would consist of tens of thousands of small files. This induced excessive wear and tear on the disks as well as unnecessary latency in streaming, not to mention the maintenance nightmare. More recent ABR storage solutions involve combining the segments into one file. Some solutions even combine all segments with the manifest file, where the manifest references each segment as a byte offset rather than as a unique segment file. In such a streaming session the client starts by downloading only the initial manifest part of the large content file, and then fetches segments as byte ranges of the same file.

Fig 4: One file ABR asset on disk. Initial manifest file referencing each segment as a byte offset within the file.

Note: In this example the available profiles are stored in separate files.
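As a rough illustration of the byte-offset approach, the sketch below fetches a single segment as an HTTP range request against one asset file. The URL and the offset table are made-up assumptions; in a real asset the offsets come from the manifest part at the start of the file.

```python
# Minimal sketch of fetching one segment by byte offset from a single ABR asset file.
# The URL and the offset table are illustrative assumptions.
import urllib.request

ASSET_URL = "https://example.com/vod/asset_720p.mp4"  # one profile stored as one file

# In a real asset these (offset, length) pairs are read from the manifest part of the file;
# here they are hard-coded for illustration.
SEGMENT_INDEX = [
    (1_024, 350_000),    # segment 0
    (351_024, 362_000),  # segment 1
    (713_024, 355_500),  # segment 2
]

def fetch_segment(index: int) -> bytes:
    offset, length = SEGMENT_INDEX[index]
    request = urllib.request.Request(ASSET_URL)
    # Standard HTTP Range header: ask the server for just this slice of the file.
    request.add_header("Range", f"bytes={offset}-{offset + length - 1}")
    with urllib.request.urlopen(request) as response:
        return response.read()
```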

In today’s streaming services there are quite a few more profiles than three. The reason is that we now target many more device types, everything from 60” HDR TVs on 100 Mbps home networks to 4” mobile screens on 3G connections. Watching a low-quality video on a 60” TV would look so bad that you could barely make out the actual picture. Conversely, the HDR material suited for your brand new TV set is probably a complete waste of data for a 4” phone screen, and the connection is probably too weak for it anyway.
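As an illustration only, a profile ladder covering that span of devices might look something like the following; the exact resolutions, bitrates and number of profiles vary from service to service and are not taken from any particular deployment.

```python
# Purely illustrative profile ladder; real services tune resolutions, bitrates
# and codecs to their own content and device mix.
PROFILE_LADDER = [
    {"name": "2160p-hdr", "resolution": (3840, 2160), "bitrate_kbps": 16000},
    {"name": "1080p",     "resolution": (1920, 1080), "bitrate_kbps": 6000},
    {"name": "720p",      "resolution": (1280, 720),  "bitrate_kbps": 3500},
    {"name": "480p",      "resolution": (854, 480),   "bitrate_kbps": 1800},
    {"name": "360p",      "resolution": (640, 360),   "bitrate_kbps": 800},
    {"name": "240p",      "resolution": (426, 240),   "bitrate_kbps": 400},
]
```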

Various ABR Formats

As mentioned in part 1 of this tutorial, the ABR streaming principle broke through to consumer electronics in 2009, when Apple and Adobe both launched their own versions of ABR streaming. Since then a few more ABR streaming formats have been launched, but the main ones are the following:

  • HLS (HTTP Live Streaming): Apple’s format, supported by Apple devices including iPhones, iPads and Mac computers. With its device reach, HLS is one of the dominant formats today and will probably stay so for the foreseeable future.
  • HDS (HTTP Dynamic Streaming): Adobe’s format, supported by the Adobe Flash player, which used to be a common browser add-on and was also available on Android devices. Without natural client device support, and with the Flash player having lost its importance as a browser add-on, HDS is considered dead today, even if the format is still in use here and there.
  • MSS (Microsoft Smooth Streaming): Microsoft’s ABR format, supported by clients built on Windows Phone 7 or on the Silverlight SDK. Having lost its dominance on the device market to Android and iOS, and with MPEG-DASH emerging, MSS development has effectively been discontinued. It remains an active format mainly on PCs running the Internet Explorer browser, but will soon be overtaken by MPEG-DASH.
  • MPEG-DASH: Format developed by the MPEG standardization group to provide a standardized ABR format and bring confidence to the market. Using the standardized MPEG format would enable streaming industry players to record, store, and stream one format only to any client device. Together with HLS, MPEG-DASH is considered the dominant format being deployed in the streaming industry today.

These formats are all built on the same ABR principle explained in part 1, but they have quite a few differences as well. This tutorial does not cover the details of each ABR streaming format, but they differ in which video and audio codecs they support, in DRM functionality, in ad insertion markers, and in other important features.

While it is possible today to build application support for any format on any device, it is hard to make it as optimal as the native format. As each format is also closely tied to a DRM technique, the effort would simply not be worthwhile. Instead, the main format players are slowly converging on the MPEG-DASH standard, and in the near future MPEG-DASH will be supported by all major device vendors. Since Apple owns one of the two dominant formats today, the question is still when, if ever, Apple will start supporting MPEG-DASH. However, that is also beyond the scope of this tutorial.

Disadvantages with ABR streaming

Throughout this tutorial series, ABR streaming has been described in quite a positive light. Even though the ABR principle has brought effective, high-quality ways of streaming live video content over the internet, there are some disadvantages as well. This section describes some of them:

  • Storing multiple bitrates
  • High overhead traffic (chatty protocols)
  • High latency
  • Multiple formats

Storing Multiple Bitrates

As mentioned initially, and depending of course on which client devices are targeted, usually more than three profiles are required for ABR streaming. All these profiles must be created and stored, which requires lots of encoding resources, storage, and maintenance.

High Overhead

As we have seen throughout this tutorial series, ABR video streaming is pull-based for each separate segment, which means a lot of overhead HTTP traffic. Additionally, HTTP is used instead of more lightweight protocols in order to provide quality by ensuring that every single byte is delivered correctly.
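As a rough, back-of-the-envelope illustration (the segment length, session length and header sizes below are assumptions, not measurements):

```python
# Back-of-the-envelope HTTP overhead for one viewer; all numbers are assumptions.
segment_length = 6          # seconds per segment
session_length = 3600       # one hour of viewing, in seconds
overhead_per_request = 700  # request + response headers, in bytes (typical ballpark)

segment_requests = session_length // segment_length   # 600 segment downloads
manifest_requests = session_length // segment_length  # plus roughly one manifest refresh each
total_requests = segment_requests + manifest_requests

print(total_requests)                         # 1200 HTTP requests per viewer-hour
print(total_requests * overhead_per_request)  # ~840 kB of pure protocol overhead
```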

High Latency

The segment lengths, in combination with large client buffers (further explained in the client player part of this tutorial) and the complexity of encoding multi-profile live content, result in high latency. Latencies of 30–40 s are common. The latency can be lowered quite simply by minimizing the segment lengths, but only down to around 10 s before quality starts to suffer. For latency-sensitive live content such as sports and betting, this is unacceptable.
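As a rough illustration of where that figure comes from (the segment length, buffer depth and encoding delay below are assumptions):

```python
# Rough, illustrative latency budget for a live ABR stream; all numbers are assumptions.
segment_length = 10          # seconds per segment
client_buffer_segments = 3   # the player typically buffers a few segments before starting
encoding_and_packaging = 8   # encoding, packaging and CDN propagation delay, in seconds

glass_to_glass = encoding_and_packaging + client_buffer_segments * segment_length
print(glass_to_glass)        # 38 seconds, i.e. in the 30-40 s range mentioned above
```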

Multiple formats

The various ABR streaming formats supported by different client devices are still a huge problem today. While the industry is starting to see MPEG-DASH as the emerging primary format, there are still quite a few technical aspects that have to be resolved before industry players can encode, record, and stream content in one format only.

Final Words

Almost done! Having now described the ABR streaming formats and their advantages and drawbacks, part three explains some of the efforts to bring down latency, as well as some alternative solutions to OTT streaming.

Eyevinn Technology is the leading independent consulting firm specializing in video technology and media distribution, and the proud organizer of the yearly Nordic conference Streaming Tech Sweden.
