What Is MPEG-DASH: Dynamic Adaptive Streaming Protocol

by Rahul Rana, May 15th, 2021

Research suggests that in a few years, video content could make up most of internet traffic, yet streaming still struggles with inaccessible video, a low quality of experience, fragmentation, and high cost. One major reason is that, these days, each business platform has its own container formats, content formats, and streaming protocols. We are all aware that mobile Internet usage is expanding rapidly and that video traffic is growing exponentially. The most important challenges that I have perceived so far are:

As a mobile user, I expect a high-quality video experience.

Network operators need to offer that quality of experience affordably.

To avoid these problems, it makes sense to provide interoperability between different servers and devices. Achieving this interoperability will be instrumental for the growth of the market, because a common ecosystem of content and services can then reach a whole set of devices: TVs, PCs, laptops, set-top boxes, mobile phones, and tablets. Thus, MPEG Dynamic Adaptive Streaming over HTTP (DASH) was launched as a solution.

WHAT IS DASH (Dynamic Adaptive Streaming over HTTP)

DASH is not a system, a protocol, a presentation, a codec, interactivity, or a client specification. It is an enabler: it provides formats that enable efficient, high-quality delivery of streaming services over the Internet. DASH is considered one component of an end-to-end service, where the system definition is managed by other organizations (fora, SDOs, etc.).

Design of DASH:

  • Helps us reuse existing technologies (containers, codecs, DRM, etc.)
  • Enables deployment on top of HTTP CDNs (web infrastructure, caching)
  • Enables a very high-quality user experience (low start-up delay, no rebuffering, trick modes)
  • Enables selection based on network and device capability and user preferences
  • Permits seamless switching and live and DVD-like experiences
  • Moves intelligence from the network to the client, enabling client diversity
  • Enables deployment flexibility (e.g., live, on-demand, time-shift viewing) and defines modest interoperability points (profiles)

WHAT IS MPEG-DASH?

Dynamic Adaptive Streaming over HTTP (DASH) is also called MPEG-DASH. Just like Apple's HLS solution, MPEG-DASH is an adaptive-bitrate, HTTP-based streaming solution. It enables high-quality media content to be delivered over the internet from conventional HTTP web servers by applying the technique of adaptive bitrate streaming.

MPEG-DASH works by breaking content into a sequence of small segments, which are served over HTTP. Each segment contains a short interval of playback time from content that is potentially many hours in duration, such as a movie or a live broadcast. Unlike HDS or Smooth Streaming, DASH is codec-agnostic, allowing the use of content encoded in any format, such as H.264, VP9, and so on.
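
To make the segment idea concrete, below is a minimal sketch of a DASH-style client fetching numbered segments over plain HTTP in Python. The base URL, the seg_<n>.m4s naming pattern, and the segment count are hypothetical; a real player derives all of them from the MPD instead of hard-coding them.

    # Minimal sketch of DASH-style segment fetching over plain HTTP.
    # The base URL and "seg_<n>.m4s" naming pattern are hypothetical;
    # a real player derives them from the MPD's segment information.
    import urllib.request

    BASE_URL = "https://cdn.example.com/movie/720p"  # hypothetical CDN path
    SEGMENT_COUNT = 5                                # fetch only a few segments here

    def fetch(url: str) -> bytes:
        """Download one resource as an ordinary HTTP object."""
        with urllib.request.urlopen(url) as resp:
            return resp.read()

    if __name__ == "__main__":
        # The initialization segment carries codec/container setup data.
        buffered = [fetch(f"{BASE_URL}/init.mp4")]
        for i in range(1, SEGMENT_COUNT + 1):
            # Each media segment covers a few seconds of playback time.
            buffered.append(fetch(f"{BASE_URL}/seg_{i}.m4s"))
        print(f"Fetched {len(buffered) - 1} media segments")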

MPEG-DASH was developed within MPEG. Work on MPEG-DASH started in 2010; it reached Draft International Standard (DIS) status in January 2011 and became an International Standard (IS) in November 2011. The standard was finally published in April 2012 and was most recently revised in 2019 as MPEG-DASH ISO/IEC 23009-1:2019.

Technologies such as Adobe Systems' HTTP Dynamic Streaming, Apple Inc.'s HTTP Live Streaming (HLS), and Microsoft's Smooth Streaming are related to DASH. DASH itself is based on Adaptive HTTP Streaming (AHS) and HTTP Adaptive Streaming (HAS).

DASH incorporates features from major streaming and media companies like Google, Microsoft, Ericsson, Adobe, and others. Guidelines are also defined for different use cases in practice, and DASH is integrated with other standards and supported on a wide range of devices.

MPEG-DASH Capabilities:

  • Enables live, on-demand, and time-shift services
  • Allows request sizes to be independent of segment sizes
  • Supports two segment formats (see the sketch after this list):
  • ISO BMFF – segments based on the ISO Base Media File Format (extensions)
  • MPEG-2 TS – segments based on the MPEG-2 Transport Stream (extensions)
  • Provides guidelines for integrating any other segment format
  • MPEG-DASH is codec-independent
  • Supports server/client component synchronization (e.g., separate and multiplexed AV)
  • Enables targeted ad insertion and content descriptors for protection, accessibility, rating, etc.
  • Purposely defines quality metrics
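
As a toy illustration of those two segment formats (this check is not part of the DASH specification, just a sketch), the snippet below guesses a segment's container from its leading bytes: MPEG-2 TS packets are 188 bytes long and begin with the sync byte 0x47, while ISO BMFF segments begin with a box header whose 4-byte type field is something like ftyp, styp, sidx, or moof.

    # Sketch: distinguish the two DASH segment formats by their leading bytes.
    def sniff_segment_format(data: bytes) -> str:
        # MPEG-2 TS: fixed 188-byte packets, each starting with sync byte 0x47.
        if len(data) >= 189 and data[0] == 0x47 and data[188] == 0x47:
            return "MPEG-2 TS"
        # ISO BMFF: 4-byte box size followed by a 4-byte box type.
        if len(data) >= 8 and data[4:8] in (b"ftyp", b"styp", b"sidx", b"moof"):
            return "ISO BMFF"
        return "unknown"

    print(sniff_segment_format(b"\x47" + bytes(187) + b"\x47" + bytes(187)))  # MPEG-2 TS
    print(sniff_segment_format(bytes(4) + b"styp" + bytes(16)))               # ISO BMFF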

MPEG-DASH uses the MPD (Media Presentation Description) and segment index information as the metadata consumed by the DASH access client. The MPD is the description of the accessible segments and their corresponding timing.
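
For a feel of what the MPD contains, here is a hedged sketch: a stripped-down, hypothetical MPD embedded as a string and parsed with Python's standard library to list each representation and its bandwidth. Real MPDs carry much more, such as multiple periods, segment timelines, and protection or accessibility descriptors.

    # Sketch: parse a stripped-down, hypothetical MPD and list its representations.
    import xml.etree.ElementTree as ET

    MPD_XML = """<?xml version="1.0"?>
    <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
         mediaPresentationDuration="PT10M">
      <Period>
        <AdaptationSet mimeType="video/mp4">
          <SegmentTemplate media="video_$RepresentationID$_$Number$.m4s"
                           initialization="init_$RepresentationID$.mp4"
                           duration="4" startNumber="1"/>
          <Representation id="360p" bandwidth="800000" width="640" height="360"/>
          <Representation id="720p" bandwidth="2400000" width="1280" height="720"/>
        </AdaptationSet>
      </Period>
    </MPD>"""

    NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}
    root = ET.fromstring(MPD_XML)
    for rep in root.findall(".//dash:Representation", NS):
        print(rep.get("id"), rep.get("bandwidth"), "bps",
              rep.get("width") + "x" + rep.get("height"))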

WORKING OF MPEG-DASH:

MPEG-DASH works on the technique of adaptive bitrate streaming.

Adaptive Streaming Concept:

Adaptive streaming technologies (AST) enable an optimal streaming-video watching experience for a diverse range of devices over a broad set of connection speeds.

AST involves creating numerous files from the same source file, so the content can be distributed to audiences watching on differently powered devices over different connection speeds.

The distribution then adaptively changes which stream is delivered, responding to changes in effective throughput and in the CPU cycles available on the playback device.

The operation is transparent to the user: the viewer clicks one button, and all stream switching and adaptation is handled behind the scenes.
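
To make the "switching behind the scenes" idea concrete, here is a minimal sketch of throughput-driven rendition selection. The bitrate ladder, the 80% safety margin, and the sample throughput values are all illustrative assumptions, not values from any particular player.

    # Sketch: pick the highest rendition whose bitrate fits under measured throughput.
    LADDER_BPS = [400_000, 800_000, 1_600_000, 3_000_000, 6_000_000]  # hypothetical ladder
    SAFETY = 0.8  # use only ~80% of estimated throughput to absorb fluctuations

    def estimate_throughput(samples_bps):
        """Average of recent segment download rates, in bits per second."""
        return sum(samples_bps) / len(samples_bps)

    def choose_rendition(samples_bps):
        budget = SAFETY * estimate_throughput(samples_bps)
        eligible = [bps for bps in LADDER_BPS if bps <= budget]
        return eligible[-1] if eligible else LADDER_BPS[0]  # fall back to the lowest rung

    print(choose_rendition([5_000_000, 4_200_000, 4_800_000]))  # -> 3000000
    print(choose_rendition([900_000, 700_000, 650_000]))        # -> 400000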

TYPES OF APPROACHES IN ADAPTIVE STREAMING TECHNOLOGIES:

I can see three approaches to adaptive streaming that are effective and impressive, and they are listed below:

Approach 1 for Adaptive Streaming:

Initially, the server sends the most important video information (e.g., I-frames). Once that high-importance information is sent, the lower-priority video information (e.g., P- and B-frames) follows, but only if bandwidth and time allow.

Approach 2 for Adaptive Streaming:

The server sends each frame progressively: the most important part of the frame is sent first, and the remaining frame detail follows only if bandwidth and time permit.

Approach 3 for Adaptive Streaming:

The video on the server is encoded at multiple bitrates, and the bitrate that is delivered is adjusted depending on the device and the available bandwidth.

Let us consider a real-world example:

I want to watch online video anywhere, anytime, on any device, and I need it streamed without interruption at the highest resolution possible. At the highest resolution, video looks great, but when bandwidth conditions worsen, for example when the network changes from Wi-Fi to 3G, the demands of high-resolution video overload the viewer's connection. To solve this problem, content delivery networks (CDNs) and video providers are turning to adaptive bitrate (ABR) streaming.

Adaptive Bitrate Streaming:

Adaptive bitrate streaming detects the available bandwidth in real time and fine-tunes the video stream accordingly to deliver the best possible picture quality. The stream is dynamically switched to higher or lower bitrates based on the resources available. Though the concept is generally the same, I can see several different flavors of adaptive bitrate streaming technology: Adobe uses HTTP Dynamic Streaming, Microsoft uses Smooth Streaming, and Apple goes with HTTP Live Streaming.

ADOBE - HTTP DYNAMIC STREAMING:

  • Robust, scalable delivery
  • Support for standard HTTP caching systems
  • Unparalleled reach
  • Open-source file specifications
  • Adaptive bitrate
  • Live or on-demand streaming support
  • Multiple video codecs

MICROSOFT – SMOOTH STREAMING:

Smooth Streaming gives me minimal buffering and a fast start-up time, meeting consumers' expectations by adapting the video stream quality in real time to the consumer's varying bandwidth and CPU conditions.

Multi-bitrate Smooth Streaming makes this adaptive delivery of content possible, compensating on the fly for changes in conditions during playback.

APPLE – HTTP LIVE STREAMING:

HTTP Live Streaming (HLS) technology from Apple sends live and on-demand audio and video to iPhone, iPad, Mac, Apple Watch, Apple TV, and PCs.

HLS permits deploying content over ordinary web servers and content delivery networks from the same network source.

Playback is optimized for the available speed of wired and wireless connections. Streams are designed for reliability and dynamically adapt to network conditions as the available data rate changes.

WORKING OF ABR:

For ABR to work, a video is first encoded at different bitrates to accommodate connections with varying bandwidth. Every bitrate version is then divided into small fragments, typically 2 to 10 seconds long, as in the sketch below.
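
As one possible way to produce such a ladder, the sketch below drives ffmpeg's dash muxer from Python. It assumes a reasonably recent ffmpeg build is on the PATH; the input file name, the two rungs (800 kbps at 360p and 2400 kbps at 720p), and the 4-second segment duration are illustrative choices rather than a recommended encoding ladder.

    # Sketch: produce a two-rung DASH bitrate ladder with ~4-second segments.
    # Assumes a recent ffmpeg build with the dash muxer; all names and rates
    # here are illustrative.
    import subprocess

    cmd = [
        "ffmpeg", "-i", "input.mp4",
        "-map", "0:v", "-map", "0:v", "-map", "0:a",   # two video rungs + one audio track
        "-c:v", "libx264", "-c:a", "aac",
        "-b:v:0", "800k", "-s:v:0", "640x360",          # low rung
        "-b:v:1", "2400k", "-s:v:1", "1280x720",        # high rung
        "-seg_duration", "4",                           # ~4-second media segments
        "-use_template", "1", "-use_timeline", "1",
        "-adaptation_sets", "id=0,streams=v id=1,streams=a",
        "-f", "dash", "manifest.mpd",
    ]
    subprocess.run(cmd, check=True)  # writes manifest.mpd plus init and .m4s segments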

The online video player pulls the fragments from different encodings and inserts them into the stream as bandwidth dictates.

The result of this process is faster video start times and a continuous, uninterrupted video experience, which makes for happier viewers and content providers. However, it also creates a major measurement challenge.

A few years ago, I could measure a long movie that played from a single file. With ABR, measuring a movie in aggregate over a large viewing audience requires analysing tens of thousands of separate files.

Now, let me consider the sheer volume of ABR files coursing over a network with 150 channels. The number of fragment files on such a network runs into the tens of millions for a single bitrate. Add several more bitrates to the equation, and the number of adaptive bitrate streaming files climbs into the hundreds of millions. This level of fragmentation turns the measurement data into a lot of unusable noise.
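
To see how the numbers climb that fast, here is a rough back-of-the-envelope calculation. The 2-second fragment length, the one-week window, and the five-rung ladder are assumptions chosen purely for illustration.

    # Back-of-the-envelope fragment count for a 150-channel network.
    # Fragment length, time window, and ladder size are illustrative assumptions.
    CHANNELS = 150
    SECONDS_PER_WEEK = 7 * 24 * 3600   # 604,800 seconds
    FRAGMENT_SECONDS = 2
    BITRATES = 5                       # a modest ABR ladder

    per_bitrate = CHANNELS * SECONDS_PER_WEEK // FRAGMENT_SECONDS
    total = per_bitrate * BITRATES
    print(f"{per_bitrate:,} fragments per bitrate")  # 45,360,000 -> tens of millions
    print(f"{total:,} fragments across the ladder")  # 226,800,000 -> hundreds of millions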

Finally, WHY DO I NEED MPEG-DASH?

Recently, with new technologies emerging and competencies growing, I have observed that several industries are moving quickly to provide solutions based on MPEG-DASH, including a number of open-source implementations. I firmly believe that the coming years will be a crucial time for content providers, service and platform providers, software vendors, CDN providers, and device manufacturers to join hands around this standard and actively build an interoperable ecosystem for multimedia streaming over the Internet. Based on my experiments, below are the top characteristics of MPEG-DASH:

  • Provides sufficient flexibility
  • Rich and simple at the same time
  • Easy to adapt to more detailed market needs
  • Profiles are created by considering user requirements
  • Integration is done in collaboration with the creators of end-to-end systems
  • Uses standard HTTP web server infrastructure, the same infrastructure that delivers all other World Wide Web content
  • Supports both un-chunked and chunked delivery
  • Supports both separate and combined AV
  • Provides an index format for efficient byte-range access
  • Applies the ISO base media file format with Common Encryption
  • With its stream and track annotations, it is currently the best candidate for an open standard for adaptive streaming