Research suggests that within a few years, video content could make up the majority of internet traffic. One major reason is that, today, each business platform has its own container formats, content formats, and streaming protocols. We are all aware that mobile internet usage is expanding rapidly, and video traffic is growing exponentially. The most important challenges I have observed so far are inaccessible video, low Quality of Experience, fragmentation, and cost. In particular:
As a mobile user, I expect a high-quality video experience
Network operators are under pressure to offer a quality experience affordably
To address these problems, it is essential to provide interoperability between different servers and devices. Achieving this interoperability will be instrumental to the growth of the market, because a common ecosystem of content and services can serve a wide set of devices such as TVs, PCs, laptops, set-top boxes, mobile phones, and tablets. Thus, MPEG Dynamic Adaptive Streaming over HTTP (DASH) was launched as a solution.
DASH is not a system, a protocol, a presentation, a codec, an interactivity framework, or a client specification. It is an enabler: it provides formats that make efficient, high-quality delivery of streaming services over the Internet possible. DASH is one component of an end-to-end service; the overall system definition is managed by other organizations (industry fora, SDOs, etc.).
Design of DASH:
Dynamic Adaptive Streaming over HTTP (DASH), also called MPEG-DASH, is, like Apple's HLS, an adaptive-bitrate HTTP-based streaming solution. It enables high-quality media content to be delivered over the internet from conventional HTTP web servers by applying the technique of adaptive bitrate streaming.
MPEG-DASH works by breaking the content into a sequence of small segments, which are served over HTTP. Each segment contains a short interval of playback time of content that may be many hours in total duration, such as a movie or a live broadcast. Unlike HDS or Smooth Streaming, DASH is codec-agnostic, allowing the use of content encoded in any format, such as H.264, VP9, and so on.
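To make the segmentation idea concrete, here is a minimal sketch (in Python, with names and the 4-second segment duration chosen by me for illustration) of how a long presentation maps onto fixed-duration segments, and how a player can compute which segment to request for a given playback position:

```python
import math

SEGMENT_DURATION = 4.0  # seconds per segment (a typical, assumed choice)

def segment_count(total_duration: float) -> int:
    """Number of segments needed to cover the whole presentation."""
    return math.ceil(total_duration / SEGMENT_DURATION)

def segment_index(playback_position: float) -> int:
    """Index of the segment containing the given playback position."""
    return int(playback_position // SEGMENT_DURATION)

# A two-hour movie split into 4-second segments:
movie_seconds = 2 * 60 * 60
print(segment_count(movie_seconds))   # 1800 segments
print(segment_index(125.0))           # position 2:05 falls in segment 31
```

Because each segment is addressable on its own, a player can start anywhere in a multi-hour presentation by requesting only the segment that covers that position.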
MPEG-DASH was developed by the Moving Picture Experts Group (MPEG). Work on MPEG-DASH started in 2010; it reached Draft International Standard (DIS) status in January 2011 and became an International Standard (IS) in November 2011. It was finally published in April 2012 and most recently revised in 2019 as MPEG-DASH ISO/IEC 23009-1:2019.
Technologies such as Adobe Systems' HTTP Dynamic Streaming, Apple Inc.'s HTTP Live Streaming (HLS), and Microsoft's Smooth Streaming are related to DASH. DASH itself builds on Adaptive HTTP Streaming (AHS) and HTTP Adaptive Streaming (HAS).
MPEG-DASH incorporates input from major streaming and media companies such as Google, Microsoft, Ericsson, and Adobe. DASH also defines guidelines for different use cases in practice, and it is integrated with other standards and supported on a wide range of devices.
MPEG-DASH Capabilities:
MPEG-DASH uses the MPD (Media Presentation Description) and segment index information as metadata for the DASH access client. The MPD describes the available segments and their corresponding timing.
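As an illustration, the sketch below parses a hand-written, heavily simplified MPD fragment with Python's standard ElementTree and lists the available representations. The manifest content is my own toy example, not a complete, spec-conformant MPD (real MPDs carry an XML namespace and much more detail):

```python
import xml.etree.ElementTree as ET

# A simplified MPD fragment (illustrative only, not spec-conformant).
MPD_XML = """<MPD mediaPresentationDuration="PT30S">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="720p" bandwidth="3000000" width="1280" height="720"/>
      <Representation id="480p" bandwidth="1500000" width="854" height="480"/>
      <Representation id="240p" bandwidth="400000" width="426" height="240"/>
    </AdaptationSet>
  </Period>
</MPD>"""

def list_representations(mpd_text: str):
    """Return (id, bandwidth) pairs for every Representation in the MPD."""
    root = ET.fromstring(mpd_text)
    return [(r.get("id"), int(r.get("bandwidth")))
            for r in root.iter("Representation")]

print(list_representations(MPD_XML))
# [('720p', 3000000), ('480p', 1500000), ('240p', 400000)]
```

The client reads exactly this kind of list from the MPD to learn which bitrate renditions exist before it requests any media segments.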
MPEG-DASH works on the technique of Adaptive Bitrate Streaming
Adaptive Streaming Concept:
Adaptive streaming technologies (AST) enable an optimal video-watching experience across a diverse range of devices over a broad set of connection speeds.
AST involves creating numerous files from the same source file, so the stream can be distributed to audiences watching on devices of differing power over different connection speeds.
During distribution, the delivered stream adaptively changes to match variations in effective throughput and the available CPU cycles on the playback device.
The operation is transparent to the user: the viewer clicks a single button, and all stream switches and adaptations happen behind the scenes.
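A common way to implement this behind-the-scenes switching is for the player to pick, before each segment request, the highest-bitrate rendition that fits comfortably under the measured throughput. A minimal sketch, with an illustrative bitrate ladder and safety margin of my own choosing:

```python
# Available renditions, in bits per second (illustrative ladder).
BITRATES = [400_000, 1_500_000, 3_000_000, 6_000_000]

def choose_bitrate(throughput_bps: float, safety: float = 0.8) -> int:
    """Pick the highest rendition whose bitrate fits under a safety
    fraction of the measured throughput; fall back to the lowest."""
    budget = throughput_bps * safety
    candidates = [b for b in BITRATES if b <= budget]
    return max(candidates) if candidates else min(BITRATES)

print(choose_bitrate(5_000_000))  # 3000000: 6 Mbps exceeds the 4 Mbps budget
print(choose_bitrate(300_000))    # 400000: below everything, take the floor
```

The safety fraction leaves headroom for throughput estimation error, so the player is less likely to stall while downloading the chosen segment.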
I can see three approaches to adaptive streaming, listed below:
Approach 1 for Adaptive Streaming:
Initially, the server sends the most important video information (e.g., I-frames). Lower-priority video information (e.g., P- and B-frames) follows only if bandwidth and time allow.
Approach 2 for Adaptive Streaming:
The server sends each frame progressively: the most significant part of the frame first, with the remaining frame detail sent only if bandwidth and time permit.
Approach 3 for Adaptive Streaming:
The server encodes the video at multiple bitrates, and the stream delivered is switched among them depending on the device and the available bandwidth. This is the approach most widely used in practice.
Let us consider a real-world example:
I want to watch online video anywhere, at any time, on any device, and I need the video to stream without interruption at the highest resolution possible. At the highest resolution, video looks great, but when bandwidth conditions worsen, for example, when the network changes from Wi-Fi to 3G, the demands of high-resolution video can overload the viewer's connection. To solve this problem, content delivery networks (CDNs) and video providers are turning to Adaptive Bitrate (ABR) streaming.
Adaptive Bitrate Streaming:
Adaptive Bitrate Streaming detects the available bandwidth in real time and fine-tunes the video stream accordingly to deliver the best possible picture quality. The stream dynamically shifts to higher or lower bitrate levels based on resource availability. Though the concept is broadly the same everywhere, there are several different flavors of Adaptive Bitrate Streaming technology: Adobe uses HTTP Dynamic Streaming, Microsoft uses Smooth Streaming, and Apple uses HTTP Live Streaming.
ADOBE - HTTP DYNAMIC STREAMING:
MICROSOFT – SMOOTH STREAMING:
Smooth Streaming delivers minimal buffering and fast start-up times, adapting the video stream quality in real time to the consumer's varying bandwidth and CPU conditions.
Multi-bitrate Smooth Streaming makes this adaptive streaming of content possible, compensating on the fly for changes in conditions during playback.
APPLE – HTTP LIVE STREAMING:
HTTP Live Streaming (HLS) technology from Apple sends live and on-demand audio and video to iPhone, iPad, Mac, Apple Watch, Apple TV, and PC.
HLS permits deploying content using ordinary web servers and content delivery networks over standard HTTP.
Playback is optimized for the available speed of wired and wireless connections. Streams are designed for reliability and dynamically adapt to network conditions as data speeds change.
WORKING OF ABR:
ABR works as follows. First, a video is encoded at several different bitrates to accommodate varying bandwidth connections. Each bitrate version is then divided into small fragments, typically 2 to 10 seconds long.
The online video player pulls fragments from the different encodings and stitches them into the stream as bandwidth dictates.
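Putting the pieces together, the player loop can be simulated: estimate throughput from recent downloads (here with an exponentially weighted moving average, one common smoothing choice), pick a rendition for the next fragment, and repeat. The bitrate ladder, smoothing factor, and network samples below are all illustrative assumptions:

```python
BITRATES = [400_000, 1_500_000, 3_000_000]  # bps, illustrative ladder

def pick(throughput: float) -> int:
    """Highest rendition under 80% of the estimated throughput."""
    fits = [b for b in BITRATES if b <= 0.8 * throughput]
    return max(fits) if fits else min(BITRATES)

def simulate(samples, alpha=0.5):
    """Feed measured per-fragment throughputs (bps) through an EWMA
    and record which rendition the player would fetch each time."""
    estimate = samples[0]
    chosen = []
    for s in samples:
        estimate = alpha * s + (1 - alpha) * estimate  # smooth the estimate
        chosen.append(pick(estimate))
    return chosen

# Network degrades mid-stream (e.g., Wi-Fi to 3G), then recovers:
print(simulate([5_000_000, 5_000_000, 800_000, 800_000, 5_000_000]))
# [3000000, 3000000, 1500000, 400000, 1500000]
```

The smoothing keeps the player from over-reacting to a single slow fragment, which is why it steps down through 1.5 Mbps rather than dropping straight to the floor.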
The result of this process is faster video start times and a continuous, uninterrupted video experience, which means happier viewers and content providers. However, it also creates a major measurement challenge.
A few years ago, I could measure a long movie played from a single file. With ABR, measuring a movie in aggregate over a large viewing audience requires analysing tens of thousands of separate files.
Now consider the sheer volume of ABR files traversing a network with 150 channels. The number of fragment files on such a network reaches the tens of millions for a single bitrate. Add several more bitrates to the equation, and the number of Adaptive Bitrate Streaming files climbs into the hundreds of millions. At this level of fragmentation, the raw measurement data turns into mostly unusable noise.
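To see where numbers of that order come from, here is a back-of-the-envelope calculation. The 4-second fragment duration, round-the-clock channels, one-week window, and 6-rendition ladder are my own assumptions, not figures from a specific deployment:

```python
CHANNELS = 150
SECONDS_PER_DAY = 24 * 60 * 60
FRAGMENT_SECONDS = 4          # assumed fragment duration

# Fragments produced per day across all channels, single bitrate:
per_day = CHANNELS * SECONDS_PER_DAY // FRAGMENT_SECONDS
print(per_day)                # 3240000 fragments per day

# Over a week, a single bitrate already reaches the tens of millions:
print(per_day * 7)            # 22680000 fragments

# With, say, 6 bitrate renditions per channel, the weekly total
# passes one hundred million fragment files:
print(per_day * 7 * 6)        # 136080000 fragments
```

Every one of those fragments is a separate HTTP object, which is why per-file measurement stops being meaningful at this scale.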
Finally, WHY DO I NEED MPEG-DASH?
With new technologies emerging and competencies growing, I am observing that several industries are moving quickly to provide solutions based on MPEG-DASH, including some open-source implementations. I firmly believe the coming years will be crucial for content providers, service and platform providers, software vendors, CDN providers, and device manufacturers to adopt this standard and actively build an interoperable ecosystem for multimedia streaming over the Internet. In my view, the following are the top characteristics of MPEG-DASH that I have experimented with: