
How might we handle adaptive streaming in a video info.json? #57

Open
jronallo opened this issue Nov 5, 2016 · 7 comments

Comments

@jronallo
Contributor

jronallo commented Nov 5, 2016

This question is a specialization of how we handle progressive-download videos by listing them out as individual sources. You can see an example of listing sources in #50.

With adaptive streaming formats like HLS and MPEG-DASH, what information should we include for a source? Each of these formats points to a manifest document (a media presentation description) of all the different adaptations available for a video. The client then selects which adaptation to play based on bandwidth and resolution, and can switch sources mid-stream as conditions change. In some cases a single file is used for each adaptation (the on-demand profile, which uses byte-range requests for segments); in others, the source video at each bitrate and resolution is pre-segmented (the live profile).

The assumption is that in some cases multiple sources are still likely to be listed even when an adaptive version is available. This could be because both an HLS and an MPEG-DASH stream are made available to reach more devices. (Currently these use different underlying media formats and so work on different platforms, though the latest HLS allows for fragmented MP4s, as MPEG-DASH has been using.) The listed sources could also include a progressive-download source for clients that don't support the adaptive formats and can't run a client-side library (e.g. hls.js or dash.js).
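As a rough illustration, a source list mixing both adaptive manifests and a progressive fallback might look something like the following. This is only a hypothetical sketch: the property names (`sources`, `id`, `format`, etc.) and URLs are illustrative, not from any draft of the spec.

```json
{
  "@id": "https://example.org/videos/intro",
  "sources": [
    {
      "id": "https://example.org/videos/intro/manifest.mpd",
      "format": "application/dash+xml"
    },
    {
      "id": "https://example.org/videos/intro/playlist.m3u8",
      "format": "application/vnd.apple.mpegurl"
    },
    {
      "id": "https://example.org/videos/intro/fallback-720p.mp4",
      "format": "video/mp4",
      "width": 1280,
      "height": 720
    }
  ]
}
```

A client could work down the list, picking the first source whose format it supports, falling back to the progressive MP4 when neither adaptive format is playable.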

I could see this going two ways. In both cases we would list the URL of the media presentation description document as one of the sources, along with high-level information about that source, such as its format. We could either stop there, or also list out the available adaptations within each adaptive streaming source. Basically, the server creating the info.json for a video could look inside the media presentation description and pull out information about each adaptation. This seems duplicative, but it might be useful to have all of this information available with the single request for the info.json. Would the availability of particular adaptations ever change a client's decision about whether to play an adaptive stream?

Though it isn't required, it is common, recommended practice for the audio adaptation(s) to be separate from the video adaptation(s). This means that if we were going to describe the adaptations for a source, we would be describing the technical details of both the audio and the video, which could be either separate files or multiplexed together.
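If we did go the richer route described above, a server could parse the MPD and surface something like the following for a single adaptive source, with audio and video adaptations described separately. Again, this is purely a hypothetical sketch; the property names (`adaptations`, `representations`, etc.) are made up for illustration, and the codec strings are just plausible H.264/AAC examples.

```json
{
  "id": "https://example.org/videos/intro/manifest.mpd",
  "format": "application/dash+xml",
  "adaptations": [
    {
      "type": "Video",
      "representations": [
        { "bitrate": 4800000, "width": 1920, "height": 1080, "codec": "avc1.640028" },
        { "bitrate": 2400000, "width": 1280, "height": 720, "codec": "avc1.4d401f" }
      ]
    },
    {
      "type": "Audio",
      "representations": [
        { "bitrate": 128000, "codec": "mp4a.40.2" }
      ]
    }
  ]
}
```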

Thoughts on this? Are there other ways we could handle adaptive bitrate streaming in a video info.json?

@jwd

jwd commented Nov 7, 2016

Thanks for writing this up, Jason. My view would be not to make things too complicated and to follow your suggestion of just having an entry in the info.json pointing to the adaptive streaming manifest (M3U8 or MPD) along with as many properties as are common to and known about the sources referenced by that manifest. I agree it should be possible to list adaptive sources alongside standalone sources.

@azaroth42
Member

Completely agree with Jon: let's start simple and get traction first. That would mean simply pointing to the streaming manifest, with the same basic properties as for the other formats.

@jronallo
Contributor Author

jronallo commented Nov 8, 2016

Sounds good to me. I'll work on adding an example to my prototype.

@azaroth42
Member

(outsourced as Prezi can just refer to HLS, and clients use the format specific info)

@azaroth42
Member

Propose close, out of scope / solved by adaptive formats already

@bvibber

bvibber commented Feb 15, 2017

Agree with close / out of scope.

@zimeon
Member

zimeon commented Mar 7, 2017

This issue came up on the 2017-03-07 AV call in the context of a set of streams at different bitrates from Avalon. Agreement on the call was that adaptive streaming built from these is out of scope (use a format that supports it instead), and that the use case is really about user selection, where a label for each stream indicating the difference is adequate.
