How should I encode content for Google TV using HLS? - http-live-streaming

HLS is a feature of Google TV 3.2. What's the best way to encode my content so it can be viewed on a Google TV?

How to Implement HLS for Google TV
HTTP Live Streaming (HLS) is a standard for streaming multimedia content (audio and video) that is supported by Google TV.
There are many cool features that come with HLS. The main ones are:
Adaptive streaming - automatically adapts the bitrate to the available bandwidth and to network congestion.
Resilience to transient network failures.
No special configuration for your server, routers, or firewalls. It’s just HTTP 1.1
Easily supported by Content Delivery Networks
Live streaming is supported (more in a longer article)
HTML5 video tag support in Chrome for Google TV.
Optional AES encryption (more in a longer article).
On Google TV, HLS is a standard protocol: you just pass your URL to any of the media playback APIs, such as MediaPlayer or VideoView, and it just works.
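As a minimal sketch (the playlist URL is a placeholder, and layout and error handling are omitted), playback through the standard VideoView API looks like this:

    // Minimal sketch: HLS playback on Google TV via the standard VideoView API.
    // The playlist URL is a placeholder, not a real stream.
    import android.app.Activity;
    import android.net.Uri;
    import android.os.Bundle;
    import android.widget.VideoView;

    public class HlsPlaybackActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            VideoView videoView = new VideoView(this);
            setContentView(videoView);
            // Hand the variant playlist to the platform player; it handles
            // variant selection and adaptation automatically.
            videoView.setVideoURI(Uri.parse("http://example.com/video/master.m3u8"));
            videoView.start();
        }
    }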
Components of an HLS stream
.m3u8 - Text-based manifest or playlist file (may be updated for live content). A variant playlist usually points to individual manifests that also end in .m3u8; see the example after this list.
MIME type: application/vnd.apple.mpegURL or application/x-mpegURL
.ts - MPEG-2 Transport Stream segments, each typically containing 5-10 seconds of video and audio data.
MIME Type: video/MP2T
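For illustration, a variant playlist is just a short text file; the URIs and bandwidth values below are made up. The first variant listed is the one Google TV starts with (see the note on this below):

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=1240000,RESOLUTION=640x360
    mid/index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2540000,RESOLUTION=1280x720
    high/index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=6040000,RESOLUTION=1920x1080
    full/index.m3u8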
Creating content for HLS
The easiest way to create HLS content is to use Apple's tools, the latest version of Sorenson Squeeze, Telestream's Episode, or one of the many cloud encoding providers. You start with content encoded at several bit rates. On Google TV, playback starts with the first variant listed in the .m3u8 file, so it's usually best to list the 1.2 Mbps stream first.
Google TV supports HLS protocol version 3 as of Google TV firmware version 3.2.
Your content URLs must contain the characters ".m3u8". If a URL doesn't end with ".m3u8", the system will make at least two requests before playback begins, and the MIME type of the playlist must be either "application/vnd.apple.mpegurl" or "application/x-mpegurl".
Note - Google TV doesn't currently support codec switching, so ad segments must use the same encoding as the main content. Developers can, of course, pause HLS playback, play other content, and then resume HLS playback to work around this.
Encoding
Encoding content is as much an art as it is a science. The best choices depend heavily on your content, how fast objects move against the background, and many other factors too numerous to cover in a short post. They also depend on the devices you are targeting. The settings below are optimized for Google TV; older devices may require different or additional encodings. Be aware that certain types of encoding for commercial purposes may require a license and/or the payment of royalties.
Audio encoding should be consistent across all streams. HE-AACv1, HE-AACv2, and AAC-LC up to 48 kHz stereo are all acceptable choices.
16:9 Aspect Ratio**
Dimensions   Total Bitrate (kbps)   Video Bitrate (kbps)   Encoding
640x360      640                    600                    HiP, 4.1
640x360      1240                   1200                   HiP, 4.1
960x540      1840                   1800                   HiP, 4.1
1280x720     2540                   2500                   HiP, 4.1
1280x720     4540                   4500                   HiP, 4.1
1920x1080    6040                   6000                   HiP, 4.1
1920x1080    8196                   8156                   HiP, 4.1
4:3 Aspect Ratio
Dimensions   Total Bitrate (kbps)   Video Bitrate (kbps)   Encoding
640x480      640                    600                    HiP, 4.1
640x480      1240                   1200                   HiP, 4.1
960x720      1840                   1800                   HiP, 4.1
1280x960     2540                   2500                   HiP, 4.1
1280x960     4540                   4500                   HiP, 4.1
Keyframe
The current Google TV implementation only uses the keyframe at the beginning of each segment (for a 10-second segment at 30 fps, that is one keyframe every 300 frames). Apple's suggestion is a keyframe every 90 frames (every 3 seconds at 30 fps). Note - frame rate is a complex subject in its own right.
** Adapted from https://developer.apple.com/documentation/http_live_streaming/hls_authoring_specification_for_apple_devices
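For readers who want a concrete starting point, below is a hedged ffmpeg sketch for producing a single variant roughly matching the 640x360 / 1.2 Mbps row above, with High profile level 4.1, 10-second segments, and a keyframe every 90 frames (3 seconds at 30 fps). The file names are placeholders and the rate-control settings will need tuning for real content.

    # Hedged example: encode one 640x360 HLS variant (placeholder paths, tune bitrates to taste).
    ffmpeg -i source.mp4 \
      -c:v libx264 -profile:v high -level 4.1 \
      -s 640x360 -b:v 1200k -maxrate 1200k -bufsize 2400k \
      -g 90 -keyint_min 90 -sc_threshold 0 \
      -c:a aac -b:a 40k -ar 48000 -ac 2 \
      -f hls -hls_time 10 -hls_list_size 0 \
      -hls_segment_filename 'seg_%03d.ts' index.m3u8

Each bitrate in the tables would get its own run of this kind, and the resulting index.m3u8 files are then referenced from a hand-written or tool-generated variant playlist.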

Related

Godot - Video Player frozen on launch

Good Morning Everyone,
I'm trying to use Godot for a very simple app.
Open to start screen with a looping video.
If button 1 is pressed, change scene, start video 2, return to start at end of video 2
If button 2 is pressed, ^ for video 3
I'm using WebM as my video sources. Sizes are 14.6 MB, 36.8 MB, 37.4 MB.
I have autoplay selected.
However, the video is frozen on frame 1 and no audio plays. The log prints True for is_playing().
Any advice?
Godot's WebM support had many issues, ranging from slow performance to crashes. Consequently, it has been removed in Godot 4.0 to reduce maintenance cost. Whether or not it will make a return as an official plugin is yet to be determined, but some kind of plugin is likely to be the path forward for WebM support in Godot.
Although Godot 3.x may still get WebM patches, I would encourage you to convert the videos to OGV (Ogg Theora). There are plenty of free tools that can do this conversion.
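For example, a single ffmpeg invocation can do the conversion (file names are placeholders; the quality values are just reasonable defaults to adjust):

    # Hedged example: convert WebM to Ogg Theora/Vorbis for use with Godot 3.x.
    ffmpeg -i intro.webm -c:v libtheora -q:v 7 -c:a libvorbis -q:a 5 intro.ogv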

Chrome browser cannot play wav audio files with 1000 sampling rate

I can see that the browser has loaded a WAV file with a sampling rate of 1000 Hz, but the canplaythrough event is never triggered, and manually clicking the play button does not play it (a WAV file with a sampling rate of 8000 Hz plays smoothly). After downloading the 1000 Hz WAV file, it plays fine in the player provided by the operating system. Does Chrome have playback restrictions on audio files with low sampling rates, or is something wrong in the front-end or browser settings? Is there any way to get the browser to play a 1000 Hz WAV file smoothly? I hope you can give me some ideas.
It doesn't clearly state so, but according to the Web Audio API specification, audio buffers with sample rates between 8000 and 96000 Hz have to be supported. This applies to programmatically creating such a buffer, but it's reasonable to assume the same system is used internally when playing <audio> tags.
It's up to browser vendors whether they want to support sample rates outside of this range, but they don't have to (and apparently Chrome doesn't).
Note there are more constraints and browser differences in regards to supported codecs and file formats, for details see this and this page.
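If re-encoding the source is an option, one practical workaround is to resample the file into the range browsers are required to support, for example with ffmpeg (file names and the 8000 Hz target are just illustrative):

    # Hedged example: resample a 1000 Hz WAV to 8000 Hz so browsers will accept it.
    ffmpeg -i tone_1000hz.wav -ar 8000 tone_8000hz.wav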

Remove audio streams from a .m2ts video file

I have a video which has 3 audio streams in the file. The first one is English and the others are in different languages. How can I get rid of the extra audio streams without losing the quality of the video and the English stream?
I think ffmpeg should be used but I don't know how to do it.
Video
Bit rate mode: Variable
Overall bit rate: 38.6 Mb/s
Chroma subsampling: 4:2:0
Audio
Format: DTS-HD
Compression mode: Lossless
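One way to do this with ffmpeg, assuming the English track really is the first audio stream (file names and the stream index below are placeholders; ffprobe can confirm the stream layout first), is to copy the video and only that audio stream without re-encoding, which preserves both the video quality and the lossless DTS-HD audio:

    # Hedged example: keep the video and only the first audio stream, copying both untouched.
    ffmpeg -i input.m2ts -map 0:v -map 0:a:0 -c copy output.m2ts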

mpeg-dash and codecs specification

Looking at the article: http://www.streamingmedia.com/Articles/Editorial/What-Is-.../What-is-MPEG-DASH-79041.aspx
It makes statements like:
DASH is codec-independent, and will work with H.264, WebM and other codecs
DASH supports both the ISO Base Media File Format (essentially the MP4 format) and MPEG-2 transport streams
DASH does not specify a DRM method but supports all DRM techniques specified in ISO/IEC 23001-7: Common Encryption
But how is the audio/video compression, or the DRM method, specified in the Media Presentation? Where can I find more details?
DASH is a streaming protocol - the video stream is inside a 'container' and the container is broken into chunks and streamed. A very high level view of the video component is:
elementary video stream encoded with some codec
fragmented mp4 container (broken into chunks to facilitate ABR)
MPEG DASH streaming protocol
The mp4 container header information contains information about all the streams it contains - this will include the codec that it used to encode the stream (e.g. h.264 for a video stream).
ABR essentially allows the client device or player to download the video in chunks, e.g. 10-second chunks, and select the next chunk from the bit rate most appropriate to the current network conditions.
The DASH manifest (essentially an index file that contains pointers to the different bit rate streams etc) contains header information about the protections systems in use, for example Widevine or PlayReady DRMs.
The mp4 container also contains information about the protection system in a special PSSH (Protection System Specific Headers) header for the protection systems in use, for example again, Widevine or PlayReady.
Generally DASH streams will have the protection information in both places to ensure that all players can play the stream, but last time I looked, I think the spec strictly speaking says it can be in either or both.
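To make that concrete, here is a heavily trimmed, illustrative sketch of an MPD; the codec string, bandwidth, and the Widevine ContentProtection entry are assumptions for the example, not taken from any real stream:

    <?xml version="1.0" encoding="UTF-8"?>
    <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static" minBufferTime="PT2S"
         profiles="urn:mpeg:dash:profile:isoff-on-demand:2011">
      <Period>
        <!-- The codec is declared via the codecs attribute on the AdaptationSet or Representation. -->
        <AdaptationSet mimeType="video/mp4" codecs="avc1.640028">
          <!-- DRM systems are signalled with ContentProtection elements; the schemeIdUri
               UUID identifies the system (this UUID is Widevine's). -->
          <ContentProtection schemeIdUri="urn:uuid:edef8ba9-79d6-4ace-a3c8-27dcd51d21ed"/>
          <Representation id="video-720p" bandwidth="2500000" width="1280" height="720"/>
        </AdaptationSet>
      </Period>
    </MPD>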
The specs themselves are available here:
http://standards.iso.org/ittf/PubliclyAvailableStandards/index.html (search for DASH)
https://www.iso.org/standard/68042.html - unfortunately, this one requires payment AFAIK. You can see a W3C spec which uses it here, however: https://w3c.github.io/encrypted-media/format-registry/stream/mp4.html
And there is a nice overview of DASH here:
https://www.w3.org/2011/09/webtv/slides/W3C-Workshop.pdf
And, of course, the classic reference to some of the drivers for DASH and similar standards:
https://xkcd.com/927/

How do I prevent video buffering on client device when delivered from IIS7

Background:
a. Running IIS7 on Windows 7 (MIME Type set up for mp4 as video/mp4)
b. Using Premiere Pro I have exported a short video as H.264 with the YouTube preset, 1280x720, 29.97 fps
c. Resulting MP4 file is hosted on my IIS Server as A.mp4
d. I upload A.mp4 to YouTube; once processed, I then download both a medium quality and high quality version back and store it on the same IIS Server as Med.mp4 and High.mp4
e. Accessing the three videos via PC (Chrome), and two Android devices (Samsung Galaxy SII, and Samsung Note 8) A.mp4 consistently stalls during playback (I assume buffering) on all three devices whilst no such stalling occurs on Med.mp4 and High.mp4.
f. A.mp4 and High.mp4 comprise the following:
Stream 0
Type: Video
Codec: H264 - MPEG-4AVC (part 10) (avc1)
Resolution 1280x720
Frame Rate: 23.976040
Decoded Format: Planar 4:2:0 YUV
Can anyone advise what it is I am missing here? The files A.mp4 and High.mp4 seem to be identical in format and encoding and yet A.mp4 keeps stalling on all client test devices.
Many thanks and regards
