Why does HLS choose TS container? - http-live-streaming

I read the HTTP Live Streaming specification.
But I don't understand why HLS chose the TS format as its standard.
HLS is based on HTTP, which is a reliable transport, so I would think the PS (Program Stream) format would be more appropriate than TS (Transport Stream).
Does anyone know the technical history behind this choice?

It was a choice Apple made, and they have not explained why they made it, so anyone who answers is just guessing.
I believe the most likely reason is that many video decoder chips have native support for TS parsing.
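One thing that makes TS friendly to hardware is its rigid framing: a TS stream is just a sequence of fixed-size 188-byte packets, each starting with the sync byte 0x47, with the key header fields at fixed bit positions. A minimal parsing sketch in Python (the packet layout follows the MPEG-2 TS header; the sample packet at the bottom is made up for illustration):

```python
def parse_ts_packets(data: bytes):
    """Split an MPEG-2 Transport Stream into 188-byte packets and
    extract a few header fields from each one."""
    PACKET_SIZE = 188
    SYNC_BYTE = 0x47
    packets = []
    for offset in range(0, len(data) - PACKET_SIZE + 1, PACKET_SIZE):
        pkt = data[offset:offset + PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            raise ValueError(f"lost sync at offset {offset}")
        packets.append({
            # Payload Unit Start Indicator: set when a new PES packet begins.
            "pusi": bool(pkt[1] & 0x40),
            # 13-bit Packet Identifier: which elementary stream this belongs to.
            "pid": ((pkt[1] & 0x1F) << 8) | pkt[2],
            # 4-bit continuity counter, used to detect dropped packets.
            "cc": pkt[3] & 0x0F,
        })
    return packets

# One synthetic packet: sync byte, PUSI set, PID 0x100, continuity counter 5.
packet = bytes([0x47, 0x41, 0x00, 0x15]) + bytes(184)
print(parse_ts_packets(packet))
# → [{'pusi': True, 'pid': 256, 'cc': 5}]
```

The fixed packet size is also what lets HLS cut a stream into segments at almost any packet boundary, which suits the segment-based delivery model.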

Related

Web Audio live streaming

There is an audio stream that is sent from a mobile device to the server, and the server sends chunks of data (via WebSockets) to the web client.
The question is: what should I use to play this audio in live mode? There should also be a possibility to rewind the audio, listen to what came before, and then switch back to live mode.
I have considered the Media Source API, but it isn't supported by Safari or Chrome on iOS, is it? We need that support.
There is also the Web Audio API, which is supported by modern browsers, but I'm not sure whether it allows listening in live mode and rewinding.
Any ideas or guides on how to implement it?
I have considered the Media Source API, but it isn't supported by Safari or Chrome on iOS, is it? We need that support.
Then you can't use Media Source Extensions. Thanks, Apple!
And the server sends chunks of data (via WebSockets) to the web client.
Without Media Source Extensions, you have no way of using this data from a WebSocket connection. (Unless it's PCM, or you're decoding it to PCM, in which case you could use the Web Audio API, but that is impractical and inefficient, and not something you should pursue.)
You have to change how you're streaming. You have a few choices:
Best Option: HLS
If you switch to HLS, you'll get the compatibility you need, as well as the ability to go back in time and what not. This is what you should do.
Mediocre Option: HTTP Progressive
This is a fine way to stream for most use cases but there isn't any built-in way to handle the stream seeking that you want. You'd have to build it, which is not worth your time since you could just use HLS.
Even More Mediocre Option: WebRTC
You could switch to WebRTC for streaming, but it greatly increases infrastructure costs and complexity, and you would still need to figure out how to handle seeking. The only reason to go the WebRTC route is if you absolutely need the lowest latency.
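To make the HLS option concrete: the server only has to write segment files and keep a sliding-window M3U8 playlist up to date, and "going back in time" is just serving older segments (or a longer playlist). A minimal sketch of live playlist generation in Python (segment names and durations are illustrative; the tags are standard HLS playlist tags):

```python
def make_live_playlist(first_seq: int, segments: list,
                       target_duration: int = 6) -> str:
    """Render a live HLS media playlist for a sliding window of segments.

    segments is a list of (uri, duration_seconds) pairs; first_seq is the
    media sequence number of the first segment still in the window.
    """
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        f"#EXT-X-MEDIA-SEQUENCE:{first_seq}",
    ]
    for uri, duration in segments:
        lines.append(f"#EXTINF:{duration:.3f},")
        lines.append(uri)
    # No #EXT-X-ENDLIST tag: that marks a finished VOD playlist, so
    # omitting it tells players to keep polling for new segments.
    return "\n".join(lines) + "\n"

print(make_live_playlist(42, [("seg42.ts", 6.0), ("seg43.ts", 6.0)]))
```

The client periodically refetches the playlist; rewinding amounts to requesting segments with lower sequence numbers that the server has kept around.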

If I can't use WebRTC, what can I use right now for live streaming video

I'm working on a web app in node.js to allow clients to view a live streaming video via a unique url that another client will broadcast from their webcam, i.e., http://myapp.com/thevideo
I understand that WebRTC is still not supported in enough browsers to be useful.
I would also like to save the video stream so it can be viewed later within the app.
Things get somewhat confusing as I try to narrow down a solution to make this work.
I would like some recommendations on proven solutions to make this work on desktop and mobile. Any hints would be great.
I'll make a quick suggestion based on the limited details. I would use ffmpeg to encode to HLS. This format will play back natively on iOS and in Safari on Mac. For all other platforms, either provide an RTMP stream with a Flash front end, or use the commercial version of JW Player 6, which can play HLS. Or use a Wowza server to handle all of this for you.
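As a sketch of the "ffmpeg to HLS" suggestion, here is one plausible invocation assembled in Python. The input URL and output path are hypothetical, and the flag values (segment length, window size) are just reasonable defaults, but the flags themselves are standard ffmpeg HLS muxer options:

```python
import shlex

def hls_encode_command(input_url: str, out_playlist: str) -> list:
    """Build an ffmpeg command line that transcodes a live input to HLS."""
    return [
        "ffmpeg",
        "-i", input_url,        # e.g. an RTMP ingest URL from the broadcaster
        "-c:v", "libx264",      # H.264 video, widely decodable
        "-c:a", "aac",          # AAC audio
        "-f", "hls",            # HLS muxer
        "-hls_time", "6",       # target segment duration in seconds
        "-hls_list_size", "5",  # keep a 5-segment sliding window
        out_playlist,           # e.g. /var/www/live/thevideo.m3u8
    ]

cmd = hls_encode_command("rtmp://localhost/live/thevideo", "thevideo.m3u8")
print(shlex.join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```

Serving the resulting .m3u8 and .ts files from any static HTTP server then covers the "view later" requirement as well, since the segments are ordinary files on disk.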

HTML5 audio codecs, support for other formats besides OGG

I started tinkering with HTML5 recently and am very interested in the audio tag. One thing immediately caught my attention: it appears that Ogg is the only format supported in Firefox!
I understand that this is because MP3 and other codecs are proprietary and require a license to use. But how is HTML5 audio (and video) going to catch on if you can only use Ogg? Like it or not, the world is currently hooked on MP3 and Apple's AAC.
And to further hinder things, it seems that Mozilla intends to natively support only Ogg.
So I'm curious: why can't Firefox and other browsers use system-installed codecs for playing media? Why do codecs have to be built into the browser, or depend on technologies like Flash? It just seems like bad design to me.
Perhaps I'm just naive about this, but this whole codec war is a nuisance. Can anyone point me to documentation, legal background, or other information on why browsers can't use system-installed codecs? I would also like to hear other users' opinions on this. Thanks!
Edit: In case this comes across as not being a programming question, I want to clarify that this issue directly affects web programmers. For example, in the case of audio support, do we have to use Flash, or will these issues eventually be handled in HTML5? Where do things seem to be going, technology-wise, with regard to this issue?
The <object> or <embed> tag can be used to reference any media type and delegate its handling outside the browser. The purpose of HTML5's media elements is to standardize a common encoding (a lowest common denominator, if you will) across platforms and provide a browser-native player. The Ogg container and its encodings make perfect sense for that.
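In practice, the format split is worked around at the markup level: the <audio> element accepts multiple <source> children, and the browser plays the first format it supports, with a text fallback for browsers that support neither (file names here are hypothetical):

```html
<audio controls>
  <!-- Firefox and other free-codec browsers pick the Ogg source -->
  <source src="song.ogg" type="audio/ogg">
  <!-- Browsers with licensed MP3 decoders pick this one instead -->
  <source src="song.mp3" type="audio/mpeg">
  Your browser does not support the audio element.
</audio>
```

So authors who want broad coverage ship the same audio in two encodings rather than waiting for a single codec to win.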

How to intercept and apply effects to Firefox audio/sound output

I want to build a Firefox extension that will allow me to directly manipulate the audio output from (for example) a streaming video site, applying live filters and effects. I'm struggling to find any good resources to help me. I think the effects part will be fine, but I need to find a way to intercept the audio stream output. Does anyone know if this is possible?
Thanks,
Tom
Sorry, it's not possible. I don't know of any web browsers that expose an interface that lets you manipulate audio output.
Keep in mind that a lot of audio output comes from plug-ins like Flash, and those plug-ins are sending the audio output directly to your operating system - they're not even routing it through your web browser. So it wouldn't be possible for your web browser to intercept the sound if it wanted to.

Is there any good way to get microphone audio input to server using just a web browser?

I need to get audio input from users who have just a browser (and not only IE). Is there a good way to get an audio stream from a browser to the server?
Is it possible to avoid Java, Flash, etc.?
Thanks.
You could do it with Flash.
Have a look at http://livedocs.adobe.com/flash/9.0/main/wwhelp/wwhimpl/common/html/wwhelp.htm?context=LiveDocs_Parts&file=00000297.html
P.S.: HTML alone can't do it; you would need a plugin.
No. You need a plugin of some sort. HTML itself doesn't support microphone input.
-- EDIT
As answered a few seconds before me, you can use third-party plugins such as Flash and Java, but you cannot avoid using them.
