When video or audio is played from a URI, is it streamed or is it downloaded fully and then played?

I am building a content creation site and I'm confused about audio and video.
If a content creator's audio or video is stored in S3 and I want to display their file, will the HTML video or audio player stream the media, or will it download the whole thing and then play it?
I ask because the video or audio could be significantly long - 2 hours, for example. I need to know how to handle that use case.
Lastly, what file type is most widely supported for viewing on web pages? It seems like MPEG-4 is the best bet. Is that true?

Most video player clients and browsers will attempt to stream the video if they can.
For an mp4 video file hosted on a server, as long as the header is at the start of the file and the server accepts range requests, the player downloads the video in chunks and starts playing as soon as it has enough data to decode the first frames.
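To make that concrete, here is a rough sketch of what a range-aware server does (using Node.js and Express purely as an illustration; the file path and port are made up, and S3 already handles range requests for you, so you would not normally write this yourself):

import express from "express";
import fs from "fs";

const app = express();
const FILE = "/media/example.mp4"; // hypothetical path

app.get("/video", (req, res) => {
  const { size } = fs.statSync(FILE);
  const range = req.headers.range; // e.g. "bytes=0-"
  if (!range) {
    // No Range header: send the whole file.
    res.writeHead(200, { "Content-Length": size, "Content-Type": "video/mp4" });
    fs.createReadStream(FILE).pipe(res);
    return;
  }
  // Serve only the requested byte range with a 206 Partial Content response.
  const [startStr, endStr] = range.replace("bytes=", "").split("-");
  const start = parseInt(startStr, 10);
  const end = endStr ? parseInt(endStr, 10) : size - 1;
  res.writeHead(206, {
    "Content-Range": `bytes ${start}-${end}/${size}`,
    "Accept-Ranges": "bytes",
    "Content-Length": end - start + 1,
    "Content-Type": "video/mp4",
  });
  fs.createReadStream(FILE, { start, end }).pipe(res);
});

app.listen(3000);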
For more professional streaming services, they will generally use an adaptive bit rate streaming protocol like DASH or HLS (see this answer: https://stackoverflow.com/a/42365034/334402) and again the video will be streamed in chunks, or segments, and will start playing while it is streaming.
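As a hedged illustration of the client side, a player library such as hls.js can attach an HLS manifest to a normal <video> element so the segments stream as they play (the manifest URL and element id below are placeholders):

import Hls from "hls.js";

const video = document.querySelector<HTMLVideoElement>("#player")!;
const manifestUrl = "https://example.com/stream/master.m3u8"; // hypothetical URL

if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(manifestUrl);   // fetch and parse the HLS playlist
  hls.attachMedia(video);        // feed segments into the <video> element
  hls.on(Hls.Events.MANIFEST_PARSED, () => video.play());
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Safari plays HLS natively.
  video.src = manifestUrl;
}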
To answer your last question, you need to be aware that the raw video is encoded (e.g. h.264, VP9 etc) and the video, audio, subtitle etc tracks are stored in a container (e.g. mp4, WebM etc).
The most common combination at this time is probably h.264 encoding in an mp4 container.
The particular h.264 profile can also matter depending on the device - baseline is probably the most widely supported profile at this time. You can find examples of media support for different devices online, e.g. for Android: https://developer.android.com/guide/topics/media/media-formats
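If you want to check what a particular browser claims to support, a quick probe with canPlayType looks roughly like this (the codec strings are just common examples, not a definitive list):

// Returns "", "maybe" or "probably" depending on the browser's confidence.
const probe = document.createElement("video");
console.log(probe.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')); // baseline H.264 + AAC
console.log(probe.canPlayType('video/webm; codecs="vp9, opus"'));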

Mick's answer is spot on. I'll just add that mp4 (with h264 encoding) will work in just about every browser out there.
The issue with mp4 files (especially with a 2 hour long movie) isn't so much the seeking and streaming. If your creator uploads a 4K video, that's what you'll deliver to everyone (even mobile phones). HLS streaming, on the other hand, has adaptive bitrates - the video adapts to both the screen and the available network speed. You'll get better playback results with less buffering (and, if you're using AWS, a LOT LESS data egress) with adaptive streaming.
(there are a bunch of APIs and services that can help you do this - including api.video (where I work), Mux and others).

Related

How to play live audio stream using Google Actions Dialogflow

I have been trying to find a way to play a live audio stream (mp3) using Google Actions but haven't found a way to do so.
I tried Media Response as well but as mentioned in the documentation it doesn't support live stream.
I followed this thread but it doesn't have any examples to help me.
Is it possible to play live mp3 stream using Google Actions?
I've had relatively good results with the Media Player being able to handle mp3 "streams". There are a couple of problems doing this, however:
There is a time limit on the audio playback (4 hours last time I checked, but it may have changed).
There isn't any such thing as an mp3 "stream". The player treats it as a single mp3 file that it downloads in chunks using HTTP headers, unlike some of the streaming protocols that allow for varying bitrate based on network and other conditions.
If this is an issue, one alternative might be to use the Interactive Canvas (which uses Chrome on the device) to present an HTML page that has an <audio> tag in it that you control. This gives you a little more control (most streaming protocols are either supported natively or have JavaScript libraries that can do the work; see the sketch after the list below), but there are some downsides:
This will only work on Smart Displays and Android. Smart Speakers aren't supported.
Interactive Canvas is only allowed for certain types of Actions. Currently it must be a game, a story, or an educational Action.
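For what it's worth, the web-app side of that Canvas approach can be as small as the sketch below (the element id and stream URL are made up, and a protocol like HLS would still need one of the JavaScript libraries mentioned above):

// Control a plain <audio> element from the Canvas web app.
const audio = document.querySelector<HTMLAudioElement>("#live-audio")!;

function playStream(url: string): void {
  audio.src = url; // a progressive mp3 URL; HLS/DASH would need a player library
  audio.play().catch((err) => console.error("Playback failed:", err));
}

playStream("https://example.com/live/stream.mp3"); // placeholder URL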

Displaying mjpeg/h264 live streaming (with additional information) on a web page?

Right now my goal is to grab the video stream from an IP surveillance camera and display it on a web page.
The camera can encode the stream as either h264 or mjpeg, and transmits it over the RTSP protocol.
The stream has to be available for several kinds of devices (mainly computers, Android smartphones and iPhones).
According to my findings it seems like the best option for doing that (in terms of latency) is to transmit the frames of the video through a websocket:
http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets.
Almost all the implementations of this mechanism I've found are based on mjpeg since it's easier to get the video frames.
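As far as I can tell, the client side of those mjpeg-over-websocket implementations boils down to something like this sketch (the endpoint and element id are placeholders, and I haven't verified it against any particular project):

// Each binary WebSocket message is assumed to be one JPEG frame,
// which is decoded and painted onto a <canvas>.
const canvas = document.querySelector<HTMLCanvasElement>("#cam")!;
const ctx = canvas.getContext("2d")!;
const ws = new WebSocket("wss://example.com/camera"); // hypothetical relay of the RTSP source
ws.binaryType = "blob";

ws.onmessage = async (event) => {
  const frame = await createImageBitmap(event.data as Blob); // decode the JPEG
  ctx.drawImage(frame, 0, 0, canvas.width, canvas.height);
  frame.close();
};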
There's also an h264 player: https://github.com/131/h264-live-player, based on https://github.com/mbebenita/Broadway, which I didn't manage to run (I would appreciate any help in that respect).
Now the first question is: is it worth trying to work with h264 (since it saves a lot of bandwidth), or would the h264 decoding process introduce too much latency?
I would also like to ask if anyone knows a better solution than the one I'm trying to implement.
Finally, by "additional information" I mean that I might want to include some additional data associated with certain video frames (something like subtitles or telemetry data).

Windows Azure live media encoders provide live transcoding?

I have a simple question - I want to stream live video + audio. I would like to use Windows Azure for that (mainly because it seems to provide HLS with AES protection, which I have not encountered in open-source solutions, and pricing per streaming user that is clear for managers). I am troubled by the following quote:
Currently, Media Services does not provide a live transcoding service.
You can use one of the following third party live encoders that output
RTMP or Smooth Streaming formats: Elemental, Envivio, Cisco, RGB
encoders output Smooth Streaming; Adobe Flash Live, Wirecast and
Teradek encoders output RTMP.
And a few lines after
You can deliver your live stream in any of the following formats:
Smooth Streaming, DASH and HLS. When doing live streaming, HLS is
packaged dynamically and the default HLS packaging ratio is 3 Smooth
fragments to 1 HLS segment (3:1).
...
Configure a live transcoder.
Every time you reconfigure the transcoder, call the Reset method on
the channel.
So no transcoding is provided yet I shall set up a transcoder... What? How?
In FFmpeg there are 2 types of transcoding:
from one encoded data format to another (say PCM raw data to encoded MP3 frames)
from one container/packet format to another (say repacking already encoded audio/video from MP4 into FLV, with the same encoded data inside)
Are they trying to tell me that they provide repacking of frames from RTMP to HLS, but no live encoding into another compression type (say from Speex audio to AAC)?
As I answered on your other post, you can use a tool like Wirecast 6 to encode your live stream and push it to the Azure ingest URL. We will give you a publish URL that can dynamically package the content into HLS, Smooth Streaming and DASH.
For more information, please refer to this post: http://azure.microsoft.com/blog/2014/09/10/getting-started-with-live-streaming-using-the-azure-management-portal/
Yes. The second type of transcoding you describe is better called transpackaging, because no video coding is done.
Transcoding is not provided. Transpackaging is provided.

MKV, MP4, or FLV for web video streaming

I'm currently on the fence about which container I should use for the videos I put on my website.
I recently started uploading videos of gameplay/walkthroughs and saw the need for a container that can hold HD video without limitations on file size, codecs (AAC or AVC), or resolution (in the future I want to be able to support 5K video), plus 5.1 Dolby Digital and better audio. Of course I don't expect the 5K to be efficient to stream; I just want it to be available.
This is where the confusion started.
I currently use the .flv container because people state it is better all around: less resource-intensive, widely used, and it supports the common codecs. The problem with it is simple: it cannot support the HD content I want to show, 5.1 Dolby audio, and unlimited file size.
MP4 is everything I need, but I have heard that it can be slow to respond, that pseudostreaming modules are not widely supported, and I don't have time to change containers every time someone wants to update to .mp5, 6, 12, etc.
That's why I am also considering .mkv as the container. .MKV supports everything I want (HD, 3D), all codecs, it is universal, and it has no limits on file attributes. THE ONLY problem is that it cannot be streamed.
I know this is a programmers' site, but maybe in the future, since web connections can only improve, I or someone else could write an Apache module for .mkv streaming. I don't know where to find the source for an Apache module, so I cannot do it at this time.
I'm torn between .flv and .mkv. I'm not really concerned about .mp4 because if I want to be future-proof I need .mkv, and if I'm not concerned about the future or updates I should stay with .flv.
What do you all think? Would it really be so difficult to program a .mkv streaming module?
Excluding web streaming, which of the three would be better all around - video quality (AAC/AVC), file size limits, universality, web support, etc.?
Thanks,
You can use the Windows Media streaming platform; it will take care of most of these problems for you. However, MP4 with h264 video and AAC audio, streamed/played with Flash, is also good.

RTSP live streaming from IP Camera

Is it possible to see the live stream of an IP camera using RTSP?
Example URL: rtsp://public ip:554/1363e66e.mp4
The encoding is mp4 h.264 baseline profile at 320 x 240 resolution.
I followed the Wiki link here.
But I get the error: Prefetch error -2
When I try to play using real player on the nokia e72, I get the error: 'General: System Error'.
Please let me know what I can do about this.
There are no video players in the Ovi store that can play the stream either, but I am able to play the stream with VLC on the desktop.
You can stream it using RealPlayer if you don't have VLC in the Ovi store. Check the port range supported by your IP camera; try the range 1024 - 2000. RTSP is supported by VLC, QuickTime and RealPlayer; with any of these players you can stream it.
So I think this is the case:
There are a few different mp4 container variants. The standard one won't let you wrap real-time data into an mp4 container, because mp4 needs a metadata atom in its header (the 'moov' atom) that describes the file and its size, and that atom is only written once the file is completely encoded.
So unless you fake that, you cannot stream live content in mp4 UNLESS it is fragmented mp4.
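If you do go the fragmented-mp4 route, the browser side would look roughly like the sketch below, feeding segments to a <video> element through Media Source Extensions (the URLs and codec string are placeholders, so treat it as an illustration rather than working code):

const video = document.querySelector<HTMLVideoElement>("#player")!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", async () => {
  // Codec string is a placeholder for baseline H.264.
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  // Append the init segment first, then media segments as they arrive.
  for (const url of ["/init.mp4", "/seg1.m4s", "/seg2.m4s"]) {
    const data = await (await fetch(url)).arrayBuffer();
    await new Promise<void>((resolve) => {
      sb.addEventListener("updateend", () => resolve(), { once: true });
      sb.appendBuffer(data);
    });
  }
  // For a live stream you would keep appending instead of calling endOfStream().
});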
Media Foundation will allow you to do this when Windows 8 is out (I got that intel from the MSDN forum, so I don't know how true it is).
I don't know what ffmpeg/GStreamer are capable of. Also, if this is a commercial product you are working on, you might run into licensing issues with ffmpeg.
Look at WebRTC.
I am guessing the best bet is to use WebM or Ogg/Theora, but I am not sure if Theora can do what you want. This is something I am also working on.
Please share your findings
Thanks.
