I've been looking for a solution to stream .avi video files for a while now and I can't find anything.
I found Plex, which provides a web interface for a media library. And Plex does indeed allow playback of .avi videos in its web interface!
I saw that it uses blob: URLs, so I guess the file is being segmented somehow?
I was wondering if you have any idea how they do this magic?
I have a content creation site I am building and I'm confused about audio and video.
If I have a content creator's audio or video stored in S3 and I want to display their file, will the HTML video or audio player stream the media, or will it download it fully and then play it?
I ask because the video or audio could be significantly long, like 2 hours for example. I need to know how to handle that use case.
Lastly, what file type is most widely accepted for viewing on web pages? It seems like MPEG-4 is the best bet. Is that true?
Most video player clients and browsers will attempt to stream the video if they can.
For an mp4 video file hosted on a server, so long as the header is at the start and the server accepts range requests, the player will download the video in chunks and start playing as soon as it has enough to decode the first frames.
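As an illustration (not a definitive recipe), you can remux an mp4 so the header (moov atom) sits at the front using ffmpeg's faststart flag, and check whether the server advertises byte-range support; the file names and URL below are hypothetical:

    # Sketch: prepare an mp4 for progressive playback and check range support.
    # Assumes ffmpeg is installed and the `requests` package is available;
    # "input.mp4", "web_ready.mp4" and the URL are placeholders.
    import subprocess
    import requests

    # Move the moov atom (the index/header) to the start of the file so playback
    # can begin before the whole file has downloaded.
    subprocess.run([
        "ffmpeg", "-i", "input.mp4",
        "-c", "copy",                 # no re-encode, just remux
        "-movflags", "+faststart",
        "web_ready.mp4",
    ], check=True)

    # Check that the hosting server accepts byte-range requests.
    resp = requests.head("https://example.com/web_ready.mp4")
    print(resp.headers.get("Accept-Ranges"))  # "bytes" means range requests are supported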
More professional streaming services will generally use an adaptive bit rate streaming protocol like DASH or HLS (see this answer: https://stackoverflow.com/a/42365034/334402); again the video will be streamed in chunks, or segments, and will start playing while it is still streaming.
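If you just want to see the segmented approach in action, ffmpeg can split an already encoded mp4 into HLS segments plus a playlist; a rough sketch with placeholder file names:

    # Sketch: package an existing h.264/AAC mp4 into HLS segments + playlist.
    # Assumes ffmpeg is installed; "input.mp4" and the output names are placeholders.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "input.mp4",
        "-c", "copy",            # keep the existing encoding, just re-package
        "-hls_time", "6",        # roughly 6 second segments
        "-hls_list_size", "0",   # keep every segment in the playlist (VOD style)
        "-f", "hls",
        "playlist.m3u8",
    ], check=True)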
To answer your last question, you need to be aware that the raw video is encoded (e.g. h.264, VP9 etc.) and the video, audio, subtitle etc. tracks are stored in a video container (e.g. mp4, WebM etc.).
The most common combination is probably h.264 encoding in an mp4 container at this time.
The particular h.264 profile can also matter depending on the device - baseline is probably the most widely supported profile at this time. You can find examples of media support for different devices online, e.g. for Android: https://developer.android.com/guide/topics/media/media-formats
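If it helps, a rough ffmpeg invocation for a broadly compatible h.264 baseline / AAC mp4 might look like this (file names and bitrates are only placeholders):

    # Sketch: encode to h.264 baseline profile in an mp4 container for wide device support.
    # Assumes ffmpeg with libx264; "source.mov" / "output.mp4" are placeholders.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "source.mov",
        "-c:v", "libx264",
        "-profile:v", "baseline", "-level", "3.0",
        "-pix_fmt", "yuv420p",          # widest decoder compatibility
        "-c:a", "aac", "-b:a", "128k",
        "output.mp4",
    ], check=True)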
@Mick's answer is spot on. I'll just add that mp4 (with h264 encoding) will work in just about every browser out there.
The issue with mp4 files (especially with a 2 hour long movie) isn't so much the seeking & streaming. If your creator uploads a 4K video, that's what you'll deliver to everyone (even mobile phones). HLS streaming, on the other hand, has adaptive bitrates, where the video adapts to both the screen and the available network speed. You'll get better playback results with less buffering (and, if you're using AWS, a LOT LESS data egress) with adaptive streaming.
(there are a bunch of APIs and services that can help you do this - including api.video (where I work), Mux and others).
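To make the adaptive part concrete, here is a very rough sketch of producing a two-rendition HLS ladder with a master playlist using ffmpeg; the bitrates, resolution and file names are arbitrary examples, not a tuned ladder:

    # Sketch: build a two-rendition adaptive HLS ladder with a master playlist.
    # Assumes ffmpeg with libx264; file names, bitrates and resolutions are
    # arbitrary examples rather than a production ladder.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "input.mp4",
        # map the video and audio twice, once per rendition
        "-map", "0:v:0", "-map", "0:a:0", "-map", "0:v:0", "-map", "0:a:0",
        "-c:v", "libx264", "-b:v:0", "3000k", "-b:v:1", "800k",
        "-filter:v:1", "scale=-2:360",   # downscale the low-bitrate rendition
        "-c:a", "aac", "-b:a", "128k",
        "-f", "hls", "-hls_time", "6", "-hls_list_size", "0",
        "-var_stream_map", "v:0,a:0 v:1,a:1",
        "-master_pl_name", "master.m3u8",
        "stream_%v.m3u8",
    ], check=True)

The player is then pointed at master.m3u8 and picks whichever rendition fits the screen and network at the time.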
I have a common use case scenario where I want to do the following:
Upload an audio file (WAV/MP3).
Transcode it to 128k or 192k MP3.
Store the audio asset.
Allow the audio asset to be streamed.
Support streaming actions such as play, pause and seek.
The documentation for Azure Media Services makes it seem like it might support this, but I am not too sure; it looks like they focus on video content. Does anyone have experience with this?
You can manage audio and encode audio-only assets with Azure Media Services.
WAV is a supported input format/container for an input asset. To see the full list of supported formats, check the following link:
https://azure.microsoft.com/en-us/documentation/articles/media-services-media-encoder-standard-formats/
Check https://github.com/Azure/azure-content/blob/master/articles/media-services/media-services-custom-mes-presets-with-dotnet.md#audio_only to see the audio-only preset options you can use to encode an audio-only asset.
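The Media Encoder Standard preset above is the Azure-native route; for comparison, if you ever did the 128k/192k MP3 step from the question yourself with ffmpeg, it would look roughly like this (file names are placeholders):

    # Sketch: transcode an uploaded WAV to 128k and 192k MP3 with ffmpeg.
    # Assumes ffmpeg is built with libmp3lame; file names are placeholders.
    import subprocess

    for bitrate in ("128k", "192k"):
        subprocess.run([
            "ffmpeg", "-i", "upload.wav",
            "-codec:a", "libmp3lame", "-b:a", bitrate,
            f"asset_{bitrate}.mp3",
        ], check=True)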
We all know about gaana.com and saavn.com; those websites stream MP3 audio files to the client side but don't allow users to grab the audio files. We want to know what technology they use to stream the MP3 audio files.
Are they using a streaming server or something else?
Can you describe the technology they are using to stream the audio files?
We are also creating a web app where audio files will be streamed on the client side, and like gaana.com or saavn.com we don't want to allow users to download our MP3 files.
We are also curious what we should do if we want to stream our MP3 audio files in three different qualities. Should we convert all the MP3 files into the three different qualities and upload them to the server, or does another solution exist for this purpose?
If you want to code your own streaming server, then you can use this link:
https://pypi.python.org/pypi/DeeFuzzer/ - it's a Python-based streaming server. You can also use ffmpeg or even VLC.
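If you do roll your own server, note that the main thing the HTML audio player needs for play/pause/seek is byte-range support; a minimal Flask sketch (hypothetical paths, and be aware this does not stop a determined user from saving the file):

    # Sketch: serve MP3 files with byte-range support so the HTML <audio> element
    # can play and seek without downloading the whole file first. Assumes Flask
    # is installed; the "media" directory and URL rule are placeholders, and this
    # does NOT prevent downloads - it only enables streaming and seeking.
    from flask import Flask, send_from_directory

    app = Flask(__name__)

    @app.route("/audio/<name>")
    def audio(name):
        # conditional=True makes the response honour Range requests (HTTP 206),
        # which is what lets the player seek into a long file.
        return send_from_directory("media", f"{name}.mp3",
                                   mimetype="audio/mpeg", conditional=True)

    if __name__ == "__main__":
        app.run()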
I've successfully implemented OpenTok. In a group discussion (three people, including the moderator), video and audio transmission works well, but the problem starts when the moderator records the stream. The video clip downloaded from the OpenTok server has no sound.
Does anyone have any idea what can be wrong?
Thanks to Ankur (OpenTok Forum):
The audio stream in the downloaded videos is in the SPEEX codec. Many desktop audio players don't recognize the codec. You may need to use ffmpeg to transcode the audio before it is playable in programs like VLC.
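For reference, the transcode might look roughly like this with ffmpeg (the archive and output file names are just examples; adjust to whatever container OpenTok actually gave you):

    # Sketch: re-encode an OpenTok archive whose audio track is Speex into a file
    # with widely supported codecs. Assumes ffmpeg with Speex decoding support;
    # "archive.flv" / "playable.mp4" are placeholder names.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "archive.flv",
        "-c:v", "libx264",               # re-encode video for mp4 compatibility
        "-c:a", "aac", "-b:a", "128k",   # replace the Speex audio with AAC
        "playable.mp4",
    ], check=True)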
Is it possible to see the live stream of an IP camera using RTSP?
Example URL: rtsp://public ip:554/1363e66e.mp4
The encoding is mp4 h.264 baseline profile at 320 x 240 resolution.
I followed the Wiki link here.
But I get the error: Prefetch error -2
When I try to play using real player on the nokia e72, I get the error: 'General: System Error'.
Please let me know what I can do about this.
There are no video players in the Ovi Store that can play the stream either, but I am able to play the stream with VLC on the desktop.
You can stream it using RealPlayer if you don't have VLC player in the Ovi Store. Check the port range supported by your IP camera; try the range 1024 - 2000. RTSP streams can be played by VLC, QuickTime and RealPlayer. Using any of these players you can stream it.
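If it helps with debugging, you can inspect the stream's actual codecs and profile from a desktop before fighting with the phone; a small sketch using ffprobe (the URL is a placeholder):

    # Sketch: inspect an RTSP stream's codecs/profile with ffprobe to rule out
    # encoding issues. Assumes ffprobe (part of ffmpeg) is installed; the URL
    # below is a placeholder for the camera's real address.
    import subprocess

    subprocess.run([
        "ffprobe", "-v", "error",
        "-rtsp_transport", "tcp",     # use TCP in case UDP ports are blocked
        "-show_streams",
        "rtsp://<public-ip>:554/1363e66e.mp4",
    ], check=True)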
So I think this is the case:
There are a few different mp4 container variants. The standard one won't allow you to wrap real-time data into an mp4 container, because a regular mp4 needs a metadata atom (the MOOV atom, which indexes the media data in the MDAT atom and records its size), and that can only be written after the file is completely encoded.
So unless you work around that, you cannot stream live content in mp4 UNLESS it is fragmented mp4.
Media Foundation will allow you to do this when Windows 8 is out (I got that intel from the MSDN forum, so I don't know how true it is).
I don't know exactly what ffmpeg/GStreamer are capable of here. Again, if this is a commercial product you are working on, you might run into some licensing issues with ffmpeg.
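On the ffmpeg point, it does have a fragmented-mp4 mode that writes the metadata incrementally instead of at the end; a rough sketch (placeholder names, and whether it fits your pipeline is another question):

    # Sketch: produce fragmented mp4 with ffmpeg, so the output can be consumed
    # while it is still being written. Assumes ffmpeg with libx264; the input
    # and output names are placeholders.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "live_source.ts",
        "-c:v", "libx264", "-c:a", "aac",
        # Write an empty moov up front and a moof per fragment so no
        # end-of-file index is required.
        "-movflags", "frag_keyframe+empty_moov+default_base_moof",
        "-f", "mp4",
        "fragmented.mp4",
    ], check=True)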
Look at WebRTC.
I am guessing the best bet is to use WebM or Ogg/Theora, but I am not sure if Theora can do what you want. This is something I am also working on.
Please share your findings.
Thanks.