I have a video streaming server that provides an HTTP API for live video streams. A stream is sent as multipart/x-mixed-replace, so each video frame is delimited with a boundary string such as --DigifortBoundary. Each frame also comes with its own Content-Type header, which, according to this particular streaming server's documentation, can be one of the following:
image/jpeg
image/wavelet
video/mpeg
video/h263
video/h264
Example of a stream:
--DigifortBoundary
Content-Type: image/jpeg
Content-Length: 35463

JPEG_DATA
JPEG_DATA
..
..
..
JPEG_DATA
--DigifortBoundary
Content-Type: image/jpeg
Content-Length: 34236

JPEG_DATA
JPEG_DATA
..
..
..
JPEG_DATA
The problem is that I need to embed a video player in an HTML page, but I could not find any player that supports the multipart/x-mixed-replace content type, or even streaming over HTTP at all. I know the Flash video players out there usually support RTMP or RTSP, but I've never heard of a player that supports HTTP video streaming.
Do you know any web video player that can do it?
On the client side, VLC and Firefox can do it - probably lots of others can, too.
On the server side:
http://en.wikipedia.org/wiki/Motion_JPEG#M-JPEG_over_HTTP mentions three:
MJPG-Streamer: http://sourceforge.net/projects/mjpg-streamer/
ffmpeg-server as part of ffmpeg http://www.ffmpeg.org/
cambozola http://www.charliemouse.com:8080/code/cambozola/
This is what I personally found out:
MJPG-Streamer only compiles on Linux flavors (it does not compile on, e.g., Mac OS X)
ffmpeg-server on Ubuntu, if installed with apt-get install, is likely outdated and buggy
Cambozola seems to be more of a standalone client
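Also worth knowing: since the stream is Motion JPEG over HTTP, browsers that support multipart/x-mixed-replace (Firefox, for one) will render it directly in an <img> element, with no player at all. A minimal sketch, where http://camera.example/stream is a placeholder for your server's stream URL:

// Embed the MJPEG stream directly; the browser swaps in each JPEG part
// as it arrives. The URL below is a placeholder, not a real endpoint.
const img = document.createElement("img");
img.src = "http://camera.example/stream";
document.body.appendChild(img);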
Related
Hi guys, I want to build an app like OBS to stream video to a live video web service (Twitch, YouTube) with the RTMP protocol, using Node.js.
The website gives me these two tokens to go live:
Stream URL:
rtmp://rtmp.cdn.asset.aparat.com:443/event
Stream Key:
ea1d40ff5d9d91d39ca569d7989fb1913?s=2f4ecd8367346828
I know how to stream from OBS, but I want to stream music from a server 24 hours a day.
I'd be happy if you could help me with libraries or examples.
If there are other languages better suited to this work, please tell me.
Thank you.
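For reference, here is a minimal sketch of what I have in mind: spawn ffmpeg from Node.js and let it loop a local file forever. This assumes ffmpeg is on the PATH; the file names and the stream key below are placeholders.

// 24/7 audio-to-RTMP sketch: ffmpeg loops a music file and pairs it with
// a still image as the video track (most RTMP services expect video).
const { spawn } = require("child_process");

const streamUrl = "rtmp://rtmp.cdn.asset.aparat.com:443/event";
const streamKey = "YOUR_STREAM_KEY"; // placeholder - keep the real one secret

const ffmpeg = spawn("ffmpeg", [
  "-re",                 // read input at its native rate
  "-stream_loop", "-1",  // loop the audio file forever
  "-i", "music.mp3",     // placeholder audio file
  "-loop", "1",
  "-i", "cover.png",     // placeholder still image for the video track
  "-c:v", "libx264",
  "-tune", "stillimage",
  "-pix_fmt", "yuv420p",
  "-c:a", "aac",
  "-b:a", "160k",
  "-f", "flv",
  `${streamUrl}/${streamKey}`,
]);

ffmpeg.stderr.on("data", (d) => process.stderr.write(d));
ffmpeg.on("exit", (code) => console.log("ffmpeg exited with code", code));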
my workflow is:
Using ffmpeg I send an RTMP stream to WOWZA App1.
App1 sends the stream to a second, internal app (App2).
App2 transcodes and packetizes to HLS, and is the origin for a CloudFront distribution.
CloudFront serves the streams to users.
The player on the user side is based on HLS.js.
To prepare for different failure scenarios, I forced App2 to restart during a test transmission. App1 kept receiving the stream from ffmpeg and trying to send it to App2; once App2 was back up, the link was re-established and App1 continued sending the stream, but there was no video on the client side.
Before the restart, chunklist.m3u8 lists chunks up to the 17th: media-u3510ez40_17.ts
Then, while App2 is restarting, chunklist.m3u8 does not exist and CloudFront returns a 404 error.
When App2 is back, chunklist.m3u8 lists a new set of chunks starting at 1, with a new ID: media-u1ofkjj9w_1.ts
The problem is that there is no video, and the network traffic shows that the browser does not download the newly listed chunks.
chunklist.m3u8 keeps adding new chunks, but the browser does not download any of them... until the 18th chunk appears, and the video restarts.
I tried this many times and the problem is always the same: before the restart the last chunk has some number N, and after the restart there is no video until the N+1th chunk appears, even though the IDs are different.
I don't know if this issue is in Wowza, CloudFront, or the HLS.js player :/
chunklist.m3u8 Before Restart:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:9
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-DISCONTINUITY-SEQUENCE:0
#EXTINF:8.333,
media-u3510ez40_1.ts
#EXTINF:8.333,
media-u3510ez40_2.ts
#EXTINF:8.334,
.
.
.
media-u3510ez40_16.ts
#EXTINF:8.333,
media-u3510ez40_17.ts
chunklist.m3u8 After Restart:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:17
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-DISCONTINUITY-SEQUENCE:0
#EXTINF:16.396,
media-u1ofkjj9w_1.ts
#EXTINF:8.333,
media-u1ofkjj9w_2.ts
.
.
.
media-u1ofkjj9w_16.ts
#EXTINF:8.333,
media-u1ofkjj9w_17.ts
#EXTINF:8.333,
media-u1ofkjj9w_18.ts
You need to set the cupertinoCalculateChunkIDBasedOnTimecode property to true in the Add Custom Property section of the Wowza Streaming Engine application. Refer to this: https://www.wowza.com/docs/how-to-configure-apple-hls-packetization-cupertinostreaming
Note that this helps when the encoder sends synchronized timecodes to Wowza. For encoders that don't send synchronized timecodes, I suggest configuring absolute timecodes regardless of whether the encoder sends them. This will help the application recover at the N+1 chunk number after the restart.
The page below will help you configure it properly.
http://thewowza.guru/how-to-set-stream-timecodes-to-absolute-time/
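To illustrate, with timecode-based chunk numbering the post-restart chunklist picks up where the old one left off, so the chunk the player is waiting for actually exists. Roughly like this (segment names follow the question's pattern; the exact values are assumed):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:9
#EXT-X-MEDIA-SEQUENCE:18
#EXT-X-DISCONTINUITY-SEQUENCE:1
#EXTINF:8.333,
media-u1ofkjj9w_18.ts
#EXTINF:8.333,
media-u1ofkjj9w_19.ts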
About the session-ID change: when using Wowza as the origin for a CloudFront distribution, you need to enable httpOriginMode and disable httpRandomizeMediaName to make it work properly. The Wowza doc below will help you set it up properly: https://www.wowza.com/docs/how-to-configure-a-wowza-server-as-an-http-caching-origin
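For reference, these end up as plain Property entries in the application's Application.xml; a hedged sketch (the exact containers can vary by Wowza version, so double-check against the linked docs):

<!-- LiveStreamPacketizer Properties container -->
<Property>
    <Name>cupertinoCalculateChunkIDBasedOnTimecode</Name>
    <Value>true</Value>
    <Type>Boolean</Type>
</Property>

<!-- HTTPStreamer Properties container -->
<Property>
    <Name>httpOriginMode</Name>
    <Value>on</Value>
    <Type>String</Type>
</Property>
<Property>
    <Name>httpRandomizeMediaName</Name>
    <Value>false</Value>
    <Type>Boolean</Type>
</Property>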
I am trying to play an HLS stream (.m3u8) containing .ts file buffers.
It was playing fine before, when I tried some open sources. Now that I am trying to play the streams provided by my service provider, it plays in all major browsers except Chrome.
P.S.: I am using videojs to accomplish this. I also tested with viblast, but no luck there.
For reference, I am posting my code:
<script>
  // myPlayer is my <video> object for videojs
  myPlayer.src({
    // one MIME type per source; video.js expects this one for HLS
    type: "application/x-mpegURL",
    src: encodeURI(someM3u8Link) // the m3u8 link goes here
  });
  myPlayer.play();
</script>
This code rules in every browser, but when it faces Chrome, it kneels down.
The error response from Chrome is as follows:
VIDEOJS: ERROR: (CODE:3 MEDIA_ERR_DECODE) The media playback was aborted due to a corruption problem or because the media used features your browser did not support.
MediaError {code: 3, message: "The media playback was aborted due to a corruption problem or because the media used features your browser did not support."}
Note: I am getting my streams from scaleEngine.
I came across this error when using an MP4 with a WebM fallback. Everything worked well in Firefox, but I was getting this error in Chrome. I switched the order of the fallbacks so that videojs used WebM first and MP4 as the fallback. This fixed it, for me at least.
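Concretely, something like this, where the file names are placeholders; video.js's src() accepts an array and tries the sources in order:

myPlayer.src([
  { type: "video/webm", src: "movie.webm" }, // tried first
  { type: "video/mp4", src: "movie.mp4" }    // fallback
]);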
The error is: "The content type text/html of the response message does not match the content type of the binding (application/soap+xml; charset=utf-8). If using a custom encoder, be sure that the IsContentTypeSupported method is implemented properly. The first 1024 bytes of the response were: '...'" and so on.
SOAP expects XML back, not HTML; that is why you get the error.
In my experience, you get HTML instead of XML when there's an error on the camera and it sends back some kind of description of the error.
To understand what's going on, you should install Wireshark, sniff the traffic, and check what the camera sends back.
I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get the MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get it to play videos streamed from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else I am missing in implementing MPMoviePlayerController for video streaming? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether something is lacking in my code, in the media server setup, or both.
I'm pretty sure you don't need a streaming server just to play a video file from an HTTP URL. However, using HTTP Live Streaming is a requirement for applications on the App Store:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See "Requirements for Apps."
The good news is that your existing code should not have to change to handle this (it's a server-side requirement, not a client-side one).