Using streamlink to stream RTSP through VLC: returning to the live broadcast after pausing - libvlc

Using streamlink to stream RTSP through libvlc. When I pause the stream and then play, it resumes from where I paused. How do I go back to the live broadcast after pausing? Right now all I can think of doing is restarting the stream.
"sout=#transcode{vcodec=theo,vb=800,scale=1,width=600,height=480,acodec=mp3}:http{mux=mp3,dst=127.0.0.1:8080/desktop.mp3}",
":no-sout-rtp-sap",
":no-sout-standard-sap",
I tried to do it, but it still continues from the paused position.
Thanks in advance for any assistance.

Related

Node.js Video Stream WEBM Live Feed to HTML

I have a Node.js server that's receiving small packets of WEBM blob binary data through socket.io from a webpage!
(navigator.mediaDevices.getUserMedia -> stream -> mediaRecorder.ondataavailable -> DATA. I'm sending that DATA back to the server, so it includes a timestamp and the binary data.)
How do I stream those back over an HTTP request as a never-ending live stream that can be consumed by an HTML webpage simply by adding the URL to a VIDEO tag?
Like this:
<video src=".../video" autoplay></video>
I want to create a live video stream and basically stream my webcam back to an HTML page, but I'm a bit lost on how to do that. Please help. Thanks
Edit: I'm using express.js to serve the app.
I'm just not sure what I need to do on the server with the incoming WEBM binary blobs to serve them properly so they can be consumed by an HTML page on an endpoint /video
Please help :)
After many failed attempts I was finally able to build what I was trying to:
Live video streaming through socket.io.
So what I was doing was:
Start getUserMedia to start the web camera
Start a MediaRecorder set to record in intervals of 100 ms
On each available chunk, emit an event through socket.io to the server with the blob converted to a base64 string
The server sends the base64-encoded 100 ms video chunk back to all connected sockets
The webpage gets the chunk and uses MediaSource and SourceBuffer to append it to the buffer
Attach the MediaSource to a video element and VOILA :) the video plays SMOOTHLY, as long as you append each chunk in order and don't skip any (in which case it stops playing); the relay and playback side is sketched below.
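For what it's worth, here is a rough TypeScript sketch of the relay and playback parts of that pipeline: the server broadcasting each chunk, and the page appending chunks to a SourceBuffer. The event name, port, and codec string are my own placeholders rather than anything from the original setup, so treat it as an illustration of the approach, not the actual code.

// --- server (Node + socket.io): relay each incoming chunk to every connected socket ---
import { Server } from "socket.io";

const io = new Server(3000, { cors: { origin: "*" } });

io.on("connection", (socket) => {
  socket.on("video-chunk", (chunkBase64: string) => {
    io.emit("video-chunk", chunkBase64);   // fan the 100 ms chunk out to all viewers
  });
});

// --- viewer page: append incoming chunks, in order, to a SourceBuffer on a <video> element ---
import { io as connect } from "socket.io-client";

const socket = connect("http://localhost:3000");
const video = document.querySelector("video")!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

// Must match what the MediaRecorder produced on the sending page.
const MIME = 'video/webm; codecs="vp8,opus"';

mediaSource.addEventListener("sourceopen", () => {
  const sourceBuffer = mediaSource.addSourceBuffer(MIME);
  const queue: Uint8Array[] = [];

  const appendNext = () => {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift()!);
    }
  };
  sourceBuffer.addEventListener("updateend", appendNext);

  socket.on("video-chunk", (chunkBase64: string) => {
    // Decode the base64 string back into bytes; skipping or reordering chunks breaks playback.
    queue.push(Uint8Array.from(atob(chunkBase64), (c) => c.charCodeAt(0)));
    appendNext();
  });
});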
And IT WORKED! BUT it was unusable... :(
The problem is that the MediaRecorder process is CPU-intensive: the page's CPU usage was jumping to 15%, and the whole thing was TOO SLOW.
There was 2.5 seconds of latency on the video stream passing through socket.io, and virtually the same EVEN if I DIDN'T send the blobs through socket.io but rendered them on the same page.
So I found out this works, but DOESN'T work for a sustainable video chat service. It's just not designed for that. For recording webcam video to play back later, MediaRecorder can work, but not for live streaming.
I guess for live streaming there's no way around WebRTC: you MUST use WebRTC to send the video stream either to a peer or to a server that forwards it to other peers. DO NOT TRY to build a live video chat service with MediaRecorder. You're only going to waste your time. I did that for you :) so you don't have to. Just look into WebRTC. You may have to use a TURN server. Twilio provides STUN and TURN servers, but it costs money. You can run your own TURN server with Coturn and other services, but I'm yet to look into that.
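For anyone taking that advice, here is the barest-bones TypeScript sketch of the WebRTC sending side, just to show that the camera tracks go straight into an RTCPeerConnection with no MediaRecorder involved. Signaling is omitted entirely, and the STUN URL is just a commonly used public one; this is an illustration, not a working chat service.

const pc = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.l.google.com:19302" }] });

navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then(async (stream) => {
  // Send the raw tracks directly; the browser handles encoding and congestion control.
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // offer.sdp now has to reach the remote peer (or an SFU) over your own signaling channel,
  // and the remote answer and ICE candidates have to come back the same way.
});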
Thanks. Hope that helps someone.

How to stream multiple audios playing at the same time

I want to use Node.js to stream multiple audio tracks that auto-play in the background. Meanwhile, someone could connect to the stream and listen to it. The scenario is like a radio station.
I guess I could use WebSockets to broadcast to the clients, but how do I pass the playing audio data, or is there any module I can use?
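No answer was posted, but here is a minimal TypeScript/Node.js sketch of the usual radio-station pattern, using a plain HTTP audio endpoint instead of WebSockets: the server "plays" a playlist by reading each file and pushing the chunks to every connected listener. The port, file names, and lack of bitrate throttling are all simplifications on my part.

import * as http from "http";
import * as fs from "fs";

const listeners = new Set<http.ServerResponse>();

// Every client that connects joins the broadcast already in progress.
http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "audio/mpeg" });
  listeners.add(res);
  req.on("close", () => listeners.delete(res));
}).listen(8000);

// "Play" a track server-side by pushing each chunk of it to all connected listeners.
function broadcast(trackPath: string, onEnd: () => void) {
  const stream = fs.createReadStream(trackPath);
  stream.on("data", (chunk) => {
    for (const res of listeners) res.write(chunk);
  });
  stream.on("end", onEnd);
}

// Loop a playlist forever so the station keeps playing in the background.
const playlist = ["track1.mp3", "track2.mp3"];
let i = 0;
const playNext = () => broadcast(playlist[i++ % playlist.length], playNext);
playNext();

Note that this pushes data as fast as the files can be read; a real station throttles output to the audio bitrate (or uses something like Icecast) so every listener hears roughly the same moment of the stream.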

How to pause and play a J2ME audio player streaming with RTP

Here is how I try to play audio streamed from a server over RTP.
try {
    String url = "rtp://...";
    Player p = Manager.createPlayer(url);
    p.realize();   // acquire the information needed to play the stream
    VideoControl video = (VideoControl) p.getControl("VideoControl");
    Item itm = (Item) video.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, null);
    midlet.form.append(itm);
    p.start();     // begin streaming and playback
} catch (Exception e) {...}
I tried HTTP and it worked well. With HTTP, all the media content downloads first and then we can play, pause, and play again; that is OK. Now I want to play audio over RTP, and I want to know how to pause the player (so data stops downloading, while keeping a record of where the media was paused) and then play again when the user wants to (resuming the download from the last point downloaded, not from the beginning).
As far as I know, mobile phones cannot keep a session with the server; they just send a request and get a response, with no session management. Maybe I am wrong.
Anyway, how can I pause (and stop downloading) and then play again (resuming the download from the point where it stopped) audio in a J2ME application? Please let me know if anyone knows a good solution or has sample code.
When streaming, the entire media is NOT downloaded at once; rather, portions of the media are downloaded as playback progresses. When a player is created with an rtp/rtsp locator, calling player.stop() is enough to stop playback and any further download of the media. Likewise, player.start() is enough to resume playback from where it was stopped.
Bear in mind that, since the media is not local, if a stream is not resumed after a while the streaming server may consider it to have timed out, and you would need to re-establish the stream.
In short, player.stop() and player.start() are enough.

mp3 http streaming: recording and playing simultaneously

I have a server (Linux) program that generates audio files (mp3). What I need is to broadcast these files as an HTTP stream. The tricky part is that the broadcast starts before the file to be transmitted is fully generated.
I tried to do this using mpd+mpc, but once I use the "mpc play" command, only the already-existing part of the file is buffered and transmitted, and the player disregards the part that appears after playback begins.
Is there any way to send an mp3 HTTP stream (using mpd or any other server-side player) so that the player won't stop playback when it reaches the end of the part that was buffered initially?
Any ideas, please.
http://streamripper.sourceforge.net/ can record and broadcast the same stream
SHOUTcast (or Icecast, I don't remember which) was designed especially for this, and can re-encode your stream on the fly
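Just to illustrate the tricky part of the question (serving an mp3 that is still being written), here is a TypeScript/Node.js sketch that keeps the HTTP response open and periodically sends whatever new bytes have appeared in the file. The path, port, and poll interval are placeholders, and this is not how streamripper or Icecast work internally; the tools mentioned above remain the more robust choice.

import * as http from "http";
import * as fs from "fs";

const FILE = "/tmp/live.mp3";   // the file the encoder is still appending to (placeholder path)
const POLL_MS = 500;

http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "audio/mpeg" });
  let offset = 0;
  let busy = false;

  // Periodically stream any bytes written since the last read, keeping the response open.
  const timer = setInterval(() => {
    if (busy) return;
    fs.stat(FILE, (err, stat) => {
      if (err || stat.size <= offset) return;   // nothing new yet
      busy = true;
      const slice = fs.createReadStream(FILE, { start: offset, end: stat.size - 1 });
      offset = stat.size;
      slice.pipe(res, { end: false });          // don't close the response when this slice ends
      slice.on("end", () => { busy = false; });
    });
  }, POLL_MS);

  req.on("close", () => clearInterval(timer));
}).listen(8080);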

WP7 audio stream problem

I'm using MediaElement to play an audio mp3 stream and everything goes OK, but now I have an mp3 stream whose URL does not end with .mp3 (http://server2.fmstreams.com:8011/spin103), and I'm getting AG_E_NETWORK_ERROR. I found a suggestion to add ?ext=mp3, but it didn't work for me. Any ideas?
If you are streaming live radio, the stream may be encoded by an Icecast or SHOUTcast server. To read these streams, you will need to decode the stream in memory and pass it to the MediaElement once it has been decoded.
Have a look at Mp3MediaStreamSource: http://archive.msdn.microsoft.com/ManagedMediaHelpers
and "Audio output from Silverlight".
I lost tons of time on this, and this is the best solution I found so far.
Also make sure that while you are testing, the device is unplugged from the computer.
