I stream video to my users on the LAN through Flash Media Server 4.5, and my users can view these videos over HTTP live streaming (http://192.168.1.10/live-video1.f4m).
Now I need to save and archive these videos after they are unpublished, so that users can play them back on demand.
Please advise me.
Many thanks
.f4m suggests that you are using HTTP Dynamic Streaming (HDS) rather than HTTP Live Streaming (HLS). The most popular way to download HDS streams is to use this script: https://github.com/K-S-V/Scripts/wiki
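A typical invocation, going by that script's wiki (the --manifest flag is the one documented there; the manifest URL below is the one from your question), looks something like this:

php AdobeHDS.php --manifest "http://192.168.1.10/live-video1.f4m"

It downloads the stream fragments and joins them into a single file, which you could then publish as an on-demand asset.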
I don't know how to get started with this.
What I am trying to do is get a video + audio stream from the front end and host the live stream as MP4 that's accessible in a browser.
I was able to find information on WebRTC, socket.io and RTMP, but I'm not really sure which tool to use or what's best suited for something like this.
Also a follow-up question: my front end is an iOS app, so in what format would I send the live stream to the server?
It depends on which live streaming protocol you want the player to play; as @Brad said, HLS is the most common protocol for players.
Note: besides HLS, a native iOS app can use fijkplayer or FFmpeg to play almost any live streaming format, like HLS, RTMP or HTTP-FLV, even MKV. However, the most straightforward solution is HLS: you only need a <video> tag to play MP4 or HLS. MSE is also an option, using flv.js/hls.js to play live streams on iOS/Android/PC; this post is about these protocols. A sketch of the hls.js route is below.
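As a rough sketch of those two playback paths (the stream URL is a placeholder, and the CDN link is just one way to load hls.js):

<video id="player"></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  var video = document.getElementById('player');
  if (Hls.isSupported()) {
    // MSE path: hls.js fetches the playlist and feeds segments to the <video> element
    var hls = new Hls();
    hls.loadSource('https://example.com/live/stream.m3u8'); // placeholder URL
    hls.attachMedia(video);
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari/iOS plays HLS natively, no MSE needed
    video.src = 'https://example.com/live/stream.m3u8';
  }
</script>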
The stream flow is like this:
FFmpeg/OBS ---RTMP---->--+
                         +--> Media Server ---> HLS/HTTP-FLV ---> Player
Browser ----WebRTC---->--+
The protocol you push to the media server, or receive in a node server, depends on your encoder: RTMP or H5 (WebRTC):
For RTMP, you could use FFmpeg or OBS to push the stream to your media server; see the example command after this list.
If you want to push the stream from an H5 page, the only way is to use WebRTC.
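For example, a minimal FFmpeg push (the RTMP application and stream names here are placeholders; adjust them to your media server's configuration):

ffmpeg -re -i input.mp4 -c:v libx264 -c:a aac -f flv rtmp://your-server/live/livestream

The -re flag makes FFmpeg read the input at its native frame rate, which is what you want when simulating a live feed from a file.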
The media server converts the protocol from publisher to player; live streaming uses several different protocols right now (as of 2022.01), please read more in this post.
I have a Node.js server that's receiving small packets of WebM blob binary data through socket.io from a webpage!
(navigator.mediaDevices.getUserMedia -> stream -> mediaRecorder.ondataavailable -> DATA. I'm sending that DATA back to the server, so it includes a timestamp and the binary data. A sketch of the capture side is below.)
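For reference, the capture side looks roughly like this (a sketch; 'chunk' is just the event name I picked):

const socket = io(); // socket.io client, already connected to the server

navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then((stream) => {
    // Record the camera as WebM and hand back a Blob every 100 ms
    const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
    recorder.ondataavailable = (event) => {
      if (event.data.size > 0) {
        socket.emit('chunk', { timestamp: Date.now(), data: event.data });
      }
    };
    recorder.start(100); // timeslice in milliseconds
  });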
How do I stream those back on an HTTP request as a never-ending live stream that can be consumed by an HTML webpage simply by adding the URL to the VIDEO tag?
Like this:
<video src=".../video" autoplay></video>
I want to create a live video stream and basically stream my webcam back to an HTML page, but I'm a bit lost on how to do that. Please help. Thanks
Edit: I'm using express.js to serve the app.
I'm just not sure what I need to do on the server with the incoming WebM binary blobs to serve them properly, so they can be consumed by an HTML page on an endpoint /video.
Please help :)
After many failed attempts I was finally able to build what I was trying to:
Live video streaming through socket.io.
So what I was doing was:
Start getUserMedia to start the web camera
Start a MediaRecorder set to record in intervals of 100 ms
On each available chunk, emit an event through socket.io to the server with the blob converted to a base64 string
The server sends the base64-converted 100 ms video chunk back to all connected sockets
The webpage gets the chunk and uses MediaSource and SourceBuffer to append the chunk to the buffer
Attach the media source to a video element and VOILA :) the video plays SMOOTHLY, as long as you append each chunk in order and don't skip chunks (in which case it stops playing). A sketch of this pipeline is below.
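Here is a sketch of that pipeline (the 'chunk' event name and the codec string are my assumptions; MediaRecorder's default WebM codecs vary by browser):

// Server side (socket.io): relay each chunk to all connected clients
io.on('connection', (socket) => {
  socket.on('chunk', (base64) => io.emit('chunk', base64));
});

// Client side: feed the chunks into a MediaSource attached to a <video> element
const socket = io(); // socket.io client
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  const queue = []; // appendBuffer is async, so queue chunks while it is busy

  socket.on('chunk', (base64) => {
    const bytes = Uint8Array.from(atob(base64), (c) => c.charCodeAt(0));
    queue.push(bytes);
    if (!sourceBuffer.updating) sourceBuffer.appendBuffer(queue.shift());
  });

  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length && !sourceBuffer.updating) sourceBuffer.appendBuffer(queue.shift());
  });
});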
And IT WORKED! BUT it was unusable.. :(
The problem is that the MediaRecorder process is CPU intensive; the page's CPU usage was jumping to 15%, and the whole process was TOO SLOW.
There was 2.5 seconds of latency on the video stream passing through socket.io, and virtually the same EVEN if I DIDN'T send the blobs through socket.io but rendered them on the same page.
Sooo I found out this works but DOESN'T work for a sustainable video chat service. It's just not designed for that. For recording a webcam video to play back later, MediaRecorder can work, but not for live streaming.
I guess for live streaming there's no way around WebRTC; you MUST use WebRTC to send the video stream either to a peer, or to a server that sends it to other peers. DO NOT TRY to build a live video chat service with MediaRecorder. You're only gonna waste your time. I did that for you :) so you don't have to. Just look into WebRTC. You may have to use a TURN server. Twilio provides STUN and TURN servers, but it costs money. BUT you can run your own TURN server with coturn and other services; I'm yet to look into that.
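For what it's worth, pointing WebRTC at your own TURN server is just configuration; the coturn host and credentials below are made up:

const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },  // public STUN server
    {
      urls: 'turn:turn.example.com:3478',      // hypothetical coturn instance
      username: 'demo',
      credential: 'secret'
    }
  ]
});

All the signaling, offer/answer and track handling still has to happen on top of this, but the TURN part really is just these few lines.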
Thanks. Hope that helps someone.
Good day! I'm a newbie at video streaming. Can you help me find good ways to make video streaming secure?
I'm having some security issues with my video hosting project.
I am creating a web page which calls a video stream hosted on a different server from the one where my web page is deployed.
Server 1 (web page with video embed) calls the video to stream from Server 2 (video host).
The problem is that they are hosted on completely different networks. Should Server 2, where the video is hosted, be private and only allow Server 1 to fetch the video stream, creating a server-to-server transfer of data, or should it be public so that clients can access it directly?
Can you help me decide what to do to secure my videos?
I badly need some ideas on this... thanks guys!
How are you streaming and what streaming protocol are you using?
Server-to-server won't help in securing the video. It is better to stream the video directly from your Server 2 (video host) to the client, so that there is no extra overhead for Server 1 (web page with video embed). You need a secure way to protect your video on Server 2; if Server 2 is not secure, streaming through Server 1 won't help either.
Here are details of the security levels of different kinds of video streaming.
1. Progressive download. This can be done using the normal HTTP protocol. In this approach the video URL is visible in the browser, and once you have the URL you can download the video like a normal file. Security is very low here: even if you sign the video URL, the user can still download the video easily.
2. Streaming. You can stream the video using a protocol like RTMP. In this approach you can't download the video directly, but good capture software can still record the video stream and save it to the PC.
3. Secure streaming. There are protocols like RTMPE (I have only tried RTMPE). With this protocol the streaming content is encrypted on the server and decrypted on the client, so capture software can't grab the video stream.
Along with approach 3, signing the video URL adds more security; a sketch of what URL signing looks like is below. Hope this helps.
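To illustrate what 'signing the video URL' means, here is a minimal sketch in Node.js (the parameter names, secret and expiry scheme are all my own choices, not a standard):

const crypto = require('crypto');

// Issue a URL that stops working after ttlSeconds. The server re-computes
// the HMAC on each request and rejects expired or tampered links.
function signUrl(path, secret, ttlSeconds) {
  const expires = Math.floor(Date.now() / 1000) + ttlSeconds;
  const token = crypto.createHmac('sha256', secret)
                      .update(path + '|' + expires)
                      .digest('hex');
  return path + '?expires=' + expires + '&token=' + token;
}

// e.g. signUrl('/videos/lecture1.mp4', 'my-secret', 300)
// -> /videos/lecture1.mp4?expires=...&token=...

CDNs and media servers usually ship their own built-in variant of this (often called signed or tokenized URLs).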
I'm working on a system where we want to show a video stream from a video capture card in a browser. The browser connects to a remote server and fetches an HTML page that has video in it. This video should be streamed from the client machine where the video capture card is connected.
On the client side we run Linux, and the capture card is registered as /dev/video0 by Video4Linux2. The browser on the client side is Chrome (chromium-browser). On the client side we also have a web server (lighttpd) that could be used for streaming.
I have looked into the getUserMedia API, but support for it seems to be poor right now. Other thoughts I have had are to use the local web server, or to set up a streaming server on the client side that streams the video source locally.
Any ideas on how to design this would be great input for me!
Thanks,
/Peter
Since Chrome does not yet support RT(S)P streaming for the <video> tag, you will have to use a plugin for this.
Given its availability, I would suggest using Flash to write a simple SWF which finds the correct video source and displays it.
If needed you can use one of the many 'Recording Apps' available and strip out the recording part.
So I am trying to create an RTSP server that streams music.
I do not understand how the server plays the music, and how different requests get whatever is playing at that time.
So, to organize my questions:
1) How does the server play a music file?
2) What does the request to the server look like to get what's currently playing?
3) What does the response look like that gets the music playing in the client that requested it?
First: READ THIS (RTSP), and THEN READ THIS (SDP), and then READ THIS (RTP). Then you can ask more sensible questions.
It doesn't; the server streams little parts of the audio data to the client, telling it when each part is to be played.
There is no such request. If you want, you can have a URL for live streaming, and in the reply to the client's RTSP DESCRIBE request, tell it what is currently on. An example exchange is sketched below.
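A DESCRIBE exchange for a live URL could look roughly like this (the URL and the SDP fields are illustrative, not from a real server):

DESCRIBE rtsp://example.com/live/radio RTSP/1.0
CSeq: 2
Accept: application/sdp

RTSP/1.0 200 OK
CSeq: 2
Content-Type: application/sdp
Content-Length: 104

v=0
o=- 0 0 IN IP4 example.com
s=Now Playing: Some Song
m=audio 0 RTP/AVP 96
a=rtpmap:96 MPA/90000

The session name (s=) is one place where you can tell the client what is currently playing.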
Read the first (RTSP) document, it is all there! The answer to your question is this:
RTSP/1.0 200 OK
CSeq: 3
Session: 123456
Range: npt=now-
RTP-Info: url=trackID=1;seq=987654
But to get the music playing you will have to do a lot more to initiate a streaming session.
You should first be clear about what RTSP and RTP are. The Real Time Streaming Protocol (RTSP) is a network control protocol designed for use in communications systems to control streaming media servers, whereas most RTSP servers use the Real-time Transport Protocol (RTP) for the actual media stream delivery. RTP uses UDP to deliver the packet stream. Try to understand these concepts first.
Then have a look at this project:
http://sourceforge.net/projects/unvedu/
This is an open source project developed by our university, which is used to stream video (MKV) and audio files over UDP.
You can also find a .NET implementation of RTP and RTSP at https://net7mma.codeplex.com/ which includes an RTSP client and server implementation, and many other useful utilities, e.g. implementations of many popular digital media container formats.
The solution has a modular design and, at the time of writing, better performance than ffmpeg or libav.