I have a scenario with a centralized server running on Linux that needs to stream audio and video asynchronously to, say, 100 clients. The server receives its input from another server. The question in context is: is there a way to stream the audio independently to each of the clients?
Note: the video and audio are unique for each client.
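A minimal sketch of one way to do this, assuming a Node/Express relay where each client requests its own URL and gets a dedicated ffmpeg pipeline (the per-client source lookup and the URLs are hypothetical):

const { spawn } = require('child_process');
const express = require('express');
const app = express();

// Hypothetical lookup: maps a client ID to that client's unique input feed.
function sourceFor(clientId) {
  return `http://ingest.example/feeds/${clientId}`;
}

app.get('/audio/:clientId', (req, res) => {
  // One ffmpeg process per client, so every stream is fully independent.
  const ff = spawn('ffmpeg', [
    '-i', sourceFor(req.params.clientId), // unique input per client
    '-vn',                                // drop video, keep audio only
    '-f', 'mp3', 'pipe:1',                // encode to MP3 on stdout
  ]);
  res.set('Content-Type', 'audio/mpeg');
  ff.stdout.pipe(res);
  req.on('close', () => ff.kill('SIGKILL')); // clean up when the client leaves
});

app.listen(8080);

The cost of full independence is one ffmpeg process per client, so 100 clients means 100 concurrent encoders; whether that fits depends on the codec settings and the hardware.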
I have an interesting problem, in short: how to share information between threads in NodeJS (12+).
The tech stack, in short:
A remote/online streaming server that produces an MP4 live stream
A client application that only consumes the live view through RTSP over HTTP
A small NodeJS-based application that gets the MP4, transforms it, and pipes it back to the client.
The modules I use:
NodeJS 12+
Request/fetch/https module
Express module
Stream module
The story:
I have an application that plays a gateway/relay role between two different systems. One provides a live media stream (a simple MP4 (h264) stream) and the other is supposed to consume it as RTSP over HTTP. The weird part is that the consumer client does not behave like any other player (such as VLC or a web player): sometimes, seemingly randomly, it resends the request, and sometimes it closes the current request and resends it. So a direct pipe does not really work for this use case.
I made a worker (from worker_threads) that holds a readable stream object, and when the client hits the endpoint, I start populating the MP4 stream into the readable object inside the worker, so even if the request gets closed or resent, it will not break the live media stream consumption.
And whenever the client connects, I would just like to pipe the readable object to it.
Originally, I thought a simple pipe from request/fetch/http.get or FFMPEG would be enough, but the client may repeat the call anywhere between 3 seconds and 2 minutes apart.
So my question is: what would be the best solution to pass the data back from the worker to the main thread and let it reach the HTTP routing?
I had some ideas (a sketch of the first one follows this list):
I know I can have my own channel between the threads and pass information back and forth, but waiting for messages while keeping the process up blocks the app, as far as I know (worker.on('message', (stuff) => {});).
Using Socket.io to pass data back from the worker, populate the readable in the main thread, and pipe the readable at the HTTP level (basically a fake shared object).
Creating a secondary HTTP server that offers the media stream, then just relaying it into the response (i.e. gatewaying/proxying).
Looking up a proxy solution where I can simply redirect and reshape things, e.g. transform the input MP4 into an RTSP stream and pipe it to the consumer response.
Should I just "remember" the active stream, and as long as it is streamed by the remote server, always use the same URL, pass it to FFMPEG, and keep piping to the res?
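A minimal sketch of the first idea, assuming the worker pulls the upstream MP4 with https.get and relays raw chunks to the main thread, where they are written into a shared PassThrough that every incoming request pipes from (the upstream URL and port are illustrative):

// relay.js - single-file sketch using the isMainThread pattern
const { Worker, isMainThread, parentPort } = require('worker_threads');

if (isMainThread) {
  const express = require('express');
  const { PassThrough } = require('stream');
  const app = express();

  const live = new PassThrough();        // shared buffer between worker and HTTP layer
  const worker = new Worker(__filename);

  // Each message is one raw MP4 chunk; this handler is event-driven
  // and does not block the main thread.
  worker.on('message', (chunk) => live.write(chunk));

  app.get('/live', (req, res) => {
    res.set('Content-Type', 'video/mp4');
    live.pipe(res);                          // serve whatever arrives from now on
    req.on('close', () => live.unpipe(res)); // a close/resend never touches the source
  });

  app.listen(8080);
} else {
  const https = require('https');
  // Illustrative upstream URL - replace with the real live stream.
  https.get('https://upstream.example/live.mp4', (upstream) => {
    upstream.on('data', (chunk) => parentPort.postMessage(chunk));
  });
}

Note that worker.on('message', ...) is an ordinary event-loop callback, so it does not block the app; the concern raised in the first idea should not apply. Separately, a client that joins mid-stream may still need the MP4 initialization segment (or a fragmented MP4) to start decoding, which this sketch does not cover.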
Note:
I set up all the headers to keep the connection alive, but the client software seems to behave the same regardless.
By default it uses RTSP and RTP/TCP to consume the video stream, but it has an option for RTSP over HTTP.
I am probably overlooking some trivial step for serving RTSP video from a remote live MP4, but I have not found any good example or source anywhere (basically the same 3 articles re-shared everywhere).
I have not found any similar question or article anywhere (but I did check out "nodejs ffmpeg play video at specific time and stream it to client").
I'm trying to stream video over a websocket and process it server-side with Node.JS. The client reads from a video file (.mp4) and sends it over the websocket via a stream object. However, I'm having trouble extracting frames from the stream at the server so they can be processed by OpenCV.
Do I need to break the video up into frames and stream each individual frame? What format can OpenCV most easily process in real time?
The end goal here is to enable OpenCV to process each frame of a video (in real time) as it is received by the server. I think I'm having some trouble understanding the paradigm here.
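One way to sidestep the demuxing problem is to have the client send one encoded frame per websocket message instead of a raw .mp4 byte stream. A minimal server-side sketch, assuming JPEG-encoded frames and the opencv4nodejs binding (one community option among several; the port and response shape are illustrative):

const WebSocket = require('ws');
const cv = require('opencv4nodejs'); // community OpenCV binding for Node

const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (data) => {
    // data is a Buffer holding one JPEG-encoded frame
    const frame = cv.imdecode(data); // decode into an OpenCV Mat
    // ...run real-time processing on the Mat here...
    socket.send(JSON.stringify({ rows: frame.rows, cols: frame.cols }));
  });
});

If you send the raw .mp4 stream instead, the server has to demux and decode it (e.g. by piping it through ffmpeg) before OpenCV can see individual frames, which is why per-frame JPEG (or raw RGB) is the simpler paradigm for this use case.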
I wrote a program that receives data from a webcam, converts the data into video frames, and displays them in the client application, a JFrame in Java.
But now I want to send these frames to an RTSP server, with the purpose of live streaming to other clients (e.g. VLC).
I know this is simple when the input is a video source, but here the input is individual video frames.
So could you help me send this data to an RTSP server, or point me to a server that can handle this?
Thanks & best regards.
I wanted to clear up a few questions about websockets.
Is it possible to stream videos from server to client and client to server at the same time...something like video calling?
Can the server stream two videos to a single client at a time?
Regarding the first question, yes you can. There are already wrappers that simplify that task such as BinaryJS.
As per your second question, it would require a little extra configuration. Once a bidirectional link between client and server is established, the client treats every incoming message as part of the same stream. Multiplexing two videos in the same stream would therefore require every message to carry an extra marker that lets the client separate them (see the sketch below).
It would be a better idea to open a new connection (with the same server) to stream a second video.
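A minimal sketch of that marker approach with the ws module, where the server prepends a one-byte stream ID to every binary chunk (the two file sources and the client-side players are hypothetical stand-ins):

const fs = require('fs');
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

// Prefix each binary chunk with a 1-byte stream ID.
function sendChunk(socket, streamId, chunk) {
  socket.send(Buffer.concat([Buffer.from([streamId]), chunk]));
}

wss.on('connection', (socket) => {
  // Hypothetical sources: any two readable streams would do here.
  const videoA = fs.createReadStream('a.mp4');
  const videoB = fs.createReadStream('b.mp4');
  videoA.on('data', (chunk) => sendChunk(socket, 1, chunk));
  videoB.on('data', (chunk) => sendChunk(socket, 2, chunk));
});

// Client side: read the ID byte, then route the payload accordingly.
// socket.onmessage = ({ data }) => {
//   const bytes = new Uint8Array(data);
//   const payload = bytes.subarray(1);
//   (bytes[0] === 1 ? playerA : playerB).feed(payload); // feed() is hypothetical
// };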
So I am trying to create an RTSP server that streams music.
I do not understand how the server plays a music file and how different requests get whatever is playing at that moment.
So, to organize my questions:
1) How does the server play a music file?
2) What does the request to the server look like to get what's currently playing?
3) What does the response look like that gets the music playing in the client that requested it?
First: READ THIS (RTSP), and THEN READ THIS (SDP), and then READ THIS (RTP). Then you can ask more sensible questions.
It doesn't; the server streams small parts of the audio data to the client, telling it when each part is to be played.
There is no such request. If you want, you can have a URL for live streaming and, in the reply to the client's RTSP DESCRIBE request, tell it what is currently on.
Read the first (RTSP) document; it's all there! The answer to your question is this:
RTSP/1.0 200 OK
CSeq: 3
Session: 123456
Range: npt=now-
RTP-Info: url=trackID=1;seq=987654
But to get the music playing you will have to do a lot more to initiate a streaming session.
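For orientation, the client-side exchange that leads up to a response like the one above looks roughly like this (the URL, ports, and session ID are illustrative):

DESCRIBE rtsp://example.com/live RTSP/1.0
CSeq: 1
Accept: application/sdp

SETUP rtsp://example.com/live/trackID=1 RTSP/1.0
CSeq: 2
Transport: RTP/AVP;unicast;client_port=8000-8001

PLAY rtsp://example.com/live RTSP/1.0
CSeq: 3
Session: 123456
Range: npt=now-

The server answers each request with a 200 OK (the SETUP response assigns the Session ID), and after PLAY the audio itself arrives as RTP packets on the negotiated ports, outside the RTSP connection.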
You should first be clear about what RTSP and RTP are. The Real Time Streaming Protocol (RTSP) is a network control protocol designed for use in communication systems to control streaming media servers, whereas most RTSP servers use the Real-time Transport Protocol (RTP) for the media stream delivery itself. RTP typically uses UDP to deliver the packet stream. Try to understand these concepts first.
Then have a look at this project:
http://sourceforge.net/projects/unvedu/
This is an open-source project developed by our university, which is used to stream video (MKV) and audio files over UDP.
You can also find a .NET implementation of RTP and RTSP at https://net7mma.codeplex.com/, which includes an RTSP client and server implementation and many other useful utilities, e.g. implementations of many popular digital media container formats.
The solution has a modular design and better performance than ffmpeg or libav at the current time.