How to send data (input stream type, not a video source) to an RTSP server - rtsp

I wrote a program that receives data from a webcam, converts the data into video frames, and displays them in my client application (a JFrame in Java).
Now I want to send these frames to an RTSP server in order to live-stream them to other clients (such as VLC).
I know this is simple when the input is a video source, but in my case the input is a sequence of video frames.
Could you help me send this data to an RTSP server, or point me to a server that can handle this?
Thanks & Best Regards.
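One common way to stream individual frames (rather than a capture device) is to pipe the raw frame bytes into ffmpeg and let ffmpeg publish the stream to an RTSP server such as MediaMTX. A minimal sketch of the command construction, assuming ffmpeg is installed and a server is listening; the frame size, pixel format, and URL are illustrative placeholders:

```python
# Sketch: publish raw video frames to an RTSP server by piping them into
# ffmpeg. Assumes ffmpeg is on PATH and an RTSP server (e.g. MediaMTX) is
# listening at the placeholder URL; all parameters here are illustrative.
import subprocess

def build_ffmpeg_cmd(width, height, fps, url):
    """Argument list for ffmpeg: read raw BGR frames on stdin, H.264-encode
    them, and publish the result over RTSP."""
    return [
        "ffmpeg",
        "-f", "rawvideo",           # input is headerless raw frames
        "-pix_fmt", "bgr24",        # 3 bytes/pixel, common for webcam captures
        "-s", f"{width}x{height}",  # geometry must be declared for rawvideo
        "-r", str(fps),
        "-i", "-",                  # read the frames from stdin
        "-c:v", "libx264",
        "-preset", "ultrafast",     # favor latency over compression
        "-tune", "zerolatency",
        "-f", "rtsp",               # publish via the RTSP output muxer
        url,
    ]

# Usage (not run here): spawn ffmpeg and write each frame's bytes to stdin.
#   proc = subprocess.Popen(build_ffmpeg_cmd(640, 480, 25,
#                           "rtsp://localhost:8554/live"),
#                           stdin=subprocess.PIPE)
#   proc.stdin.write(frame_bytes)   # one call per captured frame
```

The same pipe technique works from Java: start ffmpeg with ProcessBuilder and write each frame to the process's OutputStream.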

Related

Accessing live video stream midway using websockets

I am using a combination of fragmented MP4 and websockets to deliver a live video stream to the web browser, where MSE takes over.
I have successfully fragmented it into the appropriate fMP4 format using ffmpeg and have checked the data using an mpeg4parser tool.
Utilising a websocket server, the incoming data is broadcast to all the browser clients connected via websocket. This works fine for both playback and live streaming (using an RTSP stream as the input).
The problem I am facing occurs when a client tries to access the stream midway, i.e., once the ffmpeg stream has started. I have saved the init segment (ftyp + moov) elements in a queue buffer in the websocket server. This queue buffer sends this data to each new client on connection.
I believe this data is sent correctly since the browser console does not throw the 'Media Source Element not found' error. Yet no video is streamed when it receives the broadcasted moof/mdat pairs.
So a couple of questions I would like the answer to are:
1) I have observed that each moof element contains a sequence number in its mfhd child element. Does this always have to start from 1, which will naturally not be the case for a video stream accessed midway?
2) Is it possible to view the data in the browser client.js? At present all I can see is that my mediaBuffer contains a bunch of [Object ArrayBuffer]. Can I print the binary data inside these buffers?
3) From the server side, the data seems to be sent in moof/mdat fragments, since each new chunk arriving from the ffmpeg output at the websocket server begins with a moof element. I noticed this by printing the binary data in the console. Is there a similar way to view this data on the client side?
4) Does anyone have an idea of why this is happening? Some fragmented mp4 or ISO BMFF format detail that I am missing.
If any further detail is required for clarification please let me know, I will provide it.
Make sure your fragments include a base media decode time. Then set the video tag 'currentTime' to the time of the first fragment received.
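The answer above can be sketched: in ISO BMFF, the base media decode time lives in the tfdt box inside each fragment's moof → traf. A minimal Python parser, assuming well-formed boxes (layout per ISO/IEC 14496-12); on the client you would then set video.currentTime to baseMediaDecodeTime / timescale, where the timescale comes from the mdhd box in the init segment.

```python
# Sketch: locate the tfdt (track fragment base media decode time) box inside
# a moof fragment and extract baseMediaDecodeTime. Box layout follows ISO
# BMFF (ISO/IEC 14496-12): 4-byte big-endian size, then 4-byte type.
import struct

def find_box(data, box_type, start, end):
    """Return (payload_offset, payload_end) of the first box of box_type
    among the sibling boxes in data[start:end], or None."""
    pos = start
    while pos + 8 <= end:
        size, btype = struct.unpack_from(">I4s", data, pos)
        if size < 8:                 # malformed; bail out of the sketch
            return None
        if btype == box_type:
            return pos + 8, pos + size
        pos += size
    return None

def base_media_decode_time(moof):
    """Walk moof -> traf -> tfdt and decode baseMediaDecodeTime."""
    traf = find_box(moof, b"traf", 8, len(moof))    # skip the moof header
    if traf is None:
        return None
    tfdt = find_box(moof, b"tfdt", traf[0], traf[1])
    if tfdt is None:
        return None
    payload = moof[tfdt[0]:tfdt[1]]
    version = payload[0]                            # fullbox version byte
    if version == 1:
        return struct.unpack(">Q", payload[4:12])[0]  # 64-bit decode time
    return struct.unpack(">I", payload[4:8])[0]       # 32-bit decode time
```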

Converting camera input to an RTMP stream for a live streaming function within an existing web application

I need to be able to take the input video from a client-side camera and convert it to an RTMP stream to be viewed live by other clients, as part of a website application I am building. What libraries exist that could help facilitate this?

Save RTSP video frames to a file

I am able to play an RTSP stream on a web page using the live555 server, but I need to extract the frames from the RTSP stream and store them in a file.
Can anyone guide me how to do this?
If you are already familiar with live555, try the simple RTSP client application that comes with the live555 package. The application is called openRTSP and is located in the testProgs folder. It can read an input RTSP stream and save it to a file.
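For example, invoking openRTSP from Python (the URL is a placeholder; "-4" writes an MP4 to stdout and "-d" limits the recording duration in seconds in the openRTSP builds I know of, but check your build's usage text):

```python
# Sketch: drive live555's openRTSP to save an RTSP stream to a file.
# The URL below is a placeholder; "-4" (MP4 on stdout) and "-d" (duration
# in seconds) are openRTSP options, to be verified against your build.
import subprocess

def build_openrtsp_cmd(url, seconds):
    """Command line that records `seconds` of the stream as MP4 on stdout."""
    return ["openRTSP", "-4", "-d", str(seconds), url]

# Usage (not run here): redirect stdout to the output file.
#   with open("capture.mp4", "wb") as out:
#       subprocess.run(build_openrtsp_cmd("rtsp://example.com/stream", 30),
#                      stdout=out, check=True)
```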

Rendering Audio Stream from RTSP server

I have an RTSP server which is re-streaming the A/V stream from the camera to clients.
On the client side we are using Media Foundation (MF) to play the stream.
I can successfully play the video but am not able to play the audio from the server. However, when I use VLC to play it, it can play both A/V.
Currently I am implementing IMFMediaStream and have created my customized media stream. I have also created a separate IMFStreamDescriptor for audio and added all the required attributes. When I run it, everything goes fine but my RequestSample method never gets called.
Please let me know if I am doing it wrong or if there is any other way to play the audio in MF.
Thanks,
Prateek
Media Foundation support for RTSP is limited to a small number of payload formats. VLC supports more (AFAIR through the Live555 library). Most likely, your payload is not supported in Media Foundation.

how to create an RTSP streaming server

So I am trying to create an RTSP server that streams music.
I do not understand how the server plays a music file and how different requests get whatever is playing at that time.
So, to organize my questions:
1) How does the server play a music file?
2) What does the request to the server look like to get what's currently playing?
3) What does the response look like to get the music playing in the client that requested it?
First: READ THIS (RTSP), and THEN READ THIS (SDP), and then READ THIS (RTP). Then you can ask more sensible questions.
It doesn't; the server streams small parts of the audio data to the client, telling it when each part is to be played.
There is no such request. If you want, you can have a URL for live streaming and, in the reply to the RTSP DESCRIBE request, tell the client what is currently on.
Read the first (RTSP) document; it is all there! The answer to your question is this:
RTSP/1.0 200 OK
CSeq: 3
Session: 123456
Range: npt=now-
RTP-Info: url=trackID=1;seq=987654
But to get the music playing you will have to do a lot more to initiate a streaming session.
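To illustrate the request side of that session (the response above shows the other half), here is a sketch of the DESCRIBE/SETUP/PLAY sequence a client sends; the URL, track id, client ports, and session id are placeholders:

```python
# Sketch: the shape of the RTSP/1.0 requests a client sends before any RTP
# flows. RTSP is text-based, much like HTTP; every request carries a CSeq.
# The URL, track id, ports, and session id below are illustrative.

def rtsp_request(method, url, cseq, headers=None):
    """Format one RTSP/1.0 request with CRLF line endings."""
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]
    for name, value in (headers or {}).items():
        lines.append(f"{name}: {value}")
    return "\r\n".join(lines) + "\r\n\r\n"

SERVER_URL = "rtsp://example.com/audio"

# The usual handshake, one CSeq per request:
describe = rtsp_request("DESCRIBE", SERVER_URL, 1,
                        {"Accept": "application/sdp"})
setup    = rtsp_request("SETUP", SERVER_URL + "/trackID=1", 2,
                        {"Transport": "RTP/AVP;unicast;client_port=8000-8001"})
play     = rtsp_request("PLAY", SERVER_URL, 3,
                        {"Session": "123456", "Range": "npt=now-"})
```

The server answers DESCRIBE with an SDP body describing the tracks, SETUP with the negotiated transport and a Session id, and PLAY with a response like the one quoted above; only then does RTP audio start flowing.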
You should first be clear about what RTSP and RTP are. The Real Time Streaming Protocol (RTSP) is a network control protocol designed for use in communications systems to control streaming media servers, whereas most RTSP servers use the Real-time Transport Protocol (RTP) for the media stream delivery itself. RTP typically uses UDP to deliver the packet stream. Try to understand these concepts first.
Then have a look at this project:
http://sourceforge.net/projects/unvedu/
This is an open-source project developed by our university, which is used to stream video (MKV) and audio files over UDP.
You can also find a .NET implementation of RTP and RTSP here: https://net7mma.codeplex.com/ which includes an RTSP client and server implementation and many other useful utilities, e.g. implementations of many popular digital media container formats.
The solution has a modular design and, at the time of writing, better performance than ffmpeg or libav.
