Can I make a media server using RTSP, MPEG2-TS over RTP?

Using the RTSP protocol, can I send MPEG2-TS RTP packets?
In VLC, I can play MPEG-2 TS over RTP.
But that only supports plain RTP, not RTSP
(the requested URL is rtp://{server ip}:{port}/{path}).

In the tutorial at https://support.spinetix.com/wiki/Tutorial:Streaming_using_VLC#Transcode_into_MPEG_TS_and_stream_over_RTP.2FTS,
you should use RTSP as your source.
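For illustration, here is a command-line sketch (not from the tutorial; the input file, codecs, port, and path are placeholders) that makes VLC transcode a file, mux it into MPEG-TS, and serve it over RTP with RTSP used as the control protocol, so a player can open rtsp://{server ip}:8554/stream:

# hypothetical example: serve MPEG-TS over RTP, with RTSP for session control
# (the port 8554 and the path "stream" are placeholders)
vlc input.mp4 --sout '#transcode{vcodec=h264,acodec=mp4a}:rtp{mux=ts,sdp=rtsp://:8554/stream}'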

Related

Is there an RTSP delivery server?

I want to make an RTSP delivery server.
rtsp publisher --> [rtsp delivery server] --> rtsp player
I know that the Wowza server can do this.
But I want to build something like nginx, without Wowza.
If you have any information, please let me know.
Your subject mentions RTSP, but the contents cover only RTMP.
For RTSP to RTMP, Wowza SE is required to ingest RTSP and re-stream it as RTMP, or directly as HTML5 HLS to web pages.
https://www.wowza.com/pricing/streaming-engine
You can license Wowza SE and install it on your own dedicated server, or get a turnkey plan for lower cost and less hassle from $50/month:
https://webrtchost.com
For RTMP to RTMP, an option is Red5, a free open-source RTMP server, if you have a VPS or dedicated server.
If you don't have your own dedicated server to install it on, you can get a turnkey RTMP service. There are providers from as low as $9/month:
https://hostrtmp.com

Re-streaming RTMP to RTSP

I am having some trouble with the RTMP and RTSP protocols.
I have a GoPro action cam with an RTMP output stream, but the input of my compression algorithm needs an RTSP stream.
I found that the Wowza solution can transcode from RTMP to RTSP, but it is very expensive.
How can I convert RTMP to RTSP?
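One possible approach (not from the question; it assumes an open-source RTSP server such as MediaMTX, formerly rtsp-simple-server, listening on its default port 8554, plus placeholder addresses and stream names) is to pull the RTMP feed with ffmpeg and re-publish it over RTSP, which the compressor can then read. Since the GoPro pushes RTMP to a server, you may also need an RTMP ingest point; the RTMP URL below is a placeholder for wherever the stream can be pulled from.

# pull the RTMP stream and re-publish it over RTSP without re-encoding
# (-c copy only works if the codecs are usable over RTSP; otherwise re-encode)
ffmpeg -i rtmp://[rtmp-server-address]/live/gopro -c copy -f rtsp rtsp://localhost:8554/gopro
# the compressor then reads rtsp://localhost:8554/gopro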

How to stream RTSP video obtained using the live555 library to a Wowza server running in a local cloud

I have cross-compiled the live555 library for my Android board, and using it I am able to stream video from the board to any other device over the RTSP protocol.
I have used the "live555MediaServer" program to stream from the board to other devices. I got an RTSP URL, and using it I am able to stream.
But now I want to push this video to a Wowza server running in our local cloud. While reading the Wowza documentation I found that you can use any supported encoder to stream video to Wowza by registering the Wowza Streaming Engine in your encoder. I found the following lines:
In your encoder, enter the following information, and then click Publish or Start:
Server URL: rtsp://[wowza-ip-address]:1935/live
Stream Name: myStream
User: publisherName
password: [password]
So I want to know how I can register the Wowza IP with live555. In live555, is there any way to register this Wowza IP and stream video to the Wowza Streaming Engine?
I found an application at "live555/testProgs/registerRTSPStream". Running this application without any command-line options shows the usage below:
usage: registerRTSPStream [-t] [-u <username> <password>] <remote-client-or-proxy-server-name-or-address> <remote-client-or-proxy-server-port-number> <rtsp-URL-to-register> [proxy-URL-suffix]
So is it possible to register the Wowza server using this application, and if so, how do I do it?
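Going only by the usage string above, a hypothetical invocation would look like the lines below; the Wowza address, port, credentials, and the board's RTSP URL are placeholders, and whether Wowza Streaming Engine actually accepts the RTSP REGISTER command that this tool sends is a separate question to confirm against the Wowza documentation.

# hypothetical: ask the remote server to pull (register) the board's RTSP stream
# the last argument is the optional proxy-URL-suffix from the usage string
./registerRTSPStream -u publisherName [password] [wowza-ip-address] 1935 rtsp://[board-ip-address]:8554/stream.ts myStream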

P2P Audio stream Linux server software

I am in search of server software which can stream different audio to different clients.
For example, every client will be able to create his own playlist and the server will stream it.
Any help will be appreciated.
You can check Flash, which has support for RTMP to stream audio in real time using a client-server model, and RTMFP, which works over peer-to-peer technology. You can use RTMFP if the peer is directly reachable; otherwise use RTMP. There is an open-source Red5 media server which also has support for the RTMP protocol.

Use an IP camera with WebRTC

I want to use an IP camera with WebRTC. However, WebRTC seems to support only webcams, so I am trying to convert the IP camera's stream into a virtual webcam.
I found software like IP Camera Adapter, but it doesn't work well (2-3 frames per second and a delay of 2 seconds), and it works only on Windows; I would prefer to use Linux (if possible).
I tried ffmpeg/avconv:
First, I created a virtual device with v4l2loopback (the command was: sudo modprobe v4l2loopback). The virtual device is detected and can be fed a video (.avi) with a command like: ffmpeg -re -i testsrc.avi -f v4l2 /dev/video1
The stream from the IP camera is available at rtsp://IP/play2.sdp for a D-Link DCS-5222L camera. This stream can be captured by ffmpeg.
My problem is making the link between these two steps (receiving the RTSP stream and writing it to the virtual webcam). I tried ffmpeg -re -i rtsp://192.168.1.16/play2.sdp -f video4linux2 -input_format mjpeg -i /dev/video0 but there is an error with v4l2 (v4l2 not found).
Does anyone have an idea how to use an IP camera with WebRTC?
The short answer is no. RTSP is not mentioned in the IETF standard for WebRTC, and no browser currently has plans to support it. Link to Chrome discussion.
The longer answer is that if you are truly sold on this idea, you will have to build a WebRTC gateway/breaker utilizing the native WebRTC API:
Start a WebRTC session between your browser and your breaker.
Grab the IP camera feed with your gateway/breaker.
Encrypt and push the RTP stream from the RTSP stream gathered by the breaker to your WebRTC session through the WebRTC API.
This is how others have done it and how it will have to be done.
UPDATE 7/30/2014:
I have experimented with janus-gateway, and I believe the streaming plugin does EXACTLY this, as it can grab an RTP stream and push it to a WebRTC peer. For RTSP, you could probably create an RTSP client (possibly using a library like GStreamer), then push the RTP and RTCP from that connection to the WebRTC peer.
Janus-gateway recently added simple RTSP support (based on libcurl) to its streaming plugin since this commit.
It is then possible to configure the gateway to negotiate RTSP with the camera and relay the RTP through WebRTC by adding the following to the streaming plugin configuration <prefix>/etc/janus/janus.plugin.streaming.cfg:
[camera]
type = rtsp
id = 99
description = Dlink DCS-5222L camera
audio = no
video = yes
url = rtsp://192.168.1.16/play2.sdp
Next you will be able to access the WebRTC stream using the streaming demo page http://..../demos/streamingtest.html
I have created a simple example transforming a RTSP or HTTP video feed into a WebRTC stream. This example is based on Kurento Media Server (KMS) and requires having it installed for the example to work.
Install KMS and enjoy ...
https://github.com/lulop-k/kurento-rtsp2webrtc
UPDATE 22-09-2015.
Check this post for a technical explanation of why transcoding is just part of the solution to this problem.
If you have video4linux installed, the following command will create a virtual webcam from an rtsp stream:
gst-launch rtspsrc location=rtsp://192.168.2.18/play.sdp ! decodebin ! v4l2sink device=/dev/video1
You were on the right track; the "decodebin" was the missing link.
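For completeness, an ffmpeg equivalent of the same idea is sketched below; the camera URL and loopback device are taken from the question, while the pixel format is an assumption (many v4l2loopback consumers expect a raw format such as yuv420p).

# decode the RTSP stream and write raw frames to the v4l2loopback device
ffmpeg -i rtsp://192.168.1.16/play2.sdp -f v4l2 -pix_fmt yuv420p /dev/video1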
For those who would like to get their hands dirty with some native-WebRTC, read on...
You could try streaming an IP camera’s RTSP stream through a simple ffmpeg-webrtc wrapper: https://github.com/TekuConcept/WebRTCExamples .
It uses the VideoCaptureModule and AudioDeviceModule abstract classes to inject raw media. Under the hood, these abstract classes are extended for all platform-specific hardware like video4linux or ALSA audio.
The wrapper uses the ffmpeg CLI tools, but I don't feel it should be too difficult to use the ffmpeg C libraries themselves. (The wrapper relies on transcoding, i.e. decoding the source media and then letting WebRTC re-encode it with respect to the ICE connections' requirements. Pre-encoded media pass-through is still being worked out.)
Actually, our camera can support WebRTC. It is an IP camera that works with HTML5, using P2P transmission and two-way talk between the camera and the web browser! The delay is only 300 ms!
