The Janus server is able to replay the RTP stream.
Is there a way to play an RTP stream directly in an HTML5 <video> element? (I don't really understand the difference between RTP and RTSP.)
And how can I play the RTP stream: should I transcode it to something like HLS?
You don't; playing raw RTP isn't supported in HTML5. I'd recommend transcoding it to DASH and/or HLS, using either open-source tools like FFmpeg or commercial solutions like Bitmovin.
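For the playback side, here is a minimal sketch of how the resulting HLS could then be played in a browser, assuming the open-source hls.js library and a placeholder playlist URL produced by whatever packager you use:

```typescript
// Minimal HLS playback sketch using the open-source hls.js library
// (https://github.com/video-dev/hls.js). The playlist URL is a
// placeholder for wherever your ffmpeg/packager output is served.
import Hls from "hls.js";

const video = document.querySelector<HTMLVideoElement>("video")!;
const playlistUrl = "/live/stream.m3u8"; // placeholder packager output

if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(playlistUrl); // fetch and parse the .m3u8 playlist
  hls.attachMedia(video);      // feed segments to the element via MSE
}
```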
WebRTC is supported in HTML5, so you can view the video in the browser.
The Janus server has a streaming plugin for RTSP/RTP, which will receive data over RTSP/RTP and then send that data to the web browser client using WebRTC.
https://janus.conf.meetecho.com/docs/streaming.html
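As a rough illustration of the browser side, here is a hedged sketch using the janus.js client library, based on the callback style used in the official streaming demos (pre-1.x API); the server URL and mountpoint id are placeholders for your own deployment:

```typescript
// Hedged sketch of watching a Janus streaming-plugin mountpoint with
// janus.js (pre-1.x "onremotestream" API). Server URL and mountpoint
// id are placeholders.
declare const Janus: any; // janus.js attaches itself as a global

Janus.init({
  callback: () => {
    const janus = new Janus({
      server: "wss://your-janus-host:8989/", // placeholder
      success: () => {
        let handle: any;
        janus.attach({
          plugin: "janus.plugin.streaming",
          success: (h: any) => {
            handle = h;
            // Ask to watch the mountpoint with id 1 (placeholder).
            handle.send({ message: { request: "watch", id: 1 } });
          },
          onmessage: (_msg: any, jsep: any) => {
            if (jsep) {
              // Answer the plugin's SDP offer; we only receive media.
              handle.createAnswer({
                jsep,
                media: { audioSend: false, videoSend: false },
                success: (answer: any) =>
                  handle.send({ message: { request: "start" }, jsep: answer }),
              });
            }
          },
          onremotestream: (stream: MediaStream) => {
            // Render the incoming WebRTC stream in a <video> element.
            const video = document.querySelector("video")!;
            Janus.attachMediaStream(video, stream);
          },
        });
      },
    });
  },
});
```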
I want to play RTSP streams from IP video cameras (MP4, H.264) on my intranet web page; I use React. I have 12 cameras and an NVR.
I have not found a way to do this without an intermediate server (WebRTC is not suitable for me) that spends resources transcoding the H.264 stream to MJPEG.
If I set a high resolution and quality for the stream, a lot of resources are spent on transcoding, and, most importantly, streaming the MJPEG images takes a lot of traffic.
Is there a way or solution to stream from the IP camera directly to the web page, so that decoding happens on the user's web browser side?
This would free the intermediate server from a heavy load for big streams.
Playback also needs to work on mobile phones.
Thanks for the answer.
There is no way to stream an RTSP camera's H.264 video directly to a web browser.
But most cameras can also output still JPEG images, so you can create a web page that displays a fresh image from the camera every 200 ms or so.
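A naive sketch of that polling approach, assuming a hypothetical HTTP snapshot endpoint (check your camera's documentation for the real one):

```typescript
// Naive "MJPEG by polling" sketch: refresh a snapshot image every 200 ms.
// The snapshot URL is a placeholder; many cameras expose something like
// /snapshot.jpg over HTTP.
const img = document.querySelector<HTMLImageElement>("#camera")!;
const snapshotUrl = "http://camera.local/snapshot.jpg"; // placeholder

setInterval(() => {
  // Cache-busting query parameter so the browser refetches the image.
  img.src = `${snapshotUrl}?t=${Date.now()}`;
}, 200);
```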
If you are not happy with the above solution, you must use a media server in between that pulls the RTSP stream from the camera and converts it to a protocol the browser understands. You are mistaken about one thing: no video transcoding is involved. I don't know why WebRTC is not an option for you, but most media servers will offer four types of output:
Low latency:
WebRTC
WebSockets to MSE (see the sketch after this answer)
High latency:
HLS
MPEG-DASH
None of these methods require transcoding your original H.264 video as encoded by the RTSP camera/NVR. Some media servers you can use: Unreal Media Server, Wowza, Janus.
Live demo: http://www.umediaserver.net/umediaserver/demos.html
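Here is a hedged sketch of the "WebSockets to MSE" option from the list above. It assumes a media server that pushes fragmented MP4 (an init segment followed by media fragments) over a WebSocket; the URL and codec string are placeholders and must match what your camera actually produces:

```typescript
// WebSocket-to-MSE sketch: append fMP4 fragments pushed by a media
// server into a SourceBuffer, with codec copy (no transcoding).
const video = document.querySelector<HTMLVideoElement>("video")!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  const sourceBuffer = mediaSource.addSourceBuffer(
    'video/mp4; codecs="avc1.640028"' // placeholder H.264 profile string
  );
  const queue: ArrayBuffer[] = [];

  const ws = new WebSocket("wss://your-media-server/stream"); // placeholder
  ws.binaryType = "arraybuffer";
  ws.onmessage = (ev) => {
    // MSE rejects appends while a previous append is still in flight,
    // so buffer incoming fragments and append them in order.
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(ev.data);
    } else {
      sourceBuffer.appendBuffer(ev.data);
    }
  };
  sourceBuffer.addEventListener("updateend", () => {
    if (queue.length > 0) sourceBuffer.appendBuffer(queue.shift()!);
  });
});
```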
No browser has native RTSP support, so if you want decoding to happen on the end user's side, you'll have to write your own custom web player.
You can start by looking at an open-source solution like this one:
git://github.com/Streamedian/html5_rtsp_player.git
It works on PC and Android, but it didn't work on iPhone for me (you can try it yourself at https://streamedian.com/demonstration/; maybe it's just my issue). You may find a better alternative, or fork it and make it work on all devices.
It still requires a middle-man proxy server, because it relies on WebSockets, but since it doesn't do any video converting or decoding, it shouldn't take many resources at all.
I need to get live video from a device and play it in the browser. The live video can be received over RTP or UDP.
Since there is no support for VLC, I published the video by receiving it via RTP with FFmpeg and serving it with an Nginx web server.
But later I realized that this records video tracks to disk, which is a situation I don't want.
Is there any other way to do this?
Not with RTP or UDP, no; there is no way. You must use WebRTC, or an HTTP-based method like HLS or DASH.
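If the concern is disk usage rather than HLS itself, note that ffmpeg can keep a rolling playlist and delete old segments, so disk consumption stays bounded. A hedged Node.js sketch, with placeholder paths and assuming ffmpeg is installed and the RTP input is described by an SDP file:

```typescript
// Repackage an incoming RTP stream (described by an SDP file) into HLS
// with a rolling playlist, so only a handful of segments ever live on
// disk. Paths are placeholders; assumes ffmpeg is on the PATH.
import { spawn } from "node:child_process";

const ffmpeg = spawn("ffmpeg", [
  "-protocol_whitelist", "file,udp,rtp", // allow reading RTP via the SDP file
  "-i", "stream.sdp",                    // placeholder SDP describing the RTP input
  "-c", "copy",                          // remux only, no transcoding
  "-f", "hls",
  "-hls_time", "2",                      // 2-second segments
  "-hls_list_size", "5",                 // keep 5 segments in the playlist
  "-hls_flags", "delete_segments",       // delete old segments: bounded disk use
  "/var/www/live/stream.m3u8",           // placeholder path served by Nginx
]);

ffmpeg.stderr.pipe(process.stderr); // ffmpeg logs progress on stderr
```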
We generate an RTSP stream (MP4 with the AAC codec for audio) on our server, and we need to send it to a web app and play it.
We could send it via WebSocket and play it with Media Source Extensions, but those are not supported on iOS.
We could also use WebRTC with a media channel, but that supports only the Opus audio codec, and we cannot afford transcoding from AAC to Opus.
Do you have any idea how we can play RTSP data on iOS devices?
EDIT: we aim for low-latency playback (<1 s); HLS has a latency of 5 s+.
You need to encode/package your stream as HLS on your server to send it to iOS clients. Try looking into FFmpeg streaming guides where the input is your RTSP stream and the output is HLS. iOS really only plays HLS natively.
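On the playback side, iOS Safari plays HLS natively, so no extra library is needed there. A minimal sketch, with a placeholder playlist URL:

```typescript
// Native HLS playback on iOS Safari: point the <video> element straight
// at the playlist. URL is a placeholder for your packager's output.
const video = document.querySelector<HTMLVideoElement>("video")!;

if (video.canPlayType("application/vnd.apple.mpegurl")) {
  video.src = "https://example.com/live/stream.m3u8"; // placeholder
  video.setAttribute("playsinline", ""); // avoid forced fullscreen on iPhone
  video.play();
}
```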
What's the best way to do this if I don't want to use Flash?
I've heard about WebRTC, but is it viable to take input from getUserMedia
and stream it to a media server, which would then push that data out as an RTMP stream?
It depends on the media server you use. For example, Wowza has recently added WebRTC support: https://www.wowza.com/products/capabilities/webrtc-streaming-software
I don't know of any way to get WebRTC input into a media server that doesn't support it.
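For the capture half, the browser side is standard WebRTC; here is a sketch that grabs the camera and microphone and attaches the tracks to a peer connection (the signaling exchange with the media server is server-specific and omitted):

```typescript
// Capture local audio/video with getUserMedia and feed the tracks into
// an RTCPeerConnection. How the offer/answer reaches the media server
// depends entirely on which server you pick.
async function startCapture(): Promise<void> {
  const pc = new RTCPeerConnection();

  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream); // send each captured track to the peer
  }

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // ...send offer.sdp to the media server, then apply its answer
  // with pc.setRemoteDescription(...).
}

startCapture();
```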
I was wondering if there is any way to cast an RTMP audio and/or video stream. I've created a receiver app, whitelisted it, and I'm able to access it on my Chromecast. I tried to embed an SWF object, but it appears that the Chromecast does not support Flash natively like this. Is there any workaround?
I see that the docs for supported media types don't list RTMP, so I'm thinking it's a no-go, other than doing some on-the-fly stream protocol translation from RTMP to MP3 or such on another server.
AFAIK, receiver apps can only work with HTML5 media, which doesn't support the RTMP protocol (keep in mind that RTMP is not a media format but a media transfer protocol that can carry various container formats and audio/video codecs). Since Chromecast requires HTML5, it also requires HTTP as the transfer protocol.
You could set up a proxy for your receiver app that would consume the RTMP stream and re-serve it as an HTTP-based HTML5 media resource. There wouldn't be a lot of overhead if you didn't have to do any transcoding (say, if your RTMP stream were H.264 video with MP3 audio), but it could get messy having to wrap all the media resource handling that Chromecast can already do just to talk to an RTMP server. It would likely be much easier to work with HTML5 video from the get-go.
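As a hedged illustration of that proxy idea (a sketch, not a definitive implementation), ffmpeg can pull the RTMP stream and remux it, with codecs copied rather than transcoded, into HLS that the receiver app can load as ordinary HTTP media. URLs and paths here are placeholders, and it assumes the RTMP payload uses codecs the Chromecast can decode (e.g. H.264 video):

```typescript
// Pull an RTMP stream and remux it to HLS without transcoding, so a
// Chromecast receiver app can load it as plain HTTP/HTML5 media.
// Assumes ffmpeg is installed; source URL and output path are placeholders.
import { spawn } from "node:child_process";

spawn("ffmpeg", [
  "-i", "rtmp://origin.example.com/live/stream", // placeholder RTMP source
  "-c", "copy",                                  // repackage only, no transcode
  "-f", "hls",
  "-hls_time", "4",
  "-hls_list_size", "6",
  "-hls_flags", "delete_segments",               // keep disk usage bounded
  "/var/www/cast/stream.m3u8",                   // placeholder path behind an HTTP server
]).stderr.pipe(process.stderr);
```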