Re-distributing an RTSP stream with Windows Media Services - windows-server-2008-r2

We have an IP camera that streams RTSP. It cannot handle many connections, so I've set up Windows Media Services on one of our servers, hoping to use it as the access point for the stream.
Is it somehow possible to set the RTSP stream as the source for the Media Services stream?
From what I've gathered, Media Services doesn't support this (go figure), but there must be workarounds. I've tried streaming it to HTTP with VLC, in an attempt to grab the HTTP stream on localhost, but VLC seems to crash whenever it grabs the RTSP stream.
I'm on Server 2008 R2 (64-bit). Running VLC on our workstations (XP, 32-bit) and grabbing the camera stream doesn't crash.
Any ideas on how I could get this to work?

So, I'm afraid I'll have to answer myself.
I've ended up making an mms:// stream with VLC. VLC grabs the RTSP from the camera and re-hosts it as MMS. I then grab the MMS stream with a Silverlight app on the web server and host it there.
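For reference, the VLC command was along these lines (the camera address and port here are placeholders, not the exact production values):

# Pull RTSP from the camera and re-serve it as an MMS (mmsh) stream on port 8080
cvlc rtsp://192.168.0.10:554/stream --sout '#std{access=mmsh,mux=asfh,dst=:8080}'

Clients, including the Silverlight app, can then open mms://<server>:8080, so only one connection ever hits the camera.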
I ended up uninstalling Media Services again.

Related

How to play RTSP stream from IP video camera and NVR on user web page

I want to play RTSP streams from IP video cameras (MP4, H264) on my intranet web page; I use React. I have 12 cameras and an NVR.
I have not found a way to do this without an intermediate server (WebRTC is not suitable) that spends resources on transcoding the H264 stream to MJPEG.
If I set a high stream resolution and quality, a lot of resources are spent on transcoding, and, most importantly, streaming MJPEG images consumes a lot of traffic.
Is there a way to stream from the IP camera directly to the web page so that decoding happens on the user's browser side?
This would free the intermediate server from the heavy load of large streams.
Playback needs to work on mobile phones.
Thanks for the answer.
There is no way to stream an RTSP camera's H264 video directly to a web browser.
But cameras support outputting still JPEG images, so you can create a webpage that displays such an image from the camera every 200 ms or so.
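As a quick command-line sanity check before building the page (the snapshot URL path varies by vendor; /snapshot.jpg and the address below are illustrative only):

# Poll the camera's still-image endpoint roughly every 200 ms
while true; do
  curl -s -o latest.jpg http://192.168.0.10/snapshot.jpg
  sleep 0.2
done

The webpage would do the same thing with an img element whose src is refreshed on a timer.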
If you are not happy with the above solution, you must use a media server in between, which will pull the RTSP stream from the camera and convert it to some protocol the browser understands. You are mistaken in one thing: no video transcoding is involved. I don't know why WebRTC is not an option for you, but most media servers will offer four types of output:
Low latency:
WebRTC
Websockets to MSE
High latency:
HLS
MPEG-Dash
All these methods do NOT require transcoding of your original H264 video as encoded by the RTSP camera/NVR. Some media servers you can use: Unreal Media Server, Wowza, Janus.
Live demo: http://www.umediaserver.net/umediaserver/demos.html
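To see the no-transcoding point in practice without installing a full media server, ffmpeg can repackage the camera's H264 into HLS with a plain stream copy (the camera URL and output path are placeholders):

# -c copy rewraps the existing H264 packets; nothing is re-encoded
ffmpeg -rtsp_transport tcp -i rtsp://192.168.0.10:554/stream \
       -c copy -f hls -hls_time 2 /var/www/html/hls/cam1.m3u8

Since the stream is only remuxed, CPU usage is negligible compared to a transcode, which is why a single media server can handle many cameras.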
No browser has native RTSP support, so if you want decoding to happen on the end user's side, you'll have to write your own custom web player.
You can start by looking at an open-source solution like this one:
git://github.com/Streamedian/html5_rtsp_player.git
It works on PC and Android, but it didn't work on iPhone for me (you can try it yourself at https://streamedian.com/demonstration/; maybe it's just my issue). Perhaps you can find a better alternative, or fork it and make it work on all devices.
It still requires a middleman proxy server, though, because it works over WebSockets; but since the proxy doesn't do any video conversion or decoding, it shouldn't consume many resources at all.

How to play a live video in the browser?

I need to get live video from a device and play it in the browser. The live video can be received as RTP or UDP.
Since there is no VLC support, I published the video by receiving it via RTP with FFmpeg and serving it through a web server built with Nginx.
But later I realized that it records the video tracks to disk. This is a situation I don't want.
Is there any other way to do this?
Not with RTP or UDP, no, there is no way. You must use WebRTC, or an HTTP-based method like HLS or DASH.
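As a sketch of the HLS route with FFmpeg (stream.sdp describing your incoming RTP session, the file names, and the window sizes are all illustrative):

# Read RTP as described by stream.sdp, repackage to HLS without re-encoding,
# and delete old segments so recordings never accumulate on disk
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp \
       -c copy -f hls -hls_time 2 -hls_list_size 6 \
       -hls_flags delete_segments /var/www/html/live/stream.m3u8

The -hls_flags delete_segments option keeps only the playlist's rolling window on disk, which addresses the unwanted recording; Nginx then just serves the .m3u8 and .ts files as static content.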

Using ffmpeg to stream live video from a Raspberry Pi to a web server for distribution

I am trying to build a device that will encode H.264 video on a Raspberry Pi and stream it out to a separate web server in the cloud. The main issue I am having is that most implementations I find either have the web server directly on the Pi or have the embedded player playing video directly from the device.
I would like it to be pretty much plug-and-play no matter what network I am on, i.e., no port forwarding of any sort; all I should need to do is connect the device to the network, and the stream becomes visible on a webpage.
One possible solution is to simply encode frames as base64 JPEGs and send them to an endpoint on the web server; however, this is a huge waste of bandwidth and won't allow the frame rate that H.264 would.
Any ideas on some possible technologies that could be used to do this?
I feel like it can be done with WebSockets or ZeroMQ and FFmpeg somehow, but I am not sure.
It would be helpful if you could describe the device's architecture in more detail. Since it is an RPi, it is probably also being used for video acquisition via the camera expansion port. If this is the case, you can access the video device and do quite a bit with respect to streaming using a combination of the available command-line tools.
Something like the following will produce an RTMP stream from the video camera host.
raspivid [preferred options] -o - | ffmpeg -i - [preferred options] rtmp://[IP ADDR]/[location]
From there, FFmpeg will do a lot of heavy lifting for you.
This will now enable remote hosts to access the RTMP stream.
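For example, with the placeholders filled in (the resolution, bitrate, frame rate, and RTMP endpoint below are illustrative):

# The Pi's hardware encoder produces H264, so ffmpeg only remuxes (-c:v copy)
raspivid -n -t 0 -w 1280 -h 720 -fps 25 -b 2000000 -o - |
  ffmpeg -f h264 -framerate 25 -i - -c:v copy -f flv rtmp://example.com/live/stream

Because the video is copied rather than re-encoded, the Pi's CPU load stays low.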
Other tools that would complement this architecture include ffserver, which could acquire the RTMP stream from the RPi host and then make it available to a variety of clients, such as a player in a webpage. A quick look shows ffserver may be obsolete, but there are analogous components.

RTP stream with bareSIP

My current setup involves streaming from a GoPro to a Linux box, and I managed to get bareSIP running on the box to stream the video locally with the 'v' command. However, there's no documentation on commands to configure an RTP broadcasting stream. Would anyone have any insight into publishing an RTP/RTSP output stream for other users to view on their devices?
I've used Unreal Streaming Media components and found them to be very good. They are lightweight and fast, yet very powerful.
Using Unreal components, you could install the stream forwarder on your laptop, point it at the RTSP stream, and tell it to forward to the Distribution Server application.
This app can host thousands of connections (supposedly), and last I looked you didn't need a license if you have three or fewer sources. The stream can be viewed via their own small player app, via a web player such as jPlayer, or via VLC, etc.
I've been pretty happy with this before; it saved me from having to use the Live555 streaming mess.
Good luck!

Stream recorded audio from browser to server

I would like to live-stream recorded audio from the browser to the server and play it there. The server will end up being an embedded device that plays these audio streams.
So far I've successfully recorded audio, encoded it into a WAVE file, and played it in the browser using the Web Audio API, following this tutorial.
Now I have a stream of WAV-encoded blobs. I've tried to find ways to stream these to a Node.js backend over a WebSocket connection and play them using an npm module, but I haven't had any luck.
Does anyone know of any resources or modules I should look into? Maybe I should try a different approach? The audio needs to play on the server relatively soon after it is recorded in the browser.
I'm currently doing this with some software that allows streaming to internet radio servers via your web browser.
I use the Web Audio API along with getUserMedia to get live PCM audio data from the sound device. From there, I convert this data from 32-bit float to 16-, 12-, or 8-bit data, depending on the amount of bandwidth available. These converted int samples are written to a stream set up with BinaryJS, which wraps streams on both the Node.js side and the client. As a bonus with BinaryJS, you can have as many streams open as you want, so I use a second stream over the same WebSocket connection for control data.
http://demo.audiopump.co:3000/
