How do I stream a video from RTSP to React (Node.js)?

I'm using Node.js and React (both with TypeScript) and I'm trying to get the video from an IP camera over RTSP.
I tried using node-rtsp-stream, but that library uses jsmpeg as the consumer on the front end, and it does not work with React & TypeScript.
I need some recommendations or other solutions.

One idea is to convert the RTSP stream to an HTTP-based format (like HLS) first; then many video players and JS libraries are able to play it.
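A minimal sketch of that conversion, assuming ffmpeg is installed and on PATH; the RTSP URL, output directory, and the `hlsArgs` helper name are placeholders for illustration, not a fixed API:

```typescript
// Sketch: remux an RTSP camera feed into an HLS playlist from Node.js.
// Assumes ffmpeg is on PATH; URLs and paths below are placeholders.
import { spawn } from "node:child_process";

// Build the ffmpeg argument list for an RTSP -> HLS remux.
function hlsArgs(rtspUrl: string, outDir: string): string[] {
  return [
    "-rtsp_transport", "tcp",      // TCP is usually more reliable than UDP for RTSP
    "-i", rtspUrl,
    "-c", "copy",                  // remux without re-encoding (if the camera already sends H.264)
    "-f", "hls",
    "-hls_time", "2",              // 2-second segments
    "-hls_list_size", "5",         // keep a rolling window of 5 segments
    "-hls_flags", "delete_segments",
    `${outDir}/index.m3u8`,
  ];
}

// Wiring (not run here):
// const ff = spawn("ffmpeg", hlsArgs("rtsp://user:pass@camera/stream", "./videos"));
// ff.stderr.on("data", (d) => console.error(d.toString()));
```

On the React side, hls.js (or Safari's native HLS support) can then play `index.m3u8` once the folder is served as static files.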

Related

Web Radio Creation Through Node.js

I wanted to create my own web radio in Node.js, which will include functionality for an admin to create a library. This library will hold songs; the songs get picked on the backend and played on the frontend for all connected users. How can I do it?
Your question is very broad, but I am going to answer the audio-streaming part as below:
on your Node.js server, install a package called wrtc, which can receive a stream from a producer and broadcast that stream to other peers; those peers use the normal browser/mobile WebRTC endpoints.

How do I receive video stream data in node server?

I don't know how to get started with this.
What I am trying to do is take a video + audio stream from the front end and host the live stream as MP4 that's accessible in the browser.
I was able to find information on WebRTC, socket.io and RTMP, but I'm not really sure what tool to use / what's best suited for something like this.
Also, a follow-up question: my front end is an iOS app, so in what format would I send the live stream to the server?
It depends on which live-streaming protocol you want to play on the player; as @Brad said, HLS is the most common protocol for players.
Note: besides HLS, an iOS native app is able to use fijkplayer or FFmpeg to play any format of live stream, like HLS, RTMP or HTTP-FLV, even MKV. However, the most straightforward solution is HLS: you only need a video tag to play MP4 or HLS, and MSE is also an optional solution, using flv.js/hls.js to play live streams on iOS/Android/PC; this post is about these protocols.
The stream flow is like this:
FFmpeg/OBS ---RTMP----+
                      +--> Media Server ---> HLS/HTTP-FLV ---> Player
Browser ----WebRTC----+
The protocol used to push to the media server, or to receive in the Node server, depends on your encoder: RTMP or HTML5 (WebRTC).
For RTMP, you could use FFmpeg or OBS to push the stream to your media server.
If you want to push the stream from HTML5, the only way is WebRTC.
The media server converts the protocol from publisher to player, since they use different protocols in live streaming right now (as of 2022-01); please read more in this post.
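For the RTMP leg, the push side is just an ffmpeg invocation. A hedged sketch of the argument list; the ingest URL and the `rtmpPushArgs` helper name are placeholders, and the codecs assume the typical H.264/AAC setup:

```typescript
// Sketch: ffmpeg arguments for pushing a source to a media server over RTMP.
// "rtmp://localhost/live/stream" is a placeholder ingest URL.
function rtmpPushArgs(input: string, ingestUrl: string): string[] {
  return [
    "-re",              // read input at its native frame rate (for file sources)
    "-i", input,
    "-c:v", "libx264",  // RTMP/FLV expects H.264 video...
    "-c:a", "aac",      // ...and AAC audio
    "-f", "flv",        // RTMP carries an FLV stream
    ingestUrl,
  ];
}

// e.g. spawn("ffmpeg", rtmpPushArgs("input.mp4", "rtmp://localhost/live/stream"))
```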

Live camera streaming in Node js and Reactjs

I am trying to live-stream the data from one of our cameras into the browser. I can connect to the camera and convert RTSP to HLS using FFmpeg in Node.js. All the segments are written to a local folder in Node.js. But how do I send this to React.js continuously?
Backend
└── src
    └── videos
        ├── output.m3u8
        ├── output1.ts
        └── output2.ts
Since you're using HLS, the client simply requests the segments as it needs them. Serve these files like any other static files, via any standard web server.
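One detail worth getting right when serving that folder is the Content-Type header. A small helper with the standard HLS MIME types (the function name is illustrative; any static file server, Express or plain `node:http`, can apply the same mapping):

```typescript
// Sketch: correct Content-Type values for HLS artifacts when serving the
// segment folder as static files.
function hlsContentType(file: string): string | undefined {
  if (file.endsWith(".m3u8")) return "application/vnd.apple.mpegurl"; // playlist
  if (file.endsWith(".ts")) return "video/mp2t";                      // MPEG-TS segment
  return undefined; // fall back to the server's default handling
}
```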

How to achieve RTSP web-assembly module

I have an application where Wasm can handle video & audio codecs, which can be passed to Media Source Extensions for rendering purposes. To get RTSP & RTP stream data in the browser, I need an RTSP Wasm module which can consume RTSP streams.
Is there any way we can achieve it?
WebAssembly itself does not provide any APIs for interacting with the browser or OS. All I/O must be handled via imports and exports. So you would need to implement JavaScript functions that send and receive the data and pass it in and out of your WebAssembly module.
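A minimal, self-contained illustration of that boundary: a hand-assembled Wasm module whose only contact with the outside world is an exported function, called from JavaScript. An RTSP demuxer module would receive its packet bytes the same way, just through more elaborate imports/exports:

```typescript
// A hand-assembled module equivalent to:
//   (module (func (export "add") (param i32 i32) (result i32)
//     local.get 0  local.get 1  i32.add))
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm", version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b, // body
]);

// All data crosses the boundary explicitly: JS hands values in, Wasm hands
// results back through its exports.
const mod = new WebAssembly.Module(wasmBytes);
const inst = new WebAssembly.Instance(mod);
const add = inst.exports.add as unknown as (a: number, b: number) => number;
console.log(add(2, 3)); // prints 5
```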

What is the best solution for Live-cam Service in web application?

I am trying to set up a web-based live webcam streaming service (using the Laravel PHP framework) where a user can broadcast live via webcam (web-based only).
For example:
User X starts a webcam broadcast at http://localhost/userx, while users Y, Z, etc. who join that room at http://localhost/userx will be able to watch the live webcam stream.
I was playing around with node.js and socket.io library for realtime chat and it works fine.
But I have no idea about webcam streaming.
Should I use WebRTC? How many viewers can the broadcaster handle if I use WebRTC?
What is the best solution for handling around 1000-2000 viewers?
Any suggestion would help me a lot.
Why not use the node-camera module, which lets you access and stream a web camera in Node.js using OpenCV and WebSockets.
This is the command you should run:
npm start -- [-open] [-wsport websocketPort] [-webport webserverport] [-res widthxheight]
where the options passed to run it are:
-open Open streaming url on startup
-wsport Web socket port for streaming media
-webport Web server port
-res Resolution for preview image
-input Input source. ( eg. ip camera url)
There are a few more libraries, such as FFmpeg, VLC and OpenCV, that provide webcam access and can be wrapped as a native Node addon.
