I'm attempting to proxy a video from another server, and I need it to play dynamically in VLC, like a stream. The video should be able to seek forward and backward quickly in VLC, perhaps by making use of HTTP Range requests?
I have an MP4 link.
I'd like to use it as a mask that proxies through to the other URL.
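Something along these lines is what I have in mind — a minimal sketch in plain Node.js, where the upstream MP4 URL and the local port are placeholders; it just forwards VLC's Range header upstream and pipes the partial response back so seeking keeps working:

```js
// range-proxy.js -- run with: node range-proxy.js
const http = require('http');
const https = require('https');

const UPSTREAM = 'https://example.com/video.mp4'; // placeholder source URL

http.createServer((req, res) => {
  const headers = {};
  if (req.headers.range) headers.Range = req.headers.range; // forward VLC's seek request

  https.get(UPSTREAM, { headers }, (upstream) => {
    // Copy only the headers that matter for ranged playback.
    const out = {};
    for (const h of ['content-type', 'content-length', 'content-range', 'accept-ranges']) {
      if (upstream.headers[h]) out[h] = upstream.headers[h];
    }
    res.writeHead(upstream.statusCode, out); // 206 Partial Content when a Range was sent
    upstream.pipe(res);
  }).on('error', () => {
    res.statusCode = 502;
    res.end();
  });
}).listen(8080, () => console.log('proxy on http://localhost:8080'));
```

Opening http://localhost:8080 in VLC should then issue Range requests that are passed straight through to the real URL.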
I just want to know how to get a WebRTC media stream to the Node.js server and serve it to the clients connected to that server.
I have already created a P2P WebRTC application with a Node.js signalling server, and it works fine. But now I want to route the media stream through the server. It needs low latency and minimal delay to work on a live server.
What you need is an SFU (Selective Forwarding Unit), such as:
mediasoup
janus-gateway
kurento
Here is a github project where it is implemented using mediasoup.
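To give a rough idea of what the server side of an SFU looks like, here is a minimal mediasoup (v3) sketch; the codec list, the announced IP, and the transport options are assumptions you would adapt to your own signalling:

```js
const mediasoup = require('mediasoup');

async function startSfu() {
  // One worker (a mediasoup media subprocess) is enough for a small app.
  const worker = await mediasoup.createWorker();

  // The router holds the RTP capabilities shared by all peers in a "room".
  const router = await worker.createRouter({
    mediaCodecs: [
      { kind: 'audio', mimeType: 'audio/opus', clockRate: 48000, channels: 2 },
      { kind: 'video', mimeType: 'video/VP8', clockRate: 90000 },
    ],
  });

  // One WebRTC transport per peer and direction; its ICE/DTLS parameters go
  // back to the browser over your existing signalling channel.
  const transport = await router.createWebRtcTransport({
    listenIps: [{ ip: '0.0.0.0', announcedIp: 'YOUR_PUBLIC_IP' }], // placeholder
    enableUdp: true,
    enableTcp: true,
    preferUdp: true,
  });

  // Later, driven by signalling messages from the browser:
  //   await transport.connect({ dtlsParameters });                       // DTLS handshake
  //   const producer = await transport.produce({ kind, rtpParameters }); // incoming media
  //   const consumer = await transport.consume({ producerId, rtpCapabilities }); // outgoing media
}

startSfu();
```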
I want to play audio on a website, but the link to the audio has to stay secret or encrypted on the Node.js side.
For example: Spotify, etc.
What I would suggest is to have a socket.io server that can send the appropriate audio file for each request.
Once you are connected to the socket server, send the name of the audio file to the server, and the server will send the audio file back over the socket.
After receiving the file, you can play it however you want.
This way you never expose the actual URL of the audio file, because that is handled entirely by your server.
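A minimal sketch of that idea, assuming socket.io v4, a private ./audio directory on the server, and hypothetical event names ('request-audio' / 'audio-data'):

```js
// server.js -- hides the real file location behind the socket
const path = require('path');
const fs = require('fs');
const { Server } = require('socket.io');

const io = new Server(3000, { cors: { origin: '*' } });

io.on('connection', (socket) => {
  socket.on('request-audio', (name) => {
    // path.basename() keeps clients from escaping the audio directory.
    const file = path.join(__dirname, 'audio', path.basename(name));
    fs.readFile(file, (err, data) => {
      if (err) return socket.emit('audio-error', 'not found');
      socket.emit('audio-data', data); // delivered to the browser as binary
    });
  });
});
```

And on the browser side:

```js
// browser side -- request a track by name and play the bytes via a Blob URL
const socket = io('http://localhost:3000');
socket.emit('request-audio', 'song.mp3');
socket.on('audio-data', (buffer) => {
  const url = URL.createObjectURL(new Blob([buffer], { type: 'audio/mpeg' }));
  new Audio(url).play();
});
```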
I've successfully set up Asterisk on my server using the res_pjsip Hello World configuration from their wiki, and I want to be able to forward the RTP data to a Node.JS app, which can interpret RTP. I've heard about directmedia and directrtpsetup (see this stackoverflow) but I'm not sure if that's what I want. So my question is this:
Should I use directmedia / directrtpsetup to send voice data to my Node.JS app, or should I use some sort of Asterisk functionality to forward RTP packets? If the latter, how can Asterisk forward just the voice data?
I can clarify if needed, but hopefully this is more specific than my last questions. Thanks!
UPDATE: Having poked around the Asterisk docs and messed around with Wireshark, I think I have two options:
Figure out if there's a channel driver for Asterisk that just sends RTP, without any signaling, or
Capture the RTP stream with Wireshark or something and send the packets to the Node.JS app, and inject the return packets into the RTP stream.
Asterisk is a PBX. It is not suitable for simply "redirecting RTP data".
No, there is no reason to have a channel driver "without signalling". What would anyone use it for? How would you determine that a call has started if there is no signalling? It would be useless.
You can write such an app in C/C++, or use other software designed for traffic capture: libpcap, tcpdump, etc.
You can also use audio tooling: libalsa, JACK.
The best option, however, would be to create or find a full-featured SIP client and use it.
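If raw RTP packets do end up arriving at your Node.JS app (for example mirrored from a capture), parsing the fixed 12-byte RTP header is straightforward. A minimal sketch, assuming a hypothetical UDP port and ignoring CSRC lists and header extensions:

```js
// rtp-listen.js -- prints the header of every RTP packet arriving on a UDP port
const dgram = require('dgram');

const RTP_PORT = 12000; // hypothetical port where the RTP is forwarded/mirrored

const sock = dgram.createSocket('udp4');

sock.on('message', (pkt) => {
  if (pkt.length < 12) return;              // fixed RTP header is 12 bytes
  const version = pkt[0] >> 6;              // should be 2
  const payloadType = pkt[1] & 0x7f;        // e.g. 0 = PCMU, 8 = PCMA
  const sequence = pkt.readUInt16BE(2);
  const timestamp = pkt.readUInt32BE(4);
  const ssrc = pkt.readUInt32BE(8);
  const payload = pkt.subarray(12);         // audio bytes
  console.log({ version, payloadType, sequence, timestamp, ssrc, bytes: payload.length });
});

sock.bind(RTP_PORT, () => console.log(`listening for RTP on udp/${RTP_PORT}`));
```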
I have an RTSP URL coming from the back end, and I have to run RTSP live streaming in my web GUI, which is written in AngularJS. Currently I am using the VXG player plugin to play that RTSP URL, but this player is only supported in Chrome. Is there any solution to play that RTSP URL in all browsers
(e.g. Chrome, Mozilla Firefox, IE, Safari, Microsoft Edge)?
Thanks in Advance.
RTSP is not a protocol supported by most browsers. If you need it to work in all browsers, you must use a protocol that works in all browsers, like HTTP.
Since RTSP is not supported directly by the browser, you need a back-end server that captures the stream and relays it over WebSockets; unlike normal HTTP, where you have to wait for a response, WebSockets give you two-way duplex communication. And since the browser can only play formats such as MP4, there also needs to be a player that takes whatever format the stream is in and converts it into something the browser understands.
Some URLs which are helpful for understanding this concept:
https://streamedian.com/docs/#description
Another approach uses Node.js with WebSockets and the jsmpeg player; with it you can implement RTSP streaming in the browser from your AngularJS app:
https://www.npmjs.com/package/node-rtsp-stream
https://github.com/phoboslab/jsmpeg -- jsmpeg player
On the server side you should install FFmpeg, since FFmpeg is what converts your stream.
Note: if you implement this with Node.js, FFmpeg must be installed on that server.
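A minimal sketch of the node-rtsp-stream approach; the RTSP URL, credentials, and ports are placeholders, and FFmpeg must be on the server's PATH:

```js
// server.js -- relays the RTSP stream as MPEG1 over a WebSocket for jsmpeg
const Stream = require('node-rtsp-stream');

new Stream({
  name: 'camera',
  streamUrl: 'rtsp://user:pass@192.168.1.10:554/stream', // placeholder RTSP URL
  wsPort: 9999,                                          // WebSocket port jsmpeg connects to
  ffmpegOptions: { '-stats': '', '-r': 30 },             // passed straight to ffmpeg
});
```

And in the browser (inside your AngularJS page), assuming jsmpeg.min.js is loaded and a canvas element with id "video-canvas" exists:

```js
// browser side -- jsmpeg decodes the MPEG1 stream and draws it onto the canvas
new JSMpeg.Player('ws://localhost:9999', {
  canvas: document.getElementById('video-canvas'),
});
```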
I'm implementing a solution for listening to on-going calls inside a LAN network.
Is there a way to give WebRTC the IP address and port from which an RTP stream is coming? All I want is to have that RTP stream delivered directly to the potential listeners of the call through WebRTC.
I'm not sure if it's feasible, but I think it is, given how WebRTC has evolved over the past months.
I've been looking around but I've got no luck on this.
The WebRTC RTP stream is encrypted with keys that are exchanged over DTLS. You cannot get the raw RTP stream out of a WebRTC peer, or feed it a raw stream, without some intermediary system to handle the WebRTC peer connection, the certificate exchange, and the RTP encryption.
The only way to do what you want is to put a bridge or gateway in between. An example of such a gateway is janus-gateway, though it is definitely not your only option.
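For example, with janus-gateway's streaming plugin you describe where the plain RTP arrives in its configuration, and the plugin re-publishes it to WebRTC listeners. A rough sketch of such a mountpoint (the field names follow the plugin's sample config and may differ between Janus versions; the port, payload type, and codec are assumptions you would match to your actual call audio):

```
; janus.plugin.streaming.cfg -- one "mountpoint" fed by plain RTP
[lan-call]
type = rtp
id = 1
description = On-going LAN call
audio = yes
audioport = 5002          ; port where your RTP audio arrives
audiopt = 111
audiortpmap = opus/48000/2
```

Your capture/forwarding side then only has to send the call's RTP to that port on the Janus host, and browsers subscribe to mountpoint 1 through the streaming plugin's API.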