I am trying to connect to a remote RTSP stream from an IP camera, but I can't reach it without port forwarding or a client application. Is there any way to connect to an RTSP stream from one network to another without port forwarding?
I have already tried the Hikvision client application and inspected the traffic in Wireshark, but I still can't get the stream.
cam = cv2.VideoCapture("rtsp://<user>:<pass>@<camera-ip>:554/<stream-path>")  # full RTSP URL of the camera
Take a look at the Nabto P2P platform; it is designed exactly to solve this problem and is free for personal use and test/R&D. The Nabto blog has specific examples of setting up RTSP P2P tunnelling on cameras, Raspberry Pi devices, and ESP32 devices:
You can either install the Nabto P2P reverse proxy in front of the RTSP service on the camera as a standalone process, or integrate it into an existing application. Full source is available on GitHub.
On the client side, you can use an existing Nabto-enabled RTSP client, or build your own based on the client SDKs, which are available for most popular platforms. Existing client apps are available on GitHub and in the App Store / Google Play.
How can I collect the streams from all the webcams on one server?
For example:
There are pc_1, pc_2, ..., pc_n; they send their camera views to an Ubuntu server that I can connect to with
ssh name@ip_address
and all the PCs run Windows.
First, I looked at Sending live video frame over network in python opencv, but that worked only on localhost.
Second, I looked at Forward RTSP stream to remote socket (RTSP Proxy?), but I couldn't figure out how to apply it to my situation.
Each IPC is an RTSP server, which allows you to pull/play the RTSP stream from it:
IPC ---RTSP--> Client(Player/FFmpeg/OBS/VLC etc.)
Because it's an internal IPC and its IP is on the intranet, the client must be on the same intranet; that's why it only works on something like localhost.
Rather than pulling from an internet client, which does not work, you could forward the stream to an internet server, like this:
IPC ---RTSP--> Client --RTMP--> Internet Server(SRS/Nginx etc.)
For example, use FFmpeg as the client to do this (replace the xxx with your internet server):
ffmpeg -i "rtsp://user:password@ip" -c:v libx264 -f flv rtmp://xxx/live/stream
Note: You can quickly deploy an internet server with srs-droplet-template in about 3 minutes, without any CLI work or prior knowledge of media servers.
Then you can play the stream with any client over any protocol: PC/H5 via HTTP-FLV/HLS/WebRTC, mobile iOS/Android via HTTP-FLV/HLS. Please read this post.
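The pull-and-forward step can also be launched from a small script. A sketch that assembles the FFmpeg command above and starts it with `subprocess` (the RTSP and RTMP URLs are placeholders you would replace):

```python
import subprocess

def build_relay_cmd(rtsp_url, rtmp_url):
    """Assemble FFmpeg arguments that pull RTSP and push RTMP/FLV."""
    return [
        "ffmpeg",
        "-i", rtsp_url,     # pull from the camera on the intranet
        "-c:v", "libx264",  # re-encode video to H.264
        "-f", "flv",        # FLV container, required for RTMP
        rtmp_url,           # push to the internet server
    ]

def start_relay(rtsp_url, rtmp_url):
    """Launch the relay; it runs until the stream ends or is killed."""
    return subprocess.Popen(build_relay_cmd(rtsp_url, rtmp_url))
```

Wrapping the command this way makes it easy to restart the relay automatically if the camera connection drops.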
I need some help structuring an app that lets a speaker send audio via his/her phone mic and lets users listen to this audio stream. I don't want to upload anything to the cloud, as there may be bad internet reception.
What technologies should I use? How do they talk to each other?
Currently I have:
Have a mobile dongle that everyone connects to
Host a server on this dongle?
Send a live audio stream to this server via UDP
Users listen to this server
Ideally it would be great if I could build this in React Native, which I think is possible. I am not sure how to host a server on a dongle, though, or even if this is the best method.
I'm trying to develop a web application in Node.js. I'm using an npm package called "simple-peer", but I don't think this issue is related to that. I was able to get this package working when integrating it with a Laravel application using an Apache server as the back end: I could access the host machine through its IP:PORT on the network and connect a separate client to the host successfully over a peer-to-peer connection. However, I'm now trying to develop this in Node without an Apache back end. I have my Express server up and running on port 3000, and I can access the index page from a remote client on the same network through IP:3000. But when I try to connect through WebRTC, I get a "Connection failed" error. If I connect two different browser instances on the same localhost device, the connection succeeds.
For reference: I'm just using the copy/pasted code from this usage demo. I have "simplepeer.min.js" included and referenced in the correct directory.
So my main questions are: is there a setting or some WebRTC protocol that could be blocking the remote clients from connecting? What would I need to change to meet this requirement? Why would it work in a Laravel/webpack app with Apache and not with Express?
If your remote clients cannot get ICE candidates, you need a TURN server.
When a WebRTC peer is behind NAT or a firewall, or is on a cellular network (like a smartphone), the direct P2P connection will fail.
In that case, as a fallback, the TURN server acts as a relay.
I recommend coTURN.
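A minimal coTURN configuration sketch for /etc/turnserver.conf; the credentials, realm, and external IP below are placeholders you must replace with your own:

```
# standard STUN/TURN listening port
listening-port=3478
# add message fingerprints (required by WebRTC)
fingerprint
# use the long-term credential mechanism
lt-cred-mech
# placeholder username:password
user=webrtcuser:webrtcpass
# placeholder authentication realm
realm=example.com
# public IP of the TURN host (needed behind NAT)
external-ip=203.0.113.10
```

The client then lists the server in its ICE configuration as a `turn:` URL (for example `turn:example.com:3478`) together with the same username and credential.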
Here is a simple implementation of simple-peer with a Node.js backend for multi-user video/audio chat. You can find the client code in /public/js/main.js. GitHub project and the demo.
And just like @JinhoJang said, you do need a TURN server to pass the information. Here is a list of public STUN/TURN servers.
I want to stream calls on FreeSWITCH to a Node.js WebSocket server (URL: wss://localhost:8755/{callUUID}).
The only thing I can find is using mod_shout to stream to an Icecast server:
<action application="record" data="shout://source:pass@10.10.10.10:8000/stream.mp3"/>
or
conference 3001-10.10.10.10 record shout://source:pass@10.10.10.10:8000/stream.mp3
Is there a way to do session_record and stream to the WebSocket server?
Thank you.
There are modules:
mod_unimrcp
mod_unimrcp is the FreeSWITCH module that allows communication with Media Resource Control Protocol (MRCP) servers. MRCP allows client machines to control media resources on a network. MRCP version 1 uses the Real Time Streaming Protocol (RTSP) while version 2 uses the Session Initiation Protocol (SIP) to negotiate the MRCP connection. mod_unimrcp allows FreeSWITCH to act as such a client. Servers are supplied by numerous vendors such as Cepstral, Voxeo, Nuance, and many others.
mod_vlc
<action application="record" data="vlc://#standard{access=http,mux=raw,dst=localip:someport/somevariable}"/>
mod_rtmp
mod_rtmp is an RTMP (Real-Time Messaging Protocol) endpoint for FreeSWITCH. The RTMP protocol is primarily used by Flash for streaming audio, video, and data over the Internet.
I want to create a sort of remote control for my PC from my smartphone. I've created a server on my PC, written in Node.js (using Express) and listening on a port. I want to make an app for my smartphone with Cordova, but I don't know how the app can automatically discover the IP address of the server running on my PC in my local network.
To my knowledge the best approach is to create a UDP server listening for broadcasts and send a UDP broadcast to the network segment from your app. I have this working with Python. As long as your local router/access point is not blocking UDP broadcasts, you should be fine.