Python3: emulate local webcam device with OpenCV on win10 - python-3.x

I want to stream my processed OpenCV output in Python as a local webcam stream so that it can be used by other programs like any other webcam. Is it possible? Are there any libraries that can do it?
I've read through some Stack Overflow questions and found this one: Stream OpenCV output as emulated webcam?, which is pretty similar to my problem (but in Java/C++).
Instead of calling:
cv2.imshow("...", output)
every frame, I want a stream that I can supply images to and that other programs would then treat as a webcam.
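One library that exposes exactly this kind of "supply a frame" interface is pyvirtualcam (it is not mentioned in this thread); a minimal sketch, assuming a virtual-camera backend such as the OBS Virtual Camera is installed on the Windows 10 machine:

import cv2
import pyvirtualcam

cap = cv2.VideoCapture(0)  # stand-in source for the frames you process

with pyvirtualcam.Camera(width=1280, height=720, fps=30) as cam:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # ... your OpenCV processing of `frame` would go here ...
        frame = cv2.resize(frame, (cam.width, cam.height))
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # pyvirtualcam expects RGB
        cam.send(rgb)                                 # replaces cv2.imshow(...)
        cam.sleep_until_next_frame()

Other programs (Zoom, a browser, etc.) should then see the virtual camera like any other webcam.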

I also went through this trouble and, believe me, I found a much better solution that works on every platform.
Just install DroidCam on your machine. Find the port it uses to connect to the mobile application; normally it is port 4747. Then connect to it using a Python socket: start a server on localhost on port 4747, and it will connect and stream the video from there.
DroidCam can then be used with any other software, such as Zoom.
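The answer above does not show code, and the DroidCam client's wire protocol is not documented in this thread, so the following is only a sketch of the "start a local server on port 4747" idea. It assumes the client will accept a plain MJPEG-over-HTTP stream at 127.0.0.1:4747; adjust the host, port and path to whatever the client actually expects.

from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

import cv2

capture = cv2.VideoCapture(0)  # stand-in for your processed OpenCV output

class MJPEGHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve an endless multipart JPEG stream on every GET request.
        self.send_response(200)
        self.send_header("Content-Type", "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            # ... replace `frame` with your processed output here ...
            ok, jpeg = cv2.imencode(".jpg", frame)
            if not ok:
                continue
            data = jpeg.tobytes()
            self.wfile.write(b"--frame\r\n")
            self.wfile.write(b"Content-Type: image/jpeg\r\n")
            self.wfile.write(b"Content-Length: %d\r\n\r\n" % len(data))
            self.wfile.write(data + b"\r\n")

ThreadingHTTPServer(("127.0.0.1", 4747), MJPEGHandler).serve_forever()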

Related

Play video from one device to another

I'm looking to essentially use two devices: a Raspberry Pi 3 and a Mac running macOS 10.15. I am using the Pi to capture video from my webcam, and I want to use my Mac as a kind of extension of the Pi, so that when I use cv2.VideoCapture I can capture that same video, preferably in real time or something close to it. I'm programming this in Python on both devices. I thought of putting the video on a local server and retrieving it, but I have no idea how I could use that with OpenCV. If someone could provide and explain a useful example, I would greatly appreciate it. Thank you.
To transfer a video stream, instead of a custom solution you could run an RTMP server on the source machine, feed it with the camera source, and have the target open the stream and process it.
A similar approach is widely implemented in IP cameras: they run an RTMP server to make the stream available to phones and PCs.
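On the receiving side this keeps the OpenCV code almost unchanged. A sketch for the Mac, assuming an RTMP server (for example nginx with the RTMP module) is already running on the Pi and being fed with the camera, and that the OpenCV build includes the FFmpeg backend (the usual pip wheels do); the URL is a placeholder:

import cv2

cap = cv2.VideoCapture("rtmp://<pi-address>/live/stream")  # placeholder URL

while True:
    ok, frame = cap.read()
    if not ok:
        break  # stream ended or could not be opened
    # ... process `frame` here just like a local webcam frame ...
    cv2.imshow("pi camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()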

Using ffmpeg to stream live video from a raspberry pi to a web server for distribution

I am trying to build a device that will encode h.264 video on a Raspberry Pi and stream it out to a separate web server in the cloud. The main issue I am having is that most implementations I find either have the web server directly on the Pi or have the embedded player playing video directly from the device.
I would like it to be pretty much plug and play no matter what network I am on, i.e. no port forwarding of any sort: all I need to do is connect the device to the network, and the stream will be visible on a web page.
One possible solution is to simply encode the frames as base64 JPEGs and send them to an endpoint on the web server; however, this is a huge waste of bandwidth and won't allow the frame rate that h.264 would.
Any idea on some possible technologies that could be used to do this?
I feel like it can be done with some websockets or zmq and ffmpeg somehow but I am not sure.
It would be helpful if you could provide more description of the device's architecture. Since it is an RPi, it is probably also being used for video acquisition via the camera expansion port. If this is the case, you can access the video device and do quite a bit of streaming using a combination of the available command-line tools.
Something like the following will produce an RTMP stream from the video camera host.
raspivid [preferred options] -o - | ffmpeg -i - [preferred options] rtmp://[IP ADDR]/[location]
From there, FFmpeg will do a lot of heavy lifting for you.
This will now enable remote hosts to access the RTMP stream.
Other tools that would complement that architecture include ffserver, which could ingest the RTMP stream from the RPi host and make it available to a variety of clients, such as a player in a web page. A quick look shows ffserver may be obsolete, but there are analogous components.
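If you prefer to drive that pipeline from Python on the Pi, a sketch using subprocess is below; the resolution, frame rate, bitrate and RTMP URL are example choices standing in for the "[preferred options]" placeholders, not values from this answer.

import subprocess

RTMP_URL = "rtmp://<server-ip>/live/picam"  # placeholder

raspivid = subprocess.Popen(
    ["raspivid", "-t", "0", "-w", "1280", "-h", "720", "-fps", "25",
     "-b", "2000000", "-o", "-"],           # raw h.264 to stdout, run forever
    stdout=subprocess.PIPE,
)
ffmpeg = subprocess.Popen(
    ["ffmpeg", "-f", "h264", "-framerate", "25", "-i", "-",  # read h.264 from the pipe
     "-c:v", "copy", "-f", "flv", RTMP_URL],                 # repackage without re-encoding
    stdin=raspivid.stdout,
)
raspivid.stdout.close()  # so raspivid receives SIGPIPE if ffmpeg exits
ffmpeg.wait()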

Live streaming from UWP to Linux/Python Server

I have a UWP app that captures a live video stream (webcam), encodes it in h.264, and sends it through a TCP socket (on a local network; I need high performance) to a Linux device.
Is there a way to do this? I don't need to play the video, only to extract single frames. I could do that with OpenCV, but it expects a local video file, whereas I'm working with a live stream.
I would send photos instead of a video stream if the time needed to capture one were acceptable, but it takes about 250 ms.
Is RTP required? Does UWP (Windows) provide a way to achieve this?
Thank you
P.S.: The UWP app runs on a HoloLens.
You can use WebRTC to easily transmit live video from the HoloLens to any target. That's probably the easiest way to do it without going really low-level.
For an introduction, just grab this repo and try the sample app, which runs perfectly on the HoloLens: https://github.com/webrtc-uwp/PeerCC/tree/e95f231e1dc9c248ca2ffa040276b8a1265da145/Client
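Independently of the WebRTC route, if you keep the plain TCP socket described in the question, the frame-extraction side on Linux could look roughly like the sketch below. It assumes the UWP app sends a bare h.264 elementary stream at 1280x720 to port 5000; all of these are illustrative values, and the decoding is delegated to an ffmpeg subprocess.

import socket
import subprocess
import threading

import numpy as np

WIDTH, HEIGHT, PORT = 1280, 720, 5000  # assumed values

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", PORT))
server.listen(1)
conn, _ = server.accept()

decoder = subprocess.Popen(
    ["ffmpeg", "-f", "h264", "-i", "-",            # h.264 elementary stream on stdin
     "-f", "rawvideo", "-pix_fmt", "bgr24", "-"],  # raw BGR frames on stdout
    stdin=subprocess.PIPE, stdout=subprocess.PIPE,
)

def pump():
    # Forward the TCP bytes into the decoder.
    while True:
        data = conn.recv(65536)
        if not data:
            break
        decoder.stdin.write(data)
    decoder.stdin.close()

threading.Thread(target=pump, daemon=True).start()

frame_size = WIDTH * HEIGHT * 3
while True:
    raw = decoder.stdout.read(frame_size)
    if len(raw) < frame_size:
        break
    frame = np.frombuffer(raw, np.uint8).reshape(HEIGHT, WIDTH, 3)
    # ... `frame` is a normal OpenCV/numpy BGR image; extract or process it here ...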

Jack2 internal audio routing in Ubuntu 14.04

I am trying to accomplish the following in Ubuntu 14.04.
I have installed the SIP client Linphone and want to connect its audio to Adobe Connect that runs in a browser (Firefox, for instance). So what I need is a two-way communication such that:
Linphone audio output --> Adobe Connect audio input
Adobe Connect audio output --> Linphone audio input
I have understood that Jack2 (http://jackaudio.org/) is supposed to be able to route audio between different applications. I guess what I have to do here is configure it so that all the audio that comes from Firefox is routed to Linphone's input and all the audio that comes from Linphone is routed to Firefox's input.
I succeeded in installing Jack2 with QjackCtl, but I am unable to configure it. When going to "Connect -> Audio" I was expecting to be able to select among the various running applications in order to reroute the audio. Instead, all I can do is connect my microphone's input to either of my speakers.
What would be the right workflow to follow here? Do I have to configure some virtual microphones/speakers to make it work? If so, how?
Any help would be greatly appreciated.
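There is no answer in this thread, but once both applications actually show up as JACK clients (for browser audio that usually means bridging PulseAudio into JACK), the cross-connections described above can also be made programmatically. A sketch using the JACK-Client Python package, with made-up client/port names standing in for whatever QjackCtl shows on your system:

import jack

client = jack.Client("router")
client.activate()

# Inspect what is actually available; the port names below are assumptions.
for port in client.get_ports(is_audio=True):
    print(port.name)

# Linphone output -> browser (Adobe Connect) input, and the reverse direction.
client.connect("Linphone:out_left", "PulseAudio JACK Source:front-left")
client.connect("PulseAudio JACK Sink:front-left", "Linphone:in_left")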

AUNetSend to Linux

I've got a Mac laptop and I'm looking to stream live audio to a Linux box.
I very much like the AUNetSend module in AU Lab on the Mac and would like to use it to stream the audio to a Linux machine. Is this possible?
I've tried connecting through VLC, and there seems to be some sort of HTTP server involved, but I'm not really sure...
Any ideas on whether it is possible, how I could do it, or any other ways to send live audio from a microphone input (Soundflower) on a Mac to a Linux box with very low latency?
