AUNetSend to Linux

I've got a Mac laptop and am looking to stream live audio to a Linux box.
I very much like the AUNetSend module in AU Lab on the Mac, and would like to use it to stream the audio to a Linux machine. Is this possible?
I've tried connecting through VLC, and there seems to be some sort of HTTP server going on there, but I'm not really sure...
Any ideas on whether it is possible and how I could do it, or any other way to send live audio from a microphone input (Soundflower) on a Mac to a Linux box with very low latency?

Related

Play video from one device to another

I'm looking to use two devices: a Raspberry Pi 3 and a Mac running 10.15. I am using the Pi to capture video from my webcam, and I want my Mac to act as an extension of the Pi, so that when I call cv2.VideoCapture on the Mac I can capture that same video, preferably in real time or something close to it. I'm programming this in Python on both devices. I thought of putting the stream on a local server and retrieving it, but I have no idea how I could use that with OpenCV. If someone could provide and explain a useful example, I would greatly appreciate it. Thank you.
To transfer a video stream, instead of a custom solution you could run an RTMP server on the source machine, feed it the camera source, and have the target open the stream and process it.
A similar approach is widely implemented in IP cameras: they run an RTMP server to make the stream available to phones and PCs.
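On the consuming side, OpenCV can open such a network stream directly when it is built with FFmpeg support, which is the usual case for the prebuilt opencv-python wheels. A minimal sketch for the Mac side, assuming a hypothetical stream URL rtmp://raspberrypi.local/live/cam published from the Pi:

import cv2

# Hypothetical URL of the RTMP stream published from the Pi;
# cv2.VideoCapture hands network URLs to its FFmpeg backend.
cap = cv2.VideoCapture("rtmp://raspberrypi.local/live/cam")

while cap.isOpened():
    ok, frame = cap.read()  # grab and decode the next frame
    if not ok:
        break
    # process the frame as usual, e.g. display it
    cv2.imshow("pi-stream", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()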

Python3: emulate local webcam device with OpenCV on win10

I want to stream my processed OpenCV output in Python as a local webcam stream, so that other programs can use it like any other webcam. Is it possible? Are there any libraries that can do it?
I've read through some Stack Overflow questions and found "Stream OpenCV output as emulated webcam?", which is pretty similar to my problem (but in Java/C++).
Instead of calling
cv2.imshow("...", output)
on every frame, I want a stream that I can supply images to and that other programs would then treat as a webcam.
I also went through this trouble, and believe me, I found a much better solution for every platform.
Just install DroidCam on your machine. Find the port it uses to connect with the mobile application; normally it is port 4747. Then connect to it from Python: the client serves the stream on localhost at port 4747, and you can read video from there.
DroidCam can also be used with any other software, like Zoom.
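As a sketch of the reading side: rather than a raw socket, DroidCam's feed can typically be opened straight from OpenCV over HTTP. The default port 4747 and the /video path are assumptions here; check what your installation actually exposes:

import cv2

# Assumed DroidCam endpoint: default port 4747 and the conventional
# /video path; adjust host and port to match your installation.
cap = cv2.VideoCapture("http://127.0.0.1:4747/video")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    processed = cv2.flip(frame, 1)  # stand-in for your real processing
    cv2.imshow("droidcam", processed)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()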

Using ffmpeg to stream live video from a raspberry pi to a web server for distribution

I am trying to build a device that will encode h.264 video on a Raspberry Pi and stream it out to a separate web server in the cloud. The main issue I am having is that most implementations I find either have the web server directly on the Pi or have the embedded player playing video directly from the device.
I would like it to be pretty much plug-and-play no matter what network I am on, i.e. no port forwarding of any sort; all I should need to do is connect the device to the network, and the stream becomes visible on a webpage.
One possible solution is to simply encode frames as base64 JPEGs and send them to an endpoint on the web server; however, this is a huge waste of bandwidth and won't allow the framerate that h.264 would.
Any ideas on possible technologies that could be used to do this?
I feel like it can be done with some combination of websockets or zmq and ffmpeg, but I am not sure.
It would be helpful if you could provide more description of the architecture of the device. Since it is an RPI, it is probably also being used for video acquisition via the camera expansion port. If this is the case, you can access the video device and do quite a bit with respect to streaming using the combination of available command line tools.
Something like the following will produce an RTMP stream from the video camera host.
raspivid [preferred options] -o - | ffmpeg -i - [preferred options] rtmp://[IP ADDR]/[location]
From there, FFmpeg will do a lot of heavy lifting for you.
This will now enable remote hosts to access the RTMP stream.
Other tools that would complement this architecture include ffserver, which could acquire the RTMP stream from the RPi host and make it available to a variety of clients, such as a player in a webpage. A quick look shows that ffserver may be obsolete, but there are analogous components.
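As a concrete illustration of that pipeline, here is a sketch that drives it from Python on the Pi. Every raspivid/ffmpeg flag and the RTMP ingest URL below are sample assumptions to adapt, not fixed requirements:

import subprocess

# Sample raspivid flags (assumptions): run forever, 720p at 25 fps,
# roughly 2 Mbit/s, raw H.264 written to stdout.
raspivid = subprocess.Popen(
    ["raspivid", "-t", "0", "-w", "1280", "-h", "720",
     "-fps", "25", "-b", "2000000", "-o", "-"],
    stdout=subprocess.PIPE,
)

# ffmpeg reads the raw H.264 from the pipe, copies it without
# re-encoding, and pushes it to a hypothetical RTMP ingest URL.
ffmpeg = subprocess.Popen(
    ["ffmpeg", "-f", "h264", "-framerate", "25", "-i", "-",
     "-c:v", "copy", "-f", "flv", "rtmp://example.com/live/pi"],
    stdin=raspivid.stdout,
)

raspivid.stdout.close()  # let raspivid see a broken pipe if ffmpeg exits
ffmpeg.wait()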

Is a sound device necessary for an audio streaming server?

My project is to stream audio online with my PC as the server.
I have an HP ProLiant ML110 G7 server PC, which has no integrated sound device on the motherboard, nor any other kind of sound device.
I am currently running Ubuntu 16.04 on this PC, and I cannot configure Icecast and Ices2/DarkIce properly, although I could do it by following the same instructions on a laptop with the same OS and version, which does have an integrated sound device.
Is an integrated sound device needed to run an audio streaming server?
Thank you.
Icecast itself just passes data on through. It requires no sound device at all.
Your source client, such as IceS, can read audio either from a sound device or from files. If you have no sound device, you'll need to use some other audio source, of course.
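To make that concrete: a source client can feed Icecast entirely from files, with no sound card involved anywhere. A minimal sketch using ffmpeg's icecast:// output, driven from Python; the port, mount point, source password (hackme is the placeholder from Icecast's sample config) and filename are all assumptions:

import subprocess

# Re-encode a local file to Ogg/Vorbis and push it to a local
# Icecast mount. No sound device is touched at any point.
subprocess.run([
    "ffmpeg",
    "-re",                 # read the input at its native rate
    "-i", "music.ogg",     # placeholder source file
    "-c:a", "libvorbis",
    "-content_type", "application/ogg",
    "-f", "ogg",
    "icecast://source:hackme@localhost:8000/stream.ogg",
])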

How to make server play sound when receiving request

I want to make a server play a sound bite every time it receives a request. Is there a way to do this with a Go based server? The idea is that the server hosts a browser window; when it receives a request, the browser goes 'ping!'.
It depends on which operating system you want the code to work on. As far as I know, there is no generic cross-platform solution for playing sound from Go:
On Linux you might need to rely on PulseAudio, with a package such as github.com/mesilliac/pulse-simple
On Windows and Mac you could use PortAudio, with a package such as github.com/gordonklaus/portaudio
If you want a practical example, there is a Go based multi-source music player project called "moggio" at github.com/mjibson/moggio that plays audio on Linux, Mac, and Windows.
You can have a look at its github.com/mjibson/moggio/output package, where you will find the code moggio uses to play music on Linux, Windows, and Mac.
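As a rough sketch of the idea (not moggio's actual code): on a Linux host with PulseAudio, one pragmatic shortcut is to skip the audio libraries entirely and shell out to a system player from the request handler. The paplay binary and the ping.wav file are assumptions:

package main

import (
	"log"
	"net/http"
	"os/exec"
)

// playPing shells out to PulseAudio's paplay; this assumes a Linux
// host with PulseAudio and a ping.wav next to the binary.
func playPing() {
	if err := exec.Command("paplay", "ping.wav").Run(); err != nil {
		log.Printf("could not play sound: %v", err)
	}
}

func handler(w http.ResponseWriter, r *http.Request) {
	go playPing() // don't block the HTTP response on playback
	w.Write([]byte("ping!\n"))
}

func main() {
	http.HandleFunc("/", handler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}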
