I have a Docker container through which I want to pass audio, altering the sound with various standard utilities before passing it on to a virtual sound card that will stream to another destination as an Icecast2 source.
The host machine is a server without a sound card. I have loaded the ALSA loopback module (snd-aloop) and exposed /dev/snd/* to Docker with "docker --device /dev/snd:/dev/snd", but I still cannot access JACK2, ALSA, or OSS in order to manipulate the stream.
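For reference, the setup described above amounts to roughly the following (the image name is only an illustration):

# On the host: load the ALSA loopback module, which creates a virtual sound card
sudo modprobe snd-aloop
# Run the container with the sound devices passed through
docker run -it --device /dev/snd:/dev/snd ubuntu:22.04 bash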
Is it not possible to get a virtual sound card running in Docker?
I would like to play audio on my Raspberry Pi; the audio player (a console application) should run in a Docker container. I have seen multiple articles on the web, and they commonly suggest adding --device /dev/snd to the docker run command. I just can't seem to get the audio through. Audio should be played on the device attached via HDMI cable. I tried the sox player, took Ubuntu 18.04 as the base image, and also tried this sample solution. I'm flexible about what the media player application is, as this is a hobby project (it would be nice if I could "pipe" an mp3 file to the program via bash while it is being streamed from somewhere). On the host I recently installed "Raspberry Pi OS with desktop and recommended software" (release date October 30th 2021, kernel version 5.10, whatever came with it). The board is a Pi 2 Model B 1.1 (2014).
UPDATE: Audio does play on the 3.5mm jack
It seems that with the --device /dev/snd option, audio does play on the 3.5mm jack. I don't know, though, how to make this work over HDMI.
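On Raspberry Pi OS, the output route of the onboard bcm2835 audio device can usually be switched with amixer; control numid=3 selects the route (0 = auto, 1 = 3.5mm jack, 2 = HDMI). This is only a sketch and assumes the container sees the same ALSA mixer controls as the host:

# Force the bcm2835 audio output to HDMI
amixer cset numid=3 2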
I was wondering if it's possible to get the processes that are using the sound card at a specific time. For instance, I just want to know whether any song is currently playing in Spotify, Chrome, or whatever. Thank you in advance.
As far as I am aware, a Linux application can produce sound either via PulseAudio or by directly accessing ALSA (Advanced Linux Sound Architecture), which forms the foundation of the Linux sound stack.
To see the processes utilizing ALSA, run the following command as root:
lsof /dev/snd/*
You will mostly see that pulseaudio itself is holding these devices. Now, to see the apps using sound devices via PulseAudio, use:
pacmd
>>> list-clients
That should give you a list of apps accessing PulseAudio, with each client's process ID visible there.
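On newer systems where pacmd is not available (for example under PipeWire's PulseAudio replacement), pactl provides the equivalent listing:

pactl list clients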
I want to expose my processed OpenCV output in Python as a local webcam stream so that other programs can use it like any other webcam. Is it possible? Are there any libraries that can do it?
I've read through some Stack Overflow questions and found this: Stream OpenCV output as emulated webcam?, which is pretty similar to my problem (but in Java/C++).
instead of doing:
cv2.imshow("...", output)
every frame, I want to have a stream to which I can supply images and which other programs would then treat as a webcam.
I also went through this trouble and, believe me, I found a much better solution that works on every platform.
Just install DroidCam on your machine and find the port it uses to connect to the mobile application; normally it is port 4747. Then feed it using a Python socket: start a server on localhost at port 4747, and the DroidCam client will connect to it and stream video from it.
DroidCam can then be used with any other software, such as Zoom.
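For what it's worth, a commonly used alternative on Linux is the v4l2loopback kernel module, which creates a real /dev/video device that other programs see as a webcam. A sketch of that route, where process_frames.py is a hypothetical script writing raw BGR frames to stdout, might look like this:

# Create a virtual camera at /dev/video10
sudo modprobe v4l2loopback video_nr=10 card_label="OpenCV Cam" exclusive_caps=1
# Feed raw BGR frames from the (hypothetical) OpenCV script into the device
python3 process_frames.py | ffmpeg -f rawvideo -pixel_format bgr24 -video_size 640x480 -framerate 30 -i - -f v4l2 /dev/video10

The pyvirtualcam Python package wraps the same mechanism if you would rather stay inside the Python script.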
I am trying to build a device that will encode h.264 video on a Raspberry Pi and stream it out to a separate web server in the cloud. The main issue I am having is that most implementations I find either have the web server directly on the Pi or have the embedded player play video directly from the device.
I would like it to be pretty much plug and play no matter what network I am on, i.e. no port forwarding of any sort; all I should need to do is connect the device to the network, and the stream becomes visible on a webpage.
One possible solution is simply to encode frames as base64 JPEGs and send them to an endpoint on the web server; however, this is a huge waste of bandwidth and won't allow the frame rate h.264 would.
Any idea on some possible technologies that could be used to do this?
I feel like it can be done with WebSockets or ZeroMQ and FFmpeg somehow, but I am not sure.
It would be helpful if you could provide more description of the architecture of the device. Since it is an RPI, it is probably also being used for video acquisition via the camera expansion port. If this is the case, you can access the video device and do quite a bit with respect to streaming using the combination of available command line tools.
Something like the following will produce an RTMP stream from the video camera host.
raspivid [preferred options] -o - | ffmpeg -i - [preferred options] rtmp://[IP ADDR]/[location]
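As one concrete, hedged instance (every value here is illustrative; substitute your own server address and stream path):

# Capture 720p30 H.264 indefinitely and hand the elementary stream to FFmpeg,
# which wraps it in FLV without re-encoding for RTMP delivery
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 2000000 -o - | ffmpeg -f h264 -i - -c:v copy -f flv rtmp://example.com/live/stream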
From there, FFmpeg will do a lot of heavy lifting for you.
This will now enable remote hosts to access the RTMP stream.
Other tools that would complement this architecture are media servers that ingest the RTMP stream from the RPi host and make it available to a variety of clients, such as a player in a webpage. A quick look shows that ffserver is obsolete (it was removed from FFmpeg in 2018), but there are analogous components, for example nginx with its RTMP module.
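As a hedged illustration of such a component, an nginx build with the RTMP module can ingest the stream with a configuration along these lines (the application name "live" is arbitrary):

rtmp {
    server {
        listen 1935;            # default RTMP port
        application live {      # stream URL becomes rtmp://host/live/<key>
            live on;
            record off;
        }
    }
}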
I want to cast live streaming video from a Linux machine, i.e. a Raspberry Pi, to another Linux/Windows/Android machine.
Both systems have a 4G Internet connection (no public IP).
Is there a way I can accomplish this?
Please help me.
There are several ways to achieve this. One of the widely used options is GStreamer, a media framework.
You can use UDP to carry the data from your Pi to the host machine.
GStreamer allows you to stream video with very low latency.
A reference for GStreamer on the Pi.
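A minimal sketch of such a pipeline, assuming the Pi has a V4L2 camera and that the receiver is reachable at RECEIVER_IP (since neither machine has a public IP, you would typically need a VPN or relay in between):

# Sender (Raspberry Pi): capture, encode H.264 with low latency, packetize as RTP, send over UDP
gst-launch-1.0 v4l2src ! videoconvert ! x264enc tune=zerolatency bitrate=1000 ! rtph264pay ! udpsink host=RECEIVER_IP port=5000

# Receiver: depacketize, decode, and display
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink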