I'm not familiar with mplayer, and its tons of options seem like a jungle to me, when all I want to do is show an endlessly looped GIF animation in full screen on my RPi running Raspbian.
Can anyone help me, please?
I am currently working on Windows and using
mplayer -fs -mf fps=1:type=jpeg mf://*.jpg -vo direct3d -ao dsound -sub-fuzziness 1 -loop 100
to play jpg files in mplayer in fullscreen mode. I didn't test on Linux, but I think
mplayer -fs -mf fps=1:type=gif mf://*.gif -sub-fuzziness 1 -loop 100
should work for you too (dropping the Windows-only -vo direct3d and -ao dsound options).
I video DJ using the RPi. Making mplayer full-screen is often harder than it should be. The switch to make mplayer full-screen is -fs. The problem is that many video outputs (-vo) do not support full-screen. If the -vo does not support -fs, I often get around this by using the RCA connection instead of the HDMI on a screen that does not display the entire frame. Try mplayer -fs file.gif and report back. Which -vo are you using?
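For the endless loop the question asks about, adding -loop 0 should do it (0 means repeat forever in mplayer), e.g.:
mplayer -fs -loop 0 file.gif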
I have jitsi and jibri installed on the same server, and everything is working great, except that the recording captures only audio, without video.
These are the Chrome settings:
"--headless",
"--ignore-certificate-errors",
"--use-fake-ui-for-media-stream",
"--start-maximized",
"--kiosk",
"--enabled",
"--enable-logging",
"--disable-infobars",
"--autoplay-policy=no-user-gesture-required",
"--disable-setuid-sandbox"
I successfully generate an MP4 file, but it has only a black screen with audio.
Update:
All I have after the recording is done is an MP4 video file with:
a fixed background (the earth with penguins around it and a black sky full of stars), no real video
sound (good, clear audio)
Any ideas?
thanks
I just found the problem:
removing "--headless" solves the missing video problem.
Command line: nano /etc/jitsi/jibri/jibri.conf
Remove "--headless", then save,
then run: systemctl restart nginx jitsi-videobridge2 prosody jicofo jibri
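For reference, after the edit the flags list is simply the original set minus "--headless"; in the HOCON-style jibri.conf it sits roughly like this (the exact nesting and the surrounding settings depend on your jibri version, so treat this as a sketch):
jibri {
  chrome {
    flags = [
      "--ignore-certificate-errors",
      "--use-fake-ui-for-media-stream",
      "--start-maximized",
      "--kiosk",
      "--enabled",
      "--enable-logging",
      "--disable-infobars",
      "--autoplay-policy=no-user-gesture-required",
      "--disable-setuid-sandbox"
    ]
  }
}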
problem solved
thanks
If I execute:
speaker-test -D plughw:1,0 -t wav, sound gets played over the HDMI cable, as it is supposed to.
But if I type speaker-test -t wav, no sound gets played over the HDMI cable, so I guess the issue is the default output device.
Here is a list of what I've already tried:
changing the output device in the top left corner of the desktop
executing amixer cset numid=3 2 in the console
uncommenting these 2 lines in the config:
hdmi_drive=2
hdmi_force_hotplug=1
changing these 2 lines in the alsa.conf file:
defaults.ctl.card 1
defaults.pcm.card 1 (also tried with 2 instead of 1)
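(For completeness, the per-user equivalent of those alsa.conf lines would be an ~/.asoundrc along these lines, with the card number being whatever aplay -l reports for the HDMI device; this is sketched from memory, so treat it as a starting point:)
pcm.!default {
    type hw
    card 1
}
ctl.!default {
    type hw
    card 1
}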
My goal is to later play sound from a browser, but Chrome seems to only be able to output sound over the default audio output device, so I guess I somehow need to change it.
I am a noob, so please tell me if you need more information and how to get it.
If anyone has some hints on how to solve this issue, I'd be really grateful.
I had the same problem (Pi 4, Bullseye).
What helped was:
sudo raspi-config
1. System Options
S2. Audio
Select the HDMI option (in my case it was HDMI 2, because the default output was on HDMI 1)
After that the default output worked. Everything else (changing config.txt or alsa.conf) wasn't necessary in my case.
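(After that, a plain speaker-test -t wav without -D should come out over HDMI. If you'd rather script it than use the menu, raspi-config also has a non-interactive mode, something like sudo raspi-config nonint do_audio 2, but the numeric argument mapping differs between OS releases, so double-check it on your version.)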
reference: Raspberry Pi Configuration
At the time, I tried to get into the configuration via
sudo raspi-config
as #reallyATypo suggested, but it seemed to be broken somehow. Even navigating to it in the desktop environment didn't work...
Since there were those weird things going on with the OS and there wasn't much data on the RasPi anyway, I decided to set up an entirely new OS on the same SD card. There, the default configuration was already correct, and the problem with the OS was gone as well.
Thank you all for answering :)
I don't know if that is the right thing to do, but I'm going to mark my answer as the working one in order to close the issue.
I have created an Ambilight clone as a personal learning project. It uses a USB webcam to capture the required RGB data from the TV; it is currently fully functional. My problem is that my RasPi is currently headless, and I would like to be able to show stuff on the HDMI output; e.g. fill the screen with RGB(0,255,0), which is used for finding the TV screen in the webcam image.
Currently, I am using a class called Hdmi. It uses PyGame's surfarray, which allows showing NumPy arrays on the framebuffer. This works just fine when running the code using "sudo $(which python) webcambilight.py". But when running it via systemd, apparently there is no surface. It doesn't give any errors; it just stops running. If I remove the Hdmi class, everything works.
This narrows the problem down to this piece of code:
os.putenv('SDL_FBDEV', '/dev/fb0')       # point SDL at the first framebuffer device
os.putenv('SDL_VIDEODRIVER', 'fbcon')    # use SDL's framebuffer console driver (no X server needed)
pygame.display.init()
I am not very experienced with Linux, but my understanding is that there is no fb0 when running from systemd? I know that services are usually for things that run in the background. But this is a special case. I want to launch the service automatically when the device is turned on. Ideally I would never have to log into the device using SSH.
Based on another question/answer on Stack Overflow, I've tried this code in the .service file.
Environment="DISPLAY=:0"
Environment="XAUTHORITY=/home/pi/.Xauthority"
This didn't help, which I think is because I have no display. Fb0 and fbcon are not really on display=:0, right?
My .service file's contents are currently:
[Unit]
Description=Webcambilight
After=network.target
[Service]
Type=idle
WorkingDirectory=/home/pi/webcambilight
ExecStart=/home/pi/.virtualenvs/py3cv4/bin/python -u webcambilight.py
[Install]
WantedBy=multi-user.target
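(Side note: the SDL variables that the Python code sets via os.putenv could presumably also be set in the unit itself, in the [Service] section:
Environment="SDL_FBDEV=/dev/fb0"
Environment="SDL_VIDEODRIVER=fbcon"
though I haven't verified whether that changes anything.)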
NOTE! This works just fine when Hdmi is not in use. But I would love to use it. As it is now, if I accidentally move my webcam or my TV, I have to open YouTube on the TV, play a green-screen video, and then press the calibration button on my Raspberry (which is a GPIO push button).
What I would want to do instead is change the input to HDMI 4, which is connected to the RasPi. Then, by pressing the GPIO calibration button, my Hdmi class would fill the whole 1920x1080 framebuffer with (0,255,0).
Sooo... any ideas on accessing the framebuffer (/dev/fb0?) while running as a systemd service?
The full code of Hdmi class is at: https://github.com/sourander/webcambilight/blob/master/wambilight/hdmi.py
Apparently, this fixes it. systemd is sending a hangup signal (SIGHUP) for reasons beyond my Linux knowledge.
import signal

# systemd delivers a SIGHUP; installing a do-nothing handler keeps the process alive
def handler(signum, frame):
    pass

signal.signal(signal.SIGHUP, handler)
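Since the handler does nothing, the same thing can be written as a standard-library one-liner:
signal.signal(signal.SIGHUP, signal.SIG_IGN)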
I have a script that parses a bunch of events from a given movie file and uses the -ss and -endpos flags to play specific portions of the file. Let's say there is one file available. What I want to do is run a chained command like:
mplayer vidfile.mp4 -ss 110 -endpos 5 -fs; mplayer vidfile.mp4 -ss 130 -endpos 9
The idea is to have both sections of the video play in full screen, sequentially. However, there is major flicker between the two videos as it exits to the desktop briefly. (Ubuntu 12.04.)
How can I keep mplayer open or make the switch less jarring? Doing this with 30 clips would give someone a massive headache, and I'd like to see if it can be done this way without writing a script to cut the movies and put them together (which sounds like a nightmare, but if anyone has ideas, please post those in the comments).
Not sure when it was introduced, but mplayer now supports this via the -fixed-vo parameter. From the man page:
-fixed-vo
Enforces a fixed video system for multiple files (one (un)initialization for all files).
Therefore only one window will be opened for all files.
Currently the following drivers are fixed-vo compliant:
gl, gl_tiled, mga, svga, x11, xmga, xv, xvidix and dfbmga.
In my testing this works well with fullscreen (-fs): no flicker.
Example usage:
mplayer -fs -fixed-vo thats_marvellous.mp4 cliff.mp4
I ended up solving it with a quick solution that a friend suggested: take the semicolon out, since mplayer supports multiple files AND flags. (This works for a few files for now; I will update if I run into problems down the line.)
mplayer vidfile.mp4 -ss 110 -endpos 5 -fs vidfile.mp4 -ss 130 -endpos 9 -fs
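If a flash between clips ever shows up again, it should also be possible to combine this with -fixed-vo from the other answer (an untested combination here, but both are plain mplayer options):
mplayer -fs -fixed-vo vidfile.mp4 -ss 110 -endpos 5 vidfile.mp4 -ss 130 -endpos 9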
Please help, I need to play wave files under X11. Is there any API in X11 like PlaySound in Windows? Thanks in advance.
You can use aplay, mplayer, vlc, mpg123, mpg321, etc.
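For a wav file, the simplest of these is probably ALSA's command-line player, e.g.:
aplay file.wav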
For raw (non-MP3, uncompressed) wav files, a simple cat should work:
cat file.wav > /dev/dsp
If you need an API way to do it, this is a simple example of how to play a wav file using the ALSA API.
However, this has nothing to do with X11.