force vlc to load ui on raspberry booted to cli - linux

I've been googling for this all week, struggling to find a good solution.
I have a training video kiosk script that I've set up for my company, currently running on an Intel NUC. To me, that feels like overkill, so I'm trying to get the script to run on a Pi 3 Model B to save a little on deployment costs.
My script works great from the desktop. I've been able to get it to run on startup from the CLI, and I can even load the videos with the dummy UI. The problem is that there's no sound, and when the video loads in the CLI, it fills the screen with errors and then plays as text, like the picture below.
If I run it from the desktop, it's fine (just really jittery).
Is there a way to force VLC to load its interface without loading the Raspbian desktop?
Right now, when I call a video, the terminal command looks like:
vlc-wrapper <file path> --play-and-exit --fullscreen -Idummy
(screenshot: the video playing as text in the terminal)

Got it to load the video from the command line:
X & vlc <filepath>
Also got it to run from Python:
import subprocess

# Start a bare X session with vlc-wrapper as its only client
subprocess.call(['xinit', '/usr/bin/vlc-wrapper', '<filepath>'])
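Putting the pieces together, a minimal sketch of the kiosk call might look like this (the video path is a placeholder; vlc-wrapper and the flags are the ones from the question):

import subprocess

def play_video(path):
    # Start a bare X session whose only client is VLC; when VLC exits
    # (--play-and-exit), the X session ends and control returns to the CLI.
    subprocess.call([
        'xinit',
        '/usr/bin/vlc-wrapper', path, '--fullscreen', '--play-and-exit',
    ])

play_video('/home/pi/videos/training1.mp4')  # placeholder path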

Related

Python script started at boot crashes when serial.write() sends data over serial - Raspberry Pi and Arduino

I have a script which reads a txt file (only numbers inside), and this script starts at boot (before the Raspbian GUI starts). Everything works fine; I can send text as bytes:
def send_serial_stop(self):
    self.serialport.write(b'stop')
When I try to run this action (reading the txt file and sending the data over serial), my script crashes.
When I start the script from the Raspbian desktop, everything works fine and the script doesn't crash. What's wrong with this code? I can't read any errors because the app is in fullscreen mode. Any tips?
Please help me, because I have to finish this project by the end of this week :(
def send_serial(self):
    file = open('testprog.txt').read()
    self.serialport.write(file.encode())
EDIT
When I press the button connected to the code above, the script is terminated and the GUI application closes.
The application and its GUI are based on the PyQt5 framework.
There are around 150 lines of code in total.
The txt file has 24 characters (only numbers).
If I run this script in PyCharm, everything works fine; when I start it from the terminal, the situation is the same as during startup.
The problem is probably that you use a relative path when opening the text file. Change it to an absolute path:
file = open('/absolute/path/to/your/file/testprog.txt').read()
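If you'd rather not hard-code the path, a variation on the same idea is to resolve the file relative to the script itself, since the working directory at boot is usually not the project folder (a sketch; it assumes testprog.txt sits next to the .py file):

import os

# Directory containing this script, independent of whatever working
# directory the script was started from at boot.
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))

def send_serial(self):
    path = os.path.join(SCRIPT_DIR, 'testprog.txt')
    with open(path) as f:
        self.serialport.write(f.read().encode())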

How to display PyGame framebuffer using systemd service?

I have created an ambilight clone as a personal learning project. It uses a USB webcam to capture the required RGB data from the TV; it is currently fully functional. My problem is that my RasPi is currently headless and I would like to be able to show stuff on the HDMI output, e.g. fill the screen with RGB(0,255,0), which is used for finding the TV screen in the webcam image.
Currently, I am using a class called Hdmi. It uses PyGame's surfarray, which allows showing NumPy arrays on the framebuffer. This works just fine when running the code using "sudo $(which python) webcambilight.py". But when using systemd, apparently there is no surface. It doesn't give any errors; it just stops running. If I remove the Hdmi class, everything works.
This narrows the problem down to this piece of code:
os.putenv('SDL_FBDEV', '/dev/fb0')
os.putenv('SDL_VIDEODRIVER', 'fbcon')
pygame.display.init()
I am not very experienced with Linux, but my understanding is that there is no fb0 when running from systemd? I know that services are usually for things that run in the background. But this is a special case. I want to launch the service automatically when the device is turned on. Ideally I would never have to log into the device using SSH.
Based on another question/answer on Stack Overflow, I've tried this code in the .service file:
Environment="DISPLAY=:0"
Environment="XAUTHORITY=/home/pi/.Xauthority"
This didn't help, which I think is because I have no display. fb0 and fbcon are not really on DISPLAY=:0, right?
My .service file's contents are currently:
[Unit]
Description=Webcambilight
After=network.target
[Service]
Type=idle
WorkingDirectory=/home/pi/webcambilight
ExecStart=/home/pi/.virtualenvs/py3cv4/bin/python -u webcambilight.py
[Install]
WantedBy=multi-user.target
NOTE! This works just fine when Hdmi is not in use. But I would love to use it. Right now, if I accidentally move my webcam or my TV, I have to open YouTube on the TV, play a green-screen video, and then press the calibration button on my Raspberry Pi (which is a GPIO push button).
What I would want to do instead is change the TV input to HDMI 4, which is connected to the RasPi. Then, by pressing the GPIO calibration button, my Hdmi class would fill the whole 1920x1080 framebuffer with (0,255,0).
Sooo... any ideas on accessing the framebuffer (/dev/fb0?) while running as a systemd service?
The full code of Hdmi class is at: https://github.com/sourander/webcambilight/blob/master/wambilight/hdmi.py
Apparently, this fixes it. systemd is sending a hangup signal (SIGHUP) for reasons beyond my Linux knowledge.
import signal

def handler(signum, frame):
    pass  # ignore SIGHUP so the process keeps running

signal.signal(signal.SIGHUP, handler)
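Putting that together with the SDL setup from the question, a minimal sketch of the calibration fill might look like this (the 1920x1080 resolution and /dev/fb0 are taken from the question; everything else is an assumption):

import os
import signal

import pygame

def handler(signum, frame):
    pass  # swallow the SIGHUP that systemd sends

signal.signal(signal.SIGHUP, handler)

# Point SDL at the framebuffer console, as in the question.
os.putenv('SDL_FBDEV', '/dev/fb0')
os.putenv('SDL_VIDEODRIVER', 'fbcon')

pygame.display.init()
screen = pygame.display.set_mode((1920, 1080))
screen.fill((0, 255, 0))   # calibration green
pygame.display.update()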

Terminal not clearing after auto-starting a bash script in Raspbian Wheezy

I'm using a Raspberry Pi 2 to show all the videos in a folder. The Raspberry Pi automatically boots up (with a generic electric timer) into the console (not the GUI), and after it boots it runs a bash script I found here. This bash script contains an infinite loop to play all the videos in a folder using omxplayer.
When I boot into console mode and manually start the script, everything works perfectly. The terminal screen clears, the first video starts, and after it ends there is a second or two of black screen (empty terminal) before the second video starts playing. This is exactly what I want.
However, when I use crontab to start this script (@reboot /path/to/script.sh), the terminal messages stay and it doesn't clear everything between videos.
I've tried creating my own script that first clears everything and then calls the second script, but this doesn't work.
I'm really, really new to this field (but I'm having fun), so any pointers in the right direction would be appreciated!
P.S. I edited the /boot/cmdline.txt file so it doesn't display critical kernel logs, as a workaround.
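For reference, the behaviour described above boils down to a loop like the following (shown as a Python sketch rather than the original bash script; the video folder is a placeholder):

import glob
import subprocess

VIDEO_DIR = '/home/pi/videos'  # placeholder for the folder of videos

while True:
    for video in sorted(glob.glob(VIDEO_DIR + '/*.mp4')):
        subprocess.call('clear')                 # blank the console between videos
        subprocess.call(['omxplayer', video])    # plays fullscreen, returns when done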
You should not be doing this using cron. You should be changing the inittab so that it runs outside any environment that cron may create. See the inittab(5) man page for details. You may also be interested in openvt(1).

no picture with fswebcam

A month ago, I tested the following code to take a picture with my webcam connected to a Raspberry Pi (I used Python 3).
import os
import datetime

# Capture a 640x480 frame from /dev/video0 and save it with a timestamped name
os.system("fswebcam -d /dev/video0 -r 640x480 /home/pi/Documents/%s.jpeg"
          % datetime.datetime.utcnow().strftime("%Y-%m-%d--%H-%M-%S"))
It worked well and I had no problem, but since a week or so ago it doesn't work anymore. The code returns no error message, but there is no picture in the 'Documents' folder. Everything seems to indicate that pictures are taken, but I cannot find them.
I looked for the pictures in the other folders but couldn't find them.
I updated the Raspberry Pi but it changed nothing.
I tried to run the command on its own from the command line:
fswebcam -r 640x480 test.jpeg
The picture is taken and appears in the /home/pi/ folder.
I tried to run the script as a superuser, but it opens the help menu of fswebcam. (???)
Does one of you have an idea why it doesn't work anymore, what I did wrong, and where my pictures are?
Your script might have been changed; check it, or try saving to another folder.
Also, the Raspberry Pi has no real-time clock, so keep that in mind if you name the pictures according to the date: new pictures may well be saved in the Documents folder but overwrite older pictures whose names collide.
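One way to see what is actually going on is to call fswebcam through subprocess and print both the target file name and whatever fswebcam itself reports (a sketch using the flags and folder from the question):

import datetime
import subprocess

# Same command as in the question, but with fswebcam's output captured
# so any error (missing device, permissions, bad path) becomes visible.
filename = '/home/pi/Documents/%s.jpeg' % datetime.datetime.utcnow().strftime('%Y-%m-%d--%H-%M-%S')
result = subprocess.run(
    ['fswebcam', '-d', '/dev/video0', '-r', '640x480', filename],
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True,
)
print('saving to:', filename)
print(result.stdout)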

Render swf to png or other image format

How can I, on Linux, render a SWF to an image file?
I need to be able to load other SWFs into that SWF and run ActionScript code.
Is it even possible on Linux? I need to do it from PHP; it's fine if I have to use command-line tools.
swfrender from swftools works for basic SWF files.
swfdec-thumbnailer from swfdec-gnome works, though it only gets the first frame of the SWF.
To get an arbitrary frame from a SWF using swfdec, see the C code snippet in the following mailing list post.
gnash from the gnash package also works: gnash -s<scale-image-factor> --screenshot last --screenshot-file output.png -1 -r1 input.swf captures the last image of the SWF.
ffmpeg also works for some SWF formats: ffmpeg -i movie.swf -f image2 -vcodec png movie%d.png
Also see the following guide for a command-line pipeline.
In order to call external programs from PHP, you use the exec function documented here.
Note that for security reasons it is important to escape arguments passed to exec with a function like escapeshellcmd or escapeshellarg.
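For comparison, the same call pattern in Python (not PHP) sidesteps shell escaping entirely by passing the arguments as a list; the command is the ffmpeg invocation shown above:

import subprocess

# No shell is involved when the command is given as a list,
# so there is nothing to escape.
subprocess.run(
    ['ffmpeg', '-i', 'movie.swf', '-f', 'image2', '-vcodec', 'png', 'movie%d.png'],
    check=True,
)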
Once you have converted to an image format, whether for a single frame or all frames, you can't run ActionScript. Other non-GNU/Linux tools support exporting the ActionScript from a SWF.
If the SWF that you are exporting to PNG is too complicated for the other tools, then you can use the Flash Plugin or Gnash together with Xvfb and screen capture software to capture either image frames of the SWF or a video format like AVI. Then you can extract the images from the video.
This virtual framebuffer method will support complicated SWF files, though it requires a lot of work, as you need to use either Gnash, Xvfb, and screen capture, or a browser, Xvfb, and Selenium if you want to capture a certain set of mouse/keyboard interactions with the SWF.
Gnash, with and without the virtual framebuffer, should load the ActionScript before exporting, but may have issues with complicated ActionScript. The Flash Plugin with the virtual framebuffer will load the ActionScript before exporting.
Also see the following Stack Overflow questions, which your question is a duplicate of:
Convert SWF to PNG
Render Flash (SWF) frame as image (PDF,PNG,JPG)
SWF to image (jpg, png, …) with PHP
This is the solution I ended up using.
You can use a tool like Xvfb (X11 server) and run the standalone flash player projector inside it (you may need to install a bunch of 32-bit libraries), then use a screen capture utility like import to capture the screen and crop it to size.
I found this page on rendering SWF screenshots in Linux helpful. It also says that you can use Gnash to do this; however, Gnash won't work for Flash Player 9+.
Try this AIR application: http://swfrenderer.kurst.co.uk
It renders the SWF frame by frame.
