Want to run a video file from Python? - cron

I want to run the following command from Python every 5 minutes:
--> bash -c "display=:0 cvlc vish.mp4"
If I run this command directly it works fine, but how do I use it in a Python file?
Any suggestions please.
Thank you.

Perhaps APScheduler will help: https://pypi.python.org/pypi/APScheduler. You can use cron-style (periodic) or date-style (one-shot, delayed) scheduling, for example.
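A minimal sketch of how the pieces could fit together. The command is taken verbatim from the question (cvlc and vish.mp4 are assumed to exist on the target machine), and the scheduling shown in comments assumes APScheduler is installed:

```python
import subprocess

def play_video():
    # The exact command from the question; returns the shell's exit code.
    return subprocess.call(["bash", "-c", "display=:0 cvlc vish.mp4"])

# With APScheduler installed, the call can then run every 5 minutes:
#
#     from apscheduler.schedulers.blocking import BlockingScheduler
#     scheduler = BlockingScheduler()
#     scheduler.add_job(play_video, "interval", minutes=5)
#     scheduler.start()
```

Alternatively, the one-liner could stay in crontab itself and simply call the Python script.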

Related

Having issues with a shell command in Python

I am currently studying Python. In my course I have got to reading files, and my tutor showed me the following syntax:
!cat data_file/sample.txt
When he executes this command on screen, it lets him view the text contents of the file.
When I execute it, I get an error saying cat is not recognized as a valid shell command.
I have read through as much Python documentation as I can and have come up with nothing!
Can anyone please help?
This is not Python-specific functionality. Your tutor is probably using Jupyter, which in turn uses IPython.
The bang (!) means that IPython will execute a system shell command, like:
!cat data_file/sample.txt
In this case it outputs the contents of a text file. This functionality is documented here.
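Outside IPython, the same effect can be had in plain Python. A small sketch (the function names are illustrative; the path from the question, or any path, works):

```python
import subprocess
from pathlib import Path

def show_file(path):
    # Pure-Python equivalent of `!cat path`: just read and return the text.
    return Path(path).read_text()

def show_file_via_shell(path):
    # Mirrors the shell command itself; only works where `cat` is available.
    result = subprocess.run(["cat", path], capture_output=True, text=True)
    return result.stdout
```

So `print(show_file("data_file/sample.txt"))` does in any Python interpreter what `!cat data_file/sample.txt` does in IPython.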

Command line script with Python 3 and argparse must have an argument?

Everyone,
I'm creating a command line script based on Python 3.5 on Ubuntu 16.04.
The script accepts some options that I'm handling with argparse, but the most important one is just a URL, which I don't want to pass through a named flag.
This is the way that I want it to work:
<command> <url> --<optional_argument_1> <value_1> ... --<optional_argument_N> <value_N>
And not like this one:
<command> --<url_argument> <url> ....
Is there a way to do this with argparse?
It turns out there is a kind of argument called a "positional argument", and it solved my problem! :)
This is the tutorial that I followed to achieve what I wanted.
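A minimal sketch of the positional-argument approach (the --timeout option is illustrative, not from the question):

```python
import argparse

def build_parser():
    parser = argparse.ArgumentParser(description="Process a URL with optional flags.")
    # Positional: supplied bare on the command line, no flag needed.
    parser.add_argument("url", help="the URL to process")
    # Optional: supplied as --timeout <value>.
    parser.add_argument("--timeout", type=int, default=30, help="timeout in seconds")
    return parser

args = build_parser().parse_args(["http://example.com", "--timeout", "10"])
```

Invoked as `<command> http://example.com --timeout 10`, argparse binds the bare value to args.url, exactly the `<command> <url> --<optional_argument> <value>` shape asked for.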

Can I run a script that uses 2 different command lines?

Sorry if the title is vague; I am fairly new to Linux and I don't really know how else to put it. I am writing a script, and when I run it, it launches Sage, but the next command is never executed. I presume this is because the first couple of commands run in the standard terminal (bash?) and everything after ./sage does not. Here's the script:
#!/bin/bash
cd /home/alex/Desktop/sage-7.6
./sage
#I also tried wait ${!} here but it didn't work
notebook("/home/alex/Desktop/sage-7.6/projects/zero forcing.sagenb")
How might I enter the last command in Sage after it opens (assuming it's possible)? Thanks!
Edit: Here's a picture of my problem. Sage runs but I can't get it to execute the notebook() command after it opens.
You need to run notebook() as Sage code using the -c option mentioned here. Try the code below.
#!/bin/bash
/home/alex/Desktop/sage-7.6/sage # You can run the interactive shell directly
# At this point you have completely exited the sage interactive shell
# Presumably you want to run the below 'notebook()' after every interactive shell
# In that case do
/home/alex/Desktop/sage-7.6/sage -c 'notebook("/home/alex/Desktop/sage-7.6/projects/zero forcing.sagenb")'
I think what you really want is a single command that launches a notebook with a given name.
It turns out that in many Linux/Unix applications, there is automatic help at the command line. Try
/home/alex/.../sage -n -h
to get some help on the notebook. In particular,
sage -n -h --notebook=sagenb
gives a very, very long list of options, the first of which shows that
sage --notebook=sagenb directory=tp
will give you a new Sage notebook server in the directory tp.sagenb.
All this said, I should also point out that sagenb is (sadly) slowly becoming a legacy project in favor of the Jupyter notebook. In Sage 8.0 a conversion from sagenb to Jupyter will become the default, and even now you can just do
sage --notebook=jupyter --notebook-dir=/home/foo/bar
to start it up.

Running a bash job within a python script

I was hoping for some advice on using the subprocess module.
I'm trying to run a bash job within a python script so my bash command (in the right directory) is: ./program myjob.inp
This is just running the executable "program" with myjob.inp being the input file (and my python script constantly updates myjob.inp).
I know that if I just wanted to run "program", I could do something like:
with open("tmp.dat", "w") as fstore_tmp:
    subprocess.call(["./program"], stdout=fstore_tmp)
However, I can't figure out how to run the job with the input file myjob.inp, i.e. the equivalent of ./program myjob.inp. I tried:
with open("tmp.dat", "w") as fstore_tmp:
    subprocess.call(["./program", "myjob.inp"], stdout=fstore_tmp)
However, that doesn't seem to be working. Does anyone have any suggestions? Thanks!
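The argument-list form in the question is the right shape; one common pitfall is the working directory, since "./program" is resolved relative to wherever Python was started, not relative to the script. A hedged sketch (run_job and its parameters are illustrative, not from the question):

```python
import subprocess

def run_job(cmd, out_path, workdir="."):
    # Run `cmd` (a list: executable plus its arguments, e.g.
    # ["./program", "myjob.inp"]) with stdout redirected to out_path.
    # Passing cwd= makes the relative "./program" path unambiguous.
    with open(out_path, "w") as out:
        return subprocess.call(cmd, stdout=out, cwd=workdir)
```

Used as `run_job(["./program", "myjob.inp"], "tmp.dat", workdir="/path/to/jobdir")`, this is the equivalent of running ./program myjob.inp > tmp.dat in that directory.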

pdf2swf not working with crontab

I have a Python program (queue.py) that converts PDF files to SWF files. It uses the pdf2swf tool to achieve that. When I start this program manually with "python queue.py", it works fine, but the same program fails when started from crontab.
The result shows
pdf2swf -o "/var/www/code_repository/younus/staff/../staffdocs/AIK/doc/248515566214636_IMPROVEMENT_IN_ELECTRIC_LIGHTS.swf" "/var/www/code_repository/younus/staff/../staffdocs/AIK/doc/248515566214636_IMPROVEMENT_IN_ELECTRIC_LIGHTS.pdf">/dev/zero
"sh: pdf2swf: not found"
Inside crontab -e
* * * * * python /home/francis/myjobs/queue.py
Any help will be greatly appreciated.
Thank you.
You need to have the binary in your PATH, or specify an explicit path name. You may also need to set up the Python library path etc. to match the environment of your interactive shell.
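Cron runs jobs with a minimal environment, so one common fix is to set PATH at the top of the crontab (or call pdf2swf by its absolute path inside queue.py). A sketch, assuming pdf2swf lives in /usr/local/bin; adjust to wherever `which pdf2swf` points in your interactive shell:

```
# crontab -e
PATH=/usr/local/bin:/usr/bin:/bin
* * * * * python /home/francis/myjobs/queue.py
```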
