Run a python script, observe in terminal and save to a file - python-3.x

I'm trying to run a Python script in Ubuntu, see the output in the terminal, and simultaneously save the output to a file. I already know how to save the output to a .txt file, but when I run this, I don't see anything in the terminal; I have to keep reloading the text file to see the output:
import subprocess
import sys

for mode in modes:  # `modes` is defined elsewhere in the script
    log_path = 'Logs/log%s.txt'
    for scriptInstance in [1, 2, 3, 4, 5]:
        sys.stdout = open(log_path % scriptInstance, 'w')
        subprocess.call('python3 main.py',
                        stdout=sys.stdout, stderr=subprocess.STDOUT, shell=True)

You should check out Python's logging module. You could use a StreamHandler to log to the terminal and a FileHandler to log to a file.
Check this logging tutorial.
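A minimal sketch of that setup, assuming main.py from the question is the child script and that the Logs directory already exists (the logger name and format are arbitrary):

import logging
import subprocess
import sys

logger = logging.getLogger("runner")
logger.setLevel(logging.INFO)

# One handler prints to the terminal, the other writes to the log file
for handler in (logging.StreamHandler(sys.stdout), logging.FileHandler("Logs/log1.txt")):
    handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
    logger.addHandler(handler)

# Read the child's output line by line and send each line to both handlers
proc = subprocess.Popen(["python3", "main.py"],
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True)
for line in proc.stdout:
    logger.info(line.rstrip())
proc.wait()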

Related

Jupyter Lab - running cell forever with file.close(), print and sys.stdout

I'm not sure, but I imagine there may be existing questions about issues similar to mine; I just haven't found one that is satisfactory.
When I open Jupyter Lab and execute a cell like the one below (code01), it keeps the * status (see figure01 below), meaning it is still running the code, even though the output is written to the out1.txt file correctly.
I would like to know whether it is normal for the cell to keep running under the circumstances described in code01.
code01:
import sys
file = open('out1.txt', 'a')
sys.stdout = file
print("house")
file.close()
figure01: (screenshot of the notebook cell stuck showing the [*] running status)
Because you redirect stdout to a file and then close it, you are breaking the IPython kernel underneath: there is no way for any stdout to be correctly processed by the kernel afterwards (it cannot write to a closed file). You can reproduce it by executing your code in the IPython console rather than in a notebook. To fix this you could rebind the original stdout back:
import sys
file = open('out1.txt', 'a')
stdout = sys.stdout
sys.stdout = file
print("house")
# before close, not after!
sys.stdout = stdout
file.close()
But this is still not 100% safe; you should ideally use context managers instead:
from contextlib import redirect_stdout

with open('out1.txt', 'a') as f:
    with redirect_stdout(f):
        print('house')
But for this particular case, why not make use of the file argument of the print() function?
with open('out1.txt', 'a') as f:
    print('house', file=f)
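To see why the context manager is the safer option, here is a small sketch: even if an exception is raised while the redirect is active, sys.stdout is put back on exit, so the kernel is never left pointing at a closed file.

import sys
from contextlib import redirect_stdout

original = sys.stdout
with open('out1.txt', 'a') as f:
    try:
        with redirect_stdout(f):
            print('house')        # written to out1.txt
            raise RuntimeError()  # simulate an error mid-redirect
    except RuntimeError:
        pass

print(sys.stdout is original)  # True: stdout was restored despite the error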

Python script does not print output as expected

I have some very simple (test) code which I'm running either from a Linux shell or in interactive mode, and I see two different behaviours whose reason I cannot figure out.
I have a file, generated previously by a Popen call, where each line is a file path. This is the code used to generate it:
with open('find.txt','w') as f:
    find = subprocess.Popen(["find",".","-name","myfile.out"],stdout=f)
(Incidentally, I was originally trying to build a pipe, i.e. feed the output of this command into a grep command, and since I wasn't successful in any way, I decided to break the problem down and just read the file paths from a file and process them one by one. So maybe there is a common issue blocking me somewhere in this procedure.)
Since even in this second step I wasn't able to open and process the files at the paths contained in each line of find.txt, I just tried to print the lines out, because they are definitely in there:
with open('find.txt','r') as g:
    for l in g.readlines():
        print(l)
Now, the interesting part:
if I paste the lines above into a python shell, everything works fine and I get my outputs as expected
if, on the other hand, I try to run python test.py, where test.py is the name of the file containing the lines above, no output appears in the shell's stdout.
I've tried sys.stdout.flush() to no avail. I've also inserted some dummy print() statements along the way: everything gets printed but what's after the g.readlines() statement.
Here's the full script I'm trying to make work (a pre-precursor of what I'm actually after, tbh).
#!/usr/bin/env python3
import subprocess
import sys

with open('find.txt','w') as f:
    find = subprocess.Popen(["find",".","-name","myfile.out"],stdout=f)

print('hello')

with open('find.txt','r') as g:
    print('hello?')
    for l in g.readlines():
        print('help me!')
        print(l)

sys.stdout.flush()
output being:
{ancis:>106> python test.py
hello
hello?
{ancis:>106>
EDIT
I've quickly tried the very same lines (but without the call to find, which isn't available there) on my Python installation in Windows: it works as expected.
Based on that, I've tried to run the simpler code below:
print('hello')

with open('find.txt','r') as g:
    print('hello?')
    for l in g.readlines():
        print('help me!')
        print(l)

sys.stdout.flush()
as a script in Linux; this also works without problems.
This should mean that somehow I'm messing things up with the call to Popen... But what?
This is a race condition.
Your call to
find = subprocess.Popen(["find",".","-name","myfile.out"],stdout=f)
is opening another process and running your find command, which takes a bit of time to fully execute.
Python then continues on and reaches the file-reading part before the command has finished executing and the file has been generated.
Want to test it out?
Add a time.sleep(1) just before the opening of the file.
Full test script:
#!/usr/bin/env python3
import subprocess
import time

with open('find.txt','w') as f:
    find = subprocess.Popen(["find",".","-name","myfile.out"],stdout=f)

time.sleep(1)

with open('find.txt','r') as g:
    for l in g:
        print(l)
To block until the process is complete you can use find.communicate().
With this you can also optionally set a timeout if that's something that you want.
#!/usr/bin/env python3
import subprocess

with open('find.txt','w') as f:
    find = subprocess.Popen(["find",".","-name","myfile.out"],stdout=f)
    find.communicate()

with open('find.txt','r') as g:
    for l in g:
        print(l)
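For example, a hedged sketch with a timeout (the 30-second value is arbitrary); on expiry the child is killed and then reaped, as the subprocess docs recommend:

import subprocess

with open('find.txt', 'w') as f:
    find = subprocess.Popen(["find", ".", "-name", "myfile.out"], stdout=f)
    try:
        find.communicate(timeout=30)
    except subprocess.TimeoutExpired:
        find.kill()         # stop the child if it takes too long
        find.communicate()  # reap it and collect whatever remains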
Source:
https://docs.python.org/3/library/subprocess.html#subprocess.Popen.communicate
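As an aside on the pipeline the question originally aimed for, a minimal sketch of chaining find into grep with Popen (the grep pattern is only a placeholder):

import subprocess

find = subprocess.Popen(["find", ".", "-name", "myfile.out"], stdout=subprocess.PIPE)
grep = subprocess.Popen(["grep", "some_pattern"],
                        stdin=find.stdout, stdout=subprocess.PIPE, text=True)
find.stdout.close()  # allow find to receive SIGPIPE if grep exits first
output, _ = grep.communicate()
print(output)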

Exit interactive prompt inside script and move to next line

Hey, I'm trying to print the output of an interactive command to a file inside a Python script and then move on to the next line.
I am not sure how to achieve this. I have tried:
os.system("mnamer foo.mkv > mnamer.txt")
FYI, mnamer can be imported and called from inside the script with "mnamer".
The above command logs the info I need in a file, but I need the script to move past the interactive prompt and read the next line of code.
Is there a Python-specific way of doing this?
If you can import mnamer as a Python module, do that, call it that way, and log its output to a file by temporarily assigning sys.stdout and sys.stderr to a file object:
import mnamer
import sys
logfile = open("/path/to/log/file.txt", "w") # open the logfile
stdout, stderr = sys.stdout, sys.stderr # make copies of these to be able to restore them after the mnamer commands
sys.stdout = logfile # assign stdout to logfile, infos will be written there
sys.stderr = logfile # assign stderr to logfile, errors will be written there too
# put your mnamer commands here
mnamer.some_method_of_mnamer_module()
sys.stdout = stdout # restore stdout so that further infos will be printed to terminal again
sys.stderr = stderr # restore stderr so that further errors will be printed in terminal again
logfile.close() # close the logfile
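An equivalent, slightly safer variant of the same idea uses the context managers from contextlib, so the streams are restored automatically even if the mnamer call raises (the mnamer function name below is the same placeholder as above):

import mnamer
from contextlib import redirect_stdout, redirect_stderr

with open("/path/to/log/file.txt", "w") as logfile, \
        redirect_stdout(logfile), redirect_stderr(logfile):
    mnamer.some_method_of_mnamer_module()  # output and errors go to the logfile
# stdout and stderr are back to the terminal here, no manual restoring needed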

using python to open many idle windows

I have this layout:
1/mainy.py
2/main.py
3/main.py
........
I wish to run each "main" in its own IDLE window, not a cmd line, because if it crashes I tend to lose the output.
So far I have:
for i in range(150):
    i += 1  # because there's no zero folder
    exec(open(str(i)+"/"+'main.py').read())  # if I run this in IDLE it tries to run them all in the same IDLE window
I want to have many different IDLE windows open simultaneously. Right now I have to open each one manually, but I want a script to do it.
I very specifically want each one running in its own IDLE window, so I should have 8 (9 including the one that opens the rest) windows open.
If you don't need to collect output from each idle, you can use subprocess.Popen
import os
from subprocess import Popen, PIPE

# Loop over the folders and change the script name accordingly
for i in range(150):
    script_name = os.path.join(str(i), "main.py")
    print('script launching:', script_name)
    Popen("start python " + script_name, stdin=PIPE, stderr=PIPE, stdout=PIPE, shell=True)
You can open each Python script in a separate window using this:
import subprocess
import time
import pyautogui

for i in range(150):
    i += 1  # because there's no zero folder
    subprocess.call(['C:/Users/tgmjack/AppData/Local/Programs/Python/Python37-32/Lib/idlelib/idle.bat', str(i)+"/"+'main.py'])
    time.sleep(2)
    pyautogui.press('f5')
    time.sleep(2)
Change the first path to match where idle.bat is located, and the second according to your Python script.
You can loop through the scripts, but you have to use the PyAutoGUI package to press F5 programmatically to run each one.
Refer to https://pypi.org/project/PyAutoGUI/
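If the goal is simply to have each script running in its own IDLE shell window (so the output survives a crash), a sketch like the one below avoids the simulated F5 keypress entirely; it relies on IDLE's -r option, which runs a file at startup, and assumes the 1/main.py ... 8/main.py layout from the question:

import subprocess
import sys

for i in range(1, 9):  # folders 1..8, as described in the question
    script = str(i) + "/main.py"
    # -m idlelib starts IDLE with the current interpreter; -r runs the script
    subprocess.Popen([sys.executable, "-m", "idlelib", "-r", script])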

Program run from subprocess.run() in python3 cannot create files

I have a program that is supposed to create a text file. When called from subprocess.run() in python3, the program runs but it does not create the text file. The program works as expected when called from the terminal.
import subprocess as subp
import os
...
comm = [os.getcwd() + '/test/myprogram.bin', 'arg1', 'arg2']
compl_proc = subp.run(comm,
                      capture_output=True,
                      text=True,
                      check=True)
The file was being created in the Python script's directory, because I never told subprocess.run() what the current working directory of the subprocess should be. So cwd='...' is added:
import subprocess as subp
import os
...
comm = [os.getcwd() + '/test/myprogram.bin', 'arg1', 'arg2']
compl_proc = subp.run(comm,
                      cwd=os.getcwd() + '/test/',
                      capture_output=True,
                      text=True,
                      check=True)
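The same fix written with pathlib, as a hedged sketch (subprocess.run accepts Path objects for both the command and cwd on Python 3.6+):

import subprocess as subp
from pathlib import Path

test_dir = Path.cwd() / "test"              # directory containing myprogram.bin
comm = [test_dir / "myprogram.bin", "arg1", "arg2"]
compl_proc = subp.run(comm,
                      cwd=test_dir,          # output file now lands in ./test/
                      capture_output=True,
                      text=True,
                      check=True)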
