Pass a Python input parameter to a Batch File - python-3.x

Is it at all possible to build a Python GUI (let's say using Tkinter) and then pass the user's input from the Python GUI into a Windows batch file?
My objective is to make batch files have a nice front end using Python.
Simple example:
In the Python code the user will be asked for a date
date = input("Please enter Date yyyymmdd")
Now I need to put this date value into a Windows batch file.

When running the Python program, use a pipe to redirect its stdout to the stdin of the batch file. In the batch file you can just wait on stdin until something is output by the Python program. Take a look here to see how to read an input stream in batch. It would look something like this:
python myprogram.py | batch_file.bat
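A minimal sketch of what the Python side might look like, assuming the batch file should receive only the date on stdout (the prompt is sent to stderr so it does not end up in the pipe):

import sys

# myprogram.py - write the prompt to stderr so that stdout carries
# only the value the batch file will read from the pipe.
print("Please enter Date yyyymmdd: ", end="", file=sys.stderr)
date = input()
print(date)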

I used the following code to send the data to a text file
import sys
startdate = input("Please Enter StartDate YYYYMMDD ")
orig_stdout = sys.stdout
f = open('startdate.txt', 'w')
sys.stdout = f
print(startdate)
sys.stdout = orig_stdout
f.close()
I then used the following in my batch file to read the text file contents
@echo off
set /p startdate=<startdate.txt
echo %startdate%
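The sys.stdout swap is not strictly needed; a shorter equivalent (same file name) writes the value directly:

# Write the entered value straight to the file, no stdout redirection.
startdate = input("Please Enter StartDate YYYYMMDD ")
with open('startdate.txt', 'w') as f:
    print(startdate, file=f)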

Related

Why would a python script keep running after the output is generated (strange behavior)?

Background: The purpose of this script is to take eight very large (~7GB) FASTQ files, subsample each, and concatenate each subsample into one "master" FASTQ file. The resulting file is about 60GB. Each file is subsampled to 120,000,000 lines.
The issue: The basic purpose of this script is to output a huge file. I have print statements & time stamps in my code so I know that it goes through the entire script, processes the input files and creates the output files. After I see the final print statement, I go to my directory and see that the output file has been generated, it's the correct size, and it was last modified a while ago, despite the fact that the script is still running. At this point, however, the code has still not finished running, and it will actually stall there for about 2-3 hours before I can enter anything into my terminal again.
My code is behaving like it gets stuck on the last line of the script even after it's finished creating the output file.
I'm hoping someone might be able to identify what's causing this weird behavior. Below is a dummy version of what my script does:
import random
import itertools

infile1 = "sample1_1.fastq"
inFile2 = "sample1_2.fastq"

with open(infile1, 'r') as file_1:
    f1 = file_1.read()

with open(inFile2, 'r') as file_2:
    f2 = file_2.read()

fastq1 = f1.split('\n')
fastq2 = f2.split('\n')

def subsampleFASTQ(compile1, compile2):
    random.seed(42)
    random_1 = random.sample(compile1, 30000000)
    random.seed(42)
    random_2 = random.sample(compile2, 30000000)
    return random_1, random_2

combo1, combo2 = subsampleFASTQ(fastq1, fastq2)

with open('sampleout_1.fastq', 'w') as out1:
    out1.write('\n'.join(str(i) for i in combo1))

with open('sampleout_2.fastq', 'w') as out2:
    out2.write('\n'.join(str(i) for i in combo2))
My ideas of what it could be:
File size is causing some slowness (a lower-memory variant to test this is sketched below)
There is some background process running in this script that won't let it finish (but I have no idea how to debug that -- any resources would be appreciated)
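If the memory hypothesis is right, a sketch like the following would help test it. It assumes the total line count of each input is known in advance (e.g. from a single counting pass), picks the sample's line indices first, and then streams the file once so it is never held in memory whole; note the kept lines come out in file order rather than in random.sample's shuffled order:

import random

def subsample_streaming(in_path, out_path, total_lines, k, seed=42):
    # Choose which line numbers to keep up front (the same fixed-seed
    # trick as above keeps paired files in sync), then stream the
    # input one line at a time instead of calling read() on it.
    random.seed(seed)
    keep = set(random.sample(range(total_lines), k))
    with open(in_path) as src, open(out_path, 'w') as dst:
        for i, line in enumerate(src):
            if i in keep:
                dst.write(line)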

how to create a dynamic list to read and update for each script run

I have a script which runs some checks for all DBs. Now I want a list of all the DBs already checked, so that the next time the script runs it reads this list and runs the checks only for DBs that are not in it.
What is the best way to implement this? If I initialize an empty list (DB_checked) and append each DB name while the checks run, the issue is that the list would be empty again each time the script starts.
Please suggest. Thanks.
At the end of the script I will call the below function to write the list to disk:
from pathlib import Path

def writeDBList(db_checked):
    with open(Path(__file__).parent / "db_names.txt", "w") as fp:
        for s in db_checked:
            fp.write(str(s) + "\n")
When the script starts, I will call the below to read the file from disk:
def readDBList():
    db_list = []
    with open(Path(__file__).parent / "db_names.txt", "r") as fp:
        for line in fp:
            db_list.append(line.strip())
    return db_list
But how do I convert the file contents to a list so that I can easily check it as below?
checked_list = readDBList()
if db not in checked_list:
....
....
You need to write this list to disk after the script finishes the checks, and read it again at the beginning of the script on the next run.
# Read DB CheckList
DB_List = readDBList()
# Your normal script functionality for only DBs not in the list
# Store DB CheckList
writeDBList(DB_List)
Check this in case you are not familiar with I/O file handling in Python.
Now, regarding your second question about how to read the list: I would suggest using pickle, which allows you to read/write Python structures without worrying about stringifying or parsing.
import pickle

def writeDBList(db_list):
    with open('DBListFile', 'wb') as fp:
        pickle.dump(db_list, fp)

def readDBList():
    with open('DBListFile', 'rb') as fp:
        return pickle.load(fp)
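A sketch of the whole cycle, where all_dbs and check_db() are hypothetical stand-ins for your actual database list and check routine:

try:
    checked = set(readDBList())
except FileNotFoundError:
    checked = set()  # first run: nothing has been checked yet

for db in all_dbs:        # all_dbs: your full list of DB names (hypothetical)
    if db not in checked:
        check_db(db)      # check_db: your existing check routine (hypothetical)
        checked.add(db)

writeDBList(checked)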

Print all print() in log file on exit

I have a script with a print in a while loop.
This loop runs for something like 10 hours and prints something every minute or so.
I would like to store all print() outputs in a single log file, but I don't know how to proceed.
I use Windows and Python 3.6.
print can send its output to a file. Just provide an open file object via the file argument:
for i in range(10):  # or a `while` loop
    with open('mylog.log', 'a') as f:
        print("h0i", file=f)

how to log the execution in python

I have a Python script containing a function that sums two numbers. I want to create a log file which logs everything during execution. How should I do that? Could you please explain with an example?
def sum(a, b):
    return a + b

a = sum(10, 20)
You can create a simple log file by (a sketch of these steps follows):
Opening a file at a known location with the proper access mode, e.g. filehandle = open(Logfilefullpath, "a+") - opening in append mode.
Using the write function to log your required information to the file, e.g. filehandle.write("sum function... ")
Closing the filehandle to release the file, e.g. filehandle.close()
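A minimal sketch of those three steps wrapped around the sum function, assuming "sum.log" as the log file path:

def sum(a, b):
    return a + b

filehandle = open("sum.log", "a+")          # step 1: open in append mode
filehandle.write("calling sum(10, 20)\n")   # step 2: write log entries
result = sum(10, 20)
filehandle.write("result = " + str(result) + "\n")
filehandle.close()                          # step 3: release the file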

debugging a python script taking input from sys.stdin in pycharm

I want to debug a small python script that takes input from stdin and sends it to stdout. Used like this:
filter.py < in.txt > out.txt
There does not seem to be a way to configure PyCharm debugging to pipe input from my test data file.
This question has been asked before, and the answer has been, basically "you can't--rewrite the script to read from a file."
I modified the code to take a file, more or less doubling the code size, with this:
import argparse

if __name__ == '__main__':
    cmd_parser = argparse.ArgumentParser()
    cmd_parser.add_argument('path', nargs='?', default='/dev/stdin')
    args = cmd_parser.parse_args()
    with open(args.path) as f:
        filter(f)
where filter() now takes a file object open for reading as a parameter. This permits backward compatibility so it can be used as above, while I am also able to invoke it under the debugger with input from a file.
I consider this an ugly solution. Is there a cleaner alternative? Perhaps something that leaves the ugliness in a separate file?
If you want something simpler, you can forgo argparse entirely and just use the sys.argv list to get the first argument.
import sys

if len(sys.argv) > 1:
    # A path was given on the command line: open it.
    with open(sys.argv[1]) as f:
        filter(f)
else:
    # No argument: sys.stdin is already an open file object,
    # so pass it along directly instead of calling open() on it.
    filter(sys.stdin)
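With either variant, the original redirection-based usage keeps working, and in PyCharm the test file can be supplied as an ordinary script parameter in the run configuration instead of shell redirection (hypothetical file names):

python filter.py in.txt
python filter.py < in.txt > out.txt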
