Sending Information from one Python file to another - python-3.x

I would like to know how to perform the task described below.
I want to upload a CSV file to Python script 1, then send the file's path to another Python script in the same folder, which will perform the task and send the results back to script 1.
Working code would be very helpful, but any suggestion is also appreciated.

You can import the script that processes the CSV into your other Python file, and then write a loop in which script 1 hands the CSV file to script 2 and does whatever else you want with the results.
This is an advantage of organising code into modules: it makes these sorts of tasks very easy, because you keep the functions in a module file, create a main Python file, and call those functions on your CSV files from there.
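A minimal sketch of that layout, assuming two hypothetical files in the same folder, processor.py (the worker, "script 2") and main.py ("script 1"); the file names and the process() function are placeholders, not anything from the question:
# processor.py -- the worker script ("script 2"); name is a placeholder
import csv

def process(csv_path):
    # Read the CSV at the given path and return a simple result (here, the row count)
    with open(csv_path, newline='') as f:
        rows = list(csv.reader(f))
    return len(rows)

# main.py -- the entry point ("script 1"); imports the worker and passes it the path
import processor

results = processor.process('data.csv')  # send the file path, get the results back
print(results)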

Related

Open a .exe file using Python and pass parameters to it from the same Python script

I'm trying to open a .exe file from Python and pass it some instructions. Because I have thousands of models to run, I need to automate the process.
Here on Stack Overflow I found several options that I tried. I am able to open the .exe file, but not to fill in the information and run it; the input is always empty. Below is one of the solutions I tried. [I'm using Python 3]:
import os
your_bat_file_address = r'"C:\D_drive\testing\SAR\1100_star3\exmple.bat"' # example
os.startfile(your_bat_file_address)
The exmple.bat file contains:
"C:/D_drive/tool/EXMPLE.exe" --input1 "C:/D_drive/file/1st_file" --input2 "C:/D_drive/file/2st_file" --input3 "C:/D_drive/file/3st_file"

Is there a method to upload batch file scripts into Python (and be able to work with them)?

Python has extensive libraries, and one of its most useful keywords is import. Is there a way to load batch file scripts into one of your libraries, or is there a way to import them straight from the batch file?
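Batch files cannot be imported the way Python modules can, but they can be run from Python and their output captured. A minimal sketch using the standard library; the .bat path is a placeholder:
import subprocess

# Run a batch file through cmd and capture whatever it prints
result = subprocess.run(["cmd", "/c", r"C:\scripts\example.bat"],
                        capture_output=True, text=True)
print(result.stdout)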

Importing a whole folder of python files

In the current Python program I'm working on, I need to access a lot of stored data. I store it as a bunch of dictionaries, each in its own file. Each file defines a single function, giveArchive(). So to access one of the files, I use:
import fileName
return fileName.giveArchive()
And this has worked well so far, but as the number of files I need grows, I want to streamline this a little bit. I'd like to store all of these files in the same folder, and that folder in the same directory as my main file. Is there some way I can import every file in a folder? And if I do, how can I use 'giveArchive()' from specific files in it?
You can do something like:
from folder.subfolder.deepersubfolder import filename
return filename.giveArchive()
This assumes folder can be accessed from the directory your script is running in.
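If you want to import every file in the folder automatically rather than naming each one, here is a sketch using pkgutil and importlib, assuming the folder is a package named archives (containing an __init__.py) that sits next to your main file; the package name is a placeholder:
import importlib
import pkgutil

import archives  # the package folder holding all the dictionary files

# Import every module in the package and collect each one's giveArchive() result
all_archives = {}
for info in pkgutil.iter_modules(archives.__path__):
    module = importlib.import_module(f"archives.{info.name}")
    all_archives[info.name] = module.giveArchive()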

Download and rename a number of files simultaneously using Python

I have written a script in Python 3 that opens a website, logs into it, and enters numbers by running through a list:
nums = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
for j in nums:
    enternum = driver.find_element_by_xpath('xxxx')
    enternum.click()
    enternum.send_keys(j)
and downloads a file after entering each number.
I want to rename each file the moment it is downloaded, before downloading the next one. For example, the file downloaded for 1 will be renamed to 1, for 2 to 2, and so on. I have tried using shutil and os.rename but was not successful.
Is there a way I can do this with Python? Any help will be appreciated.
I was able to change the file names after downloading them by using os.path.join and then os.rename.
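A sketch of one way that rename step can work inside the download loop, assuming you know the browser's download directory and the name the site gives the file (both are placeholders here, not details from the question):
import os
import time

download_dir = r"C:\Users\me\Downloads"   # placeholder: the browser's download folder
original_name = "report.csv"              # placeholder: the name the site saves the file as

def rename_latest_download(j, timeout=30):
    # Wait until the downloaded file appears, then rename it to "<j>.csv"
    src = os.path.join(download_dir, original_name)
    dst = os.path.join(download_dir, f"{j}.csv")
    for _ in range(timeout):
        if os.path.exists(src):
            os.rename(src, dst)
            return dst
        time.sleep(1)  # download not finished yet; wait and check again
    raise FileNotFoundError(f"{src} did not appear within {timeout} seconds")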

Python3x - Write a python script to run other python scripts?

I have a number of Python scripts that I would like to automate using Python's datetime and schedule modules.
They are too numerous to consider breaking apart and merging into one large file.
What is the easiest way to write a Python script that will open and run these other Python scripts?
I have browsed similar questions, but none offered a concrete answer that I could find. Thanks for your help.
A minimally demonstrative example
In a file called "child.py", write a file to the current directory:
with open('test', 'w') as f:
    f.write('hello world')
Then, in a file called "parent.py", execute the "child.py" script:
import subprocess
subprocess.call(['python', 'child.py'])
Now, from your command line, you can type (assuming both "parent.py" and "child.py" are in the current directory):
python parent.py
In the next instant, you should see a file called "test" in your current directory. Open it up. What do you see?
Well, hello world of course!
The above example spawns a child of the current process (meaning it inherits the parent's environment variables) and waits until the child process completes before returning control to the parent. If you want the child script to run in the background, use Popen instead:
subprocess.Popen(['python', 'child.py'])
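Since the question mentions the schedule module, here is a hedged sketch of running the same call on a timer; the time of day is a placeholder, and schedule is a third-party package (pip install schedule):
import subprocess
import time

import schedule

def run_child():
    # Launch the child script as its own process
    subprocess.call(['python', 'child.py'])

# Run the child script once a day at the given time
schedule.every().day.at("10:30").do(run_child)

while True:
    schedule.run_pending()
    time.sleep(60)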
