How to efficiently execute thousands of different PowerShell commands from Python - python-3.x

The easiest way to execute PowerShell commands from Python is to invoke a command such as:
Example: powershell -c "Get-VM"
But this method is inefficient when used thousands of times, since each call spins up a new PowerShell session and takes a few precious seconds (times a few thousand, that can add up to hours).
What I would like to achieve is to reduce this execution time by keeping a PowerShell session open in the background. I have explored a few methods and I would like to know whether there are other possibilities.
The best method I have found so far (it works, but feels like a somewhat "dirty" solution):
A PowerShell session runs in the background, watching for a certain file.
As soon as that file exists, the PS session executes the command it contains, deletes the file, and stores the output in a separate file.
Python handles creating the command file and reading (and deleting) the output file.
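A minimal sketch of the Python side of that relay, assuming the watching PowerShell loop is already running (the file paths here are made up for illustration):

import time
from pathlib import Path

CMD_FILE = Path("C:/temp/ps_command.txt")   # hypothetical path the PowerShell session watches
OUT_FILE = Path("C:/temp/ps_output.txt")    # hypothetical path the PowerShell session writes to

def run_ps(command: str, poll: float = 0.05, timeout: float = 30.0) -> str:
    """Drop a command file for the background PowerShell session and wait for its output."""
    CMD_FILE.write_text(command, encoding="utf-8")
    deadline = time.monotonic() + timeout
    while not OUT_FILE.exists():
        if time.monotonic() > deadline:
            raise TimeoutError(f"No output produced for: {command}")
        time.sleep(poll)
    # Note: a real implementation would also need to guard against reading a half-written file.
    output = OUT_FILE.read_text(encoding="utf-8")
    OUT_FILE.unlink()   # clean up so the next call starts fresh
    return output

print(run_ps("Get-VM"))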
A few constraints:
Capturing the output is mandatory.
Commands have to be executable at separate points in time; they cannot be batched together.
The entire point is to eliminate the few seconds it takes to call powershell -c "...", so whatever the solution is, it has to eliminate that overhead.
Do you think there is a better way?

Related

Want to store a value in local ../usr from shell script

I just want to store some values while running a shell script.
Scenario: when I run the script, it does some operations and stores the results/activity done.
When I run the same script again, it should recognise what has already been executed and continue from there. That is roughly what I need. How do I do that? Can I use a .lock file, or is there a better way?
.lock files are by convention used to identify running services, so I would therefore vote against that.
It just sounds like you want to keep track of your progress.
If you do not mind the data being erased after a reboot, I'd suggest you simply use /tmp for that (on many systems it is memory-backed); do mind that if we are talking about very large amounts of data, this will drain your available memory.
Without knowing your use case it's hard to tell you what is the best solution.
But I would suggest writing an empty file that just indicates that your script is in progress (very similar to lock behaviour) and a second file that keeps track of which items you have processed.
Then just loop over the items and skip until you hit a 'new' item.
If we are talking very large amounts you should consider using a local database or database server.
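To make that concrete, here is a rough sketch of the pattern (shown in Python rather than shell, with made-up file names and items; the idea carries over directly):

from pathlib import Path

DONE_FILE = Path("/tmp/myscript.done")   # hypothetical progress file listing finished items
ITEMS = ["item1", "item2", "item3"]      # whatever the script actually iterates over

done = set(DONE_FILE.read_text().splitlines()) if DONE_FILE.exists() else set()

for item in ITEMS:
    if item in done:
        continue                         # already handled on a previous run, skip it
    # ... do the real work for this item here ...
    with DONE_FILE.open("a") as f:
        f.write(item + "\n")             # record progress immediately, so a crash loses little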

Running script twice at time

I'm making a simple little script to improve the efficiency of my work team.
The script simply searches a file that the user gives as param.
./check_file test_file.xml
I use only ls and cp commands, and there are no log or temporary files.
My question is: should I use a .lock file to be sure that the script runs only once at a time, or can I skip this check?
Usually I create a lock file, because my scripts write temporary files, and if two users run the script at the same moment it blows up.
Thanks!
Generally speaking, no. I would recommend avoiding temporary files as much as possible, preferring pipes instead. However, I doubt it's always possible to avoid temporary files, so when I have to, I use $$ in the filename (current process ID or PID). So if you're using /tmp/check_file.tmp as your temporary filename, instead use /tmp/check_file.$$.tmp - then two processes can run at a time, each with their own PID, and not overlap.
Slightly more advanced is to also use ${TMP:-/tmp} as the temporary directory instead of just /tmp - that way users can specify a different directory for each run, and thereby also avoid any overlaps.
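For illustration, the same per-process naming idea expressed in Python (file names are hypothetical; in a shell script the $$ and ${TMP:-/tmp} forms above achieve the same thing):

import os

tmp_dir = os.environ.get("TMP", "/tmp")                             # let the user override the temp directory
tmp_path = os.path.join(tmp_dir, f"check_file.{os.getpid()}.tmp")   # unique per running process

with open(tmp_path, "w") as f:
    f.write("intermediate data\n")
# ... use the temporary file ...
os.remove(tmp_path)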

When running a shell script with loops, operation just stops

NOTICE: Feedback on how this question can be improved would be great, as I am still learning. I understand there is no code here; I am confident the code itself does not need fixing. I have researched online a great deal and cannot seem to find the answer to my question. My script works as it should when I change the parameters to produce fewer outputs, so I know it works fine. I have debugged the script and got no errors. When the parameters are changed to produce more outputs, the script runs for hours and then stops. My goal with the question below is to determine whether Linux will time out a long-running process (or something related) and, if so, how that can be resolved.
I am running a shell script that has several for loops which does the following:
- Goes through existing files and copies data into a newly saved/named file
- Makes changes to the data in each file
- Submits these files (which number in the thousands) to another system
The script is very basic (beginner here), but as long as I don't give it too much to generate, it works as it should. However, if I want it to loop through all possible cases, which means it will generate tens of thousands of files, then after a certain amount of time the shell script just stops running.
I have more than enough hard drive storage to support all the files being created. One thing to note however is that during the part where files are being submitted, if the machine they are submitted to is full at that moment in time, the shell script I'm running will have to pause where it is and wait for the other machine to clear. This process works for a certain amount of time but eventually the shell script stops running and won't continue.
Is there a way to make it continue or prevent it from stopping? I typed Ctrl+Z to suspend the script and then fg to resume, but it still does nothing. I check the status by typing ls -la to see if the file size is increasing, and it is not, although top/ps says the script is still running.
Assuming that you are using 'Bash' for your script - most likely, you are running out of 'system resources' for your shell session. Also most likely, the manner in which your script works is causing the issue. Without seeing your script it will be difficult to provide additional guidance, however, you can check several items at the 'system level' that may assist you, i.e.
review system logs for errors about your process or about 'system resources'
check your docs: man ulimit (or 'man bash' and search for 'ulimit')
consider removing 'deep nesting' (if present); instead, create work sets where step one builds the 'data' needed for the next step, i.e. if possible, instead of:
step 1 (all files) ## guessing this is what you are doing
step 2 (all files)
step 3 (all files)
Try each step for each file - Something like:
for MY_FILE in ${FILE_LIST}
do
    step_1 "${MY_FILE}"   # e.g. copy data into a newly saved/named file
    step_2 "${MY_FILE}"   # make the changes to the data in that file
    step_3 "${MY_FILE}"   # submit that one file to the other system
done
:)
Dale

Automatic Background Perl Execution on Ubuntu

I've been troubleshooting this issue for about a week and I am nowhere, so I wanted to reach out for some help.
I have a perl script that I execute via the command line, usually in the manner of
nohup ./script.pl --param arg --param2 arg2 &
I usually have about ten of these running at once to process the same type of data from different sources (the source is specified through parameters). The script works fine and I can see logs for everything in nohup.out and monitor status via ps output. This script also uses an SQL database to track the status of various tasks, so I can track when certain sources finish.
However, that was too much work, so I wrote a wrapper script to execute the script automatically and that is where I am running into problems. I want something exactly the same as I have, but automatic.
The getwork.pl script runs ps and parses the output to find out how many other processes are running; if the count is below the configured threshold, it queries the database for the most out-of-date source and kicks off the script.
The problem is that the kicked off jobs aren't running properly, sometimes they terminate without any error messages and sometimes they just hang and sit idle until I kill them.
The getwork script queries SQL and gets the entire execution command via SQL concatenation, so in the SQL query I am doing something like CONCAT('nohup ./script.pl --arg ',param1,' --arg2 ',param2,' &') to get the command string.
I've tried everything to get these kicked off. I've tried using system(), but again, some jobs kick off, some don't, sometimes it gets stuck, and sometimes jobs start and then die within a minute. If I take the exact command used to start the job and run it in bash, it works fine.
I've tried to also open a pipe to the command like
open my $ca, "| $command" or die ($!);
print $ca $command;
close $ca;
That works just about as well as everything else I've tried. The getwork script used to be executed through cron every 30 minutes, but I scrapped that because I needed another shell wrapper script, so now there is an infinite loop in the getwork script that executes a function every 30 minutes.
I've also tried many variations of the execution command, including redirecting output to different files, etc... nothing seems to be consistent. Any help would be much appreciated, because I am truly stuck here....
EDIT:
Also, I've tried to add separate logging within each script; each one would start a new log file named with its PID ($$). There was a bunch of weirdness there too: all log files would get created, but then some of the processes would be running and writing to their file, others would just have an empty text file, and some would have only one or two log entries. Sometimes the process would still be running and just not doing anything; other times it would die with nothing in the log. Running the command in the shell directly always works, though.
Thanks in advance
You need some kind of job-managing framework.
One of the biggest ones is Gearman: http://www.slideshare.net/andy.sh/gearman-and-perl

Run a file multiple times with different parameters

I am working on Windows, and at my work I have a VB program that I need to run multiple times. It takes two input files. One of them is constant, and the other changes; a new output file is created every time the program is run.
I need to know how I can automate this. Can this be done using a batch file? I am not sure whether the VB program takes command-line inputs. How can I check, and what should I read up on? I don't have access to its source.
Every program that runs runs in a shell, right? So where can I see that? Maybe I could manipulate and repeat the exe execution using different parameters.
One way is to write a GUI automation script. The following Stack Exchange threads can help you get started with it:
Automate GUI tasks
https://stackoverflow.com/questions/120359/tools-for-automated-gui-testing-on-windows
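If it turns out the program does accept command-line arguments, a small driver loop is all you need; the sketch below is in Python with made-up paths and a guessed argument order, so treat it only as an illustration of the idea:

import subprocess
from pathlib import Path

PROGRAM = r"C:\tools\myprog.exe"          # hypothetical path to the VB program
CONSTANT_INPUT = r"C:\data\constant.in"   # the input file that never changes

for changing_input in Path(r"C:\data\inputs").glob("*.in"):   # the inputs that vary per run
    output = changing_input.with_suffix(".out")               # one new output file per run
    subprocess.run([PROGRAM, CONSTANT_INPUT, str(changing_input), str(output)], check=True)

If the program only has a GUI and no command-line interface, the GUI automation route above is the way to go.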
