How to submit multiple commands to AWS Batch using Boto3? - python-3.x

I'm trying to run multiple shell commands through Docker using AWS Batch and boto3. When I try to submit multiple commands using the & sign as follows, the job fails.
My attempt
import boto3

client = boto3.client("batch")
response = client.submit_job(
    jobName='AndrewJob',
    jobQueue='AndrewJobQueue',
    jobDefinition='AndrewJobDefinition',
    containerOverrides={
        'command': 'ls & python myjob.py'.split(),
    },
    timeout={'attemptDurationSeconds': 100}
)
print(response)
The error is:
ls: cannot access '&': No such file or directory
According to the Docker docs here https://docs.docker.com/engine/reference/builder/#cmd and this post here: docker run <IMAGE> <MULTIPLE COMMANDS>, it seems like this should be possible in shell form.

It appears that Batch behaves like subprocess.Popen: it executes the command list as a single exec call, where the first element is the command name and the remaining elements are its arguments. I got this to work with subprocess.Popen, so I bet it would work with Batch:
subprocess.Popen(["/bin/bash", "-c", "ls && echo \"Hello world\""])
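Applied to the original submit_job call, a minimal sketch would look like the following (untested; it assumes the container image provides /bin/bash, and the job names are the placeholders from the question):

import boto3

client = boto3.client("batch")

# Wrapping the pipeline in /bin/bash -c makes the container run it through a
# shell instead of passing '&' as a literal argument to ls.
response = client.submit_job(
    jobName='AndrewJob',
    jobQueue='AndrewJobQueue',
    jobDefinition='AndrewJobDefinition',
    containerOverrides={
        'command': ['/bin/bash', '-c', 'ls && python myjob.py'],
    },
    timeout={'attemptDurationSeconds': 100}
)
print(response)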

Related

Running commands on a server through a local python script

I would like to run a batch of bash commands (all together) in a server shell through a python3 script on my local machine.
The reason why I'm not running the python3 script on my laptop is that I can't create the same environment on the server and I want to keep the settings I have on my machine while executing the script.
What I would like to do is:
- Run python commands locally
- At a certain point, run those commands on the server
- Wait for the server execution to finish
- Continue running the python script
(This will be done in a loop)
What I'm trying is to put all the commands in a bash script ssh_commands.sh and use the following command:
subprocess.call('cat ssh_commands.sh | ssh -T -S {} -p {} {}'.format(socket, port, user_host).split(),shell=True)
But when the execution of the script reaches that line, it gets stuck until the subprocess.call timeout. The commands themselves shouldn't take that long to run. The only way to stop the script earlier is with Ctrl+C.
I've also tried to set up the ssh connection in the ~/.ssh/config file but I'm getting the same result.
I know that ssh connection works fine and if I run ssh_commands.sh on the server manually, it runs without any problem.
Can somebody suggest:
- A way to fix what I'm trying to do
- A better way to achieve the final result described above
- Some way to debug and find out what the problem could be
Thank you in advance
To expand on my comment (I haven't tested your specific case with ssh, so there could be other complications there): this is actually copy/pasted from my own code in a situation that I already know works.
from subprocess import Popen, PIPE, DEVNULL
from shlex import split as sh_split

# run the first command and capture its stdout
proc1 = Popen(sh_split(file_cmd1), stdout=PIPE)
# feed proc1's stdout into the second command
proc2 = Popen(file_cmd2, shell=True, stdin=proc1.stdout, stdout=PIPE)
# close our copy of the pipe so proc1 gets SIGPIPE if proc2 exits early
proc1.stdout.close()
I have a specific reason to use shell=True in the second, but you should probably be able to use shlex.split there too I'm guessing.
Basically you're running one command, outputting to `PIPE`, then using this as input for the second command.
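Applied to the ssh case above, one likely culprit is that the original call both splits the string into a list and sets shell=True; with a list, only the first element is treated as the shell command and the rest become arguments to /bin/sh itself, so the ssh options never reach ssh. A minimal sketch under that assumption (socket, port, and user_host are the placeholders from the question) would be:

import subprocess

# Option 1: keep shell=True but pass a single string, so the pipe and the ssh
# options are interpreted by the shell as intended.
subprocess.call(
    'cat ssh_commands.sh | ssh -T -S {} -p {} {}'.format(socket, port, user_host),
    shell=True,
)

# Option 2: skip the shell entirely and feed the script to ssh on stdin.
with open('ssh_commands.sh', 'rb') as script:
    subprocess.call(
        ['ssh', '-T', '-S', socket, '-p', str(port), user_host],
        stdin=script,
    )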

How to perform multiple Windows commands in cmd using python in the same shell

I want to run a couple of commands one after another in the same shell and store their output in variables. Whenever I try to run the next command, it executes in a new shell.
import subprocess
cmd1 = 'cd C:\\Program Files (x86)\\openvpn\\bin\\'
output = subprocess.getoutput(cmd1) # it goes to the above directory
cmd2 = 'openvpn.exe --help'
output2 = subprocess.getoutput(cmd2)
When cmd2 runs, a new shell performs the command and reports:
'openvpn.exe' is not recognized as an internal or external command,
operable program or batch file.
I want to run a couple of commands one after another and store their output in variables, so I can use those variables in other commands.
You should use the run method, like so:
output = subprocess.run(['openvpn.exe', '--help'], cwd='C:\\Program Files (x86)\\openvpn\\bin\\', capture_output=True)
cwd = current working directory (where the command should run)
capture_output = record the stdout, stderr streams
Then you can access your results within the stdout, stderr properties:
output.stdout # will give you back the output of the command.
You weren't getting any results because the cd command has no effect within subprocess. It has to do with the way cd works in the first place - no process can change another process's working directory.
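For completeness, a small usage sketch (the path is the one from the question; text=True is an extra assumption here, and, like capture_output, it needs Python 3.7+):

import subprocess

output = subprocess.run(
    ['openvpn.exe', '--help'],
    cwd='C:\\Program Files (x86)\\openvpn\\bin\\',
    capture_output=True,
    text=True,  # decode stdout/stderr to str instead of bytes
)
print(output.stdout)      # the command's output
print(output.returncode)  # the process exit status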

Chaining bash scripts on gcloud shell with gsutil

I built a series of bash scripts to run BigQuery jobs for a data pipeline. These scripts are saved in a google cloud storage bucket. I pipe them to sh using this:
gsutil cat gs://[bucket]/[filename] | sh
Basically there is no problem if I run this from the command line, but once I try running this command from within a bash script, I keep getting file-not-found errors.
It doesn't seem like a PATH issue (I may be mistaken) as calling $PATH from within the script shows where gsutil is located.
Is this a permissions issue?
I'm running this from within google cloud console shell in my browser. Any help is appreciated.
To start, try printing out the output (both stdout and stderr) of the gsutil cat command, rather than piping it to sh. If you're receiving errors from that command, this will help shed some light on why sh is complaining. (In the future, please try to copy/paste the exact error messages you're receiving when posting a question.)
Comparing the output of gsutil version -l from both invocations will also be helpful. If this is an auth-based problem, you'll probably see different values for the config path(s) lines. If this is the case, it's likely that either:
You're running the script as a different user than who you normally run gsutil as. gcloud looks under $HOME/.config/gcloud/... for credentials to pass along to gsutil... e.g. if you've run gcloud auth login as Bob, but you're running the script as root, gcloud will try looking for root's credentials instead of Bob's.
In your script, you're invoking gsutil directly from .../google-cloud-sdk/platform/gsutil/, rather than its wrapper .../google-cloud-sdk/bin/gsutil which is responsible for passing gcloud credentials along.

how to execute shell script from soapui groovy script?

I have a shell script kept on a remote server (Linux machine), and I am trying to call that shell script in between the execution of various SoapUI test cases from Windows.
So I have prepared a groovy script:
def command="/usr/bin/ssh -p password username#IP_address bash -s < /home/test.sh"
def proc=command.execute().text
proc.waitFor()
But unfortunately, I am receiving an error:
java.io.IOException: Cannot run program "/usr/bin/ssh": CreateProcess error=2, The system cannot find the file specified error at line: 6
I tried to search more on this, but couldn't get the resolution. Some of the links were:
How to execute shell script using soapUI
http://groovy-lang.org/groovy-dev-kit.html#process-management
If, as you commented, you have putty.exe installed on Windows, you can try the following.
First of all, create a file on your local Windows machine with the commands to execute remotely, for example C:/temp/commandsToRunRemotely.txt, and put the commands you want to run in it. As a sample I use the following command:
echo "test remote execution" > /tmp/myfile.txt
Then, from the Groovy script in SoapUI, call putty.exe, passing the local file which contains the commands to execute remotely:
def command = "C:/path/to/putty.exe -ssh user#IP -pw pass -m C:/temp/commandsToRunRemotely.txt"
def proc = command.execute()
proc.waitFor()
Note that if putty.exe is in your Windows PATH, you can simply use putty.exe instead of the full path.
This is only an illustrative sample; if you want to execute a shell script remotely, put the path to your script (/home/test.sh) in the commands file instead of echo "test remote execution" > /tmp/myfile.txt.
I got the PuTTY command-line options from this nice answer.
Hope it helps,

Lua won't execute bash script

I'm using a Lua script with nginx to generate an ISO file.
The Lua script parses the request and should pass it to the genisoimage command.
I've tried with:
local pack_cmd = "genisoimage -V" .. some_other_name
os.execute(pack_cmd)
The command is not successfully executed, and as a return code I get 3328. I've tried with an absolute path (/usr/bin/genisoimage and /bin/genisoimage) but it's not working.
I've tried a simple workaround: execute the genisoimage command inside a bash script and run that from the Lua script like this:
local pack_cmd = "bash /absoulte/path/script.sh " .. some_other_name
os.execute(pack_cmd)
Still not working and getting the same exit code. I also tried to capture what's wrong, but it looks like the genisoimage command is never executed:
local pack_cmd = "bash /absoulte/path/script.sh " .. some_other_name .." >> error.log"
os.execute(pack_cmd)
The version with handles is not working either:
local handle = io.popen(pack_cmd)
local result = handle:read("*a")
handle:close()
If I execute the pack_cmd string manually, everything works OK. Executing the bash script manually also works.
The problem was with permissions for the www-data user.
