This question already has answers here:
subprocess.call() arguments ignored when using shell=True w/ list [duplicate]
(2 answers)
Closed 4 years ago.
I have a config_file.yml file like this:
sample:
  sql: "select * from dbname.tableName where sampleDate>='2018-07-20';"
  config: {'hosts': [!!python/tuple ['192.162.0.10', 3001]]}
sample2:
  sql: "select * from dbname.tableName where sampleDate<='2016-05-25';"
  config: {'hosts': [!!python/tuple ['190.160.0.10', 3002]]}
My Python code is:
data_config = yaml.load(config_file)
for dataset, config in data_config.items():
    args = [config]
    cmd = ['./execute_something.sh']
    cmd.extend(args)
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True).communicate()
execute_something.sh:
#!/bin/bash
echo $1
data_config=$1
echo $data_config
So basically I want to pass this entire string {'sql': "select * from dbname.tableName where sampleDate>='2018-07-20';", config: {'hosts': [!!python/tuple ['190.160.0.10', 3002]]}} as a single argument to the shell script.
Problem:
1) select * ends up listing all files in the current directory, instead of being passed entirely as a string
2) even if I pass a simple string, say args="hi", it still won't work!
I don't understand what I am missing here. Kindly help.
Thanks!
DO NOT USE shell=True.
data_config = yaml.load(config_file)
for dataset, config in data_config.items():
    cmd = ['./execute_something.sh', str(config)]
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE).communicate()
When you pass shell=True, you're prepending sh -c to your literal argument list. What that does in this case is the following (escaping is added to make the single quotes literal):
sh -c './execute_something.sh' '{'"'"'sql'"'"': "select * from dbname.tableName where sampleDate>='"'"'2018-07-20'"'"';", config: {'"'"'hosts'"'"': [!!python/tuple ['"'"'190.160.0.10'"'"', 3002]]}}'
That doesn't work. Try it manually in a shell, if you like. Why? Because the argument starting with the { isn't passed to ./execute_something.sh, but is instead passed to the shell executing sh -c.
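To see that mechanism in isolation, here is a minimal Python sketch (echo stands in for any command; this is a demonstration, not part of the original code): with shell=True plus a list, only the first element becomes the sh -c script, and the remaining elements become that script's positional parameters, starting at $0:
import subprocess

# The first list element is the sh -c script; 'extra' becomes the script's
# $0, not an argument of the command inside the script.
subprocess.call(['echo "the script saw: $0"', 'extra'], shell=True)
# prints: the script saw: extra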
What would work, if you really insisted on keeping shell=True? Compare to the following:
sh -c './execute_something.sh "$@"' _ '{'"'"'sql'"'"': "select * from dbname.tableName where sampleDate>='"'"'2018-07-20'"'"';", config: {'"'"'hosts'"'"': [!!python/tuple ['"'"'190.160.0.10'"'"', 3002]]}}'
Here, the argument just after the -c is a shell script that looks at its arguments, and passes those arguments on to execute_something.sh.
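In Python terms, that construction would look something like the following sketch (reusing the config variable from the loop above; the plain argument list shown earlier is still the better fix):
import subprocess

# Sketch only: keep shell=True but forward the extra list elements
# explicitly. The embedded "$@" expands to the arguments from $1 onward,
# and the '_' placeholder fills $0.
p = subprocess.Popen(
    ['./execute_something.sh "$@"', '_', str(config)],
    stdout=subprocess.PIPE, shell=True,
).communicate()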
Related
I want to execute an external program in Lua. Usually this can be done with:
os.execute("run '"..arg0.."' 'arg1' arg2")
The problem with this approach is that if I want to pass user input as a string to an external program, the user input could be '; evil 'h4ck teh system' ' and the script from above would execute like this:
/bin/bash -c "run ''; evil 'h4ck teh system' '' 'arg1' arg2"
Another problem occurs when I have '$var' as an argument and the shell replaces it with the corresponding environment variable. In my particular case I have something like [[program 'set title "$My Title$"']] – so, nested strings – and program parses "$My Title$" (with escape sequences) differently from '$My Title$' (taken as is). Because I want to set the title as is, the best way is to have arguments like this: 'My Title'. But now the command has to be:
os.execute([[run "set title '$My Title$'"]])
But now – as I said – $My will be replaced with an empty string, because the environment does not define any variable named $My, and because I never wanted it to be expanded in the first place.
So I am looking for the usual approach with
execv("run", {"set title '"..arg0.."'", arg1, arg2})
local safe_unquoted = "^[-~_/.%w%%+,:#^]*$"

local function q(text, expand) -- quoting under *nix shells
   -- "expand"
   --    false/nil: $var and `cmd` must NOT be expanded (use single quotes)
   --    true:      $var and `cmd` must be expanded (use double quotes)
   if text == "" then
      text = '""'
   elseif not text:match(safe_unquoted) then
      if expand then
         text = '"'..text:gsub('["\\]', '\\%0')..'"'
      else
         local new_text = {}
         for s in (text.."'"):gmatch"(.-)'" do
            new_text[#new_text + 1] = s:match(safe_unquoted) or "'"..s.."'"
         end
         text = table.concat(new_text, "\\'")
      end
   end
   return text
end
function execute_commands(...)
   local all_commands = {}
   for k, command in ipairs{...} do
      for j = 1, #command do
         if not command[j]:match"^[-~_%w/%.]+$" then
            command[j] = q(command[j], command.expand)
         end
      end
      all_commands[k] = table.concat(command, " ") -- space is arguments delimiter
   end
   all_commands = table.concat(all_commands, ";") -- semicolon is commands delimiter
   return os.execute("/bin/bash -c "..q(all_commands))
end
Usage examples:
-- Usage example #1:
execute_commands(
   {"your/program", "arg 1", "$arg2", "arg-3", "~/arg4.txt"},
   {expand=true, "echo", "Your program finished with exit code $?"},
   {"ls", "-l"}
)
-- The following command will be executed:
-- /bin/bash -c 'your/program '\''arg 1'\'' '\''$arg2'\'' arg-3 ~/arg4.txt;echo "Your program finished with exit code $?";ls -l'
$arg2 will NOT be expanded into its value because of the single quotes around it, as you required.
By default, "Your program finished with exit code $?" would not be expanded either; that's why expand=true is set explicitly on that command.
-- Usage example #2:
execute_commands{"run", "set title '$My Title$'", "arg1", "arg2"}
-- the generated command is not trivial, but it does exactly what you need :-)
-- /bin/bash -c 'run '\''set title '\''\'\'\''$My Title$'\''\'\'' arg1 arg2'
I'm looking for a standard tool capable of taking all of its arguments and turning them into a single string suitable for use as multiple arguments in an automatically generated bash/sh/zsh script. Such a command is extremely useful in various disciplines of script-fu. An example of its usage:
% shsafe 'A big \nasty string '\'' $HOME $PATH' 'another string \\'
'A big \nasty string '\'' $HOME $PATH' 'another string \\'
Using it in another script:
% sshc host rm 'file/with spaces and $special chars'
where sshc contains
#!/bin/bash
# usage: sshc host command [arg ...]
# Escapes its arguments so that the command may contain special
# characters. Assumes the remote shell is sh-like.
host=$1
shift
exec ssh "$host" "$(shsafe "$@")"
Another example:
#!/bin/bash
# Run multiple commands in a single sudo session. The arguments of
# this script are passed as arguments to the first command. Useful if
# you don't want to have to type the password for both commands and
# the first one takes a while to run.
sudo bash -c "pacman -Syu $(shsafe "$@") && find /etc -name '*.pacnew'"
I couldn't find a suitable solution to this problem in the pre-existing commands, so I made up my own, called shsafe. It uses the fact that single quotes, '', turn off absolutely all shell expansion, except for ' itself.
shsafe:
#!/usr/bin/env python
import sys

# Wrap each argument in single quotes, escaping embedded single quotes
# as '\'' so the result parses back into exactly the same words.
args = sys.argv[1:]
if not args:
    sys.exit(0)
sys.stdout.write(' '.join("'" + a.replace("'", "'\\''") + "'" for a in args))
sys.stdout.write('\n')
Is there any standard tool capable of doing this to its arguments?
Note that the printf command with a format string consisting of just the %q formatter is not good enough for this, because it won't keep multiple arguments separated:
% printf %q arg1 arg2
arg1arg2
I did eventually figure out a decent way of doing this:
% printf "$'%q' " 'crazy string \ $HOME' 'another\ string'
$'crazy\ string\ \\\ \$HOME' $'another\\\ string'
It's a little error prone what with the quotes everywhere, so it's not ideal, IMO, but it's a solid solution that should work anywhere. If it's being used a lot, you could always turn it into a shell function:
shsafe () {
    printf "$'%q' " "$@"
}
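For what it's worth, Python's standard library implements the same per-argument quoting that the shsafe script above does by hand: shlex.quote (pipes.quote on Python 2). A minimal sketch:
import shlex

# Quote each argument so the joined string parses back into the same words.
args = ["A big \\nasty string ' $HOME $PATH", "another string \\\\"]
print(' '.join(shlex.quote(a) for a in args))
# prints: 'A big \nasty string '"'"' $HOME $PATH' 'another string \\'
The escaping style differs from shsafe's output ('"'"' rather than '\''), but the two forms are equivalent once the shell parses them.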
I have something like this in a Jenkinsfile (Groovy) and I want to record the stdout and the exit code in a variable in order to use the information later.
sh "ls -l"
How can I do this, especially as it seems that you cannot really run any kind of Groovy code inside the Jenkinsfile?
The latest version of the pipeline sh step allows you to do the following:
// Git committer email
GIT_COMMIT_EMAIL = sh (
    script: 'git --no-pager show -s --format=\'%ae\'',
    returnStdout: true
).trim()
echo "Git committer email: ${GIT_COMMIT_EMAIL}"
Another feature is the returnStatus option.
// Test commit message for flags
BUILD_FULL = sh (
    script: "git log -1 --pretty=%B | grep '\\[jenkins-full]'",
    returnStatus: true
) == 0
echo "Build full flag: ${BUILD_FULL}"
These options were added based on this issue.
See official documentation for the sh command.
For declarative pipelines (see comments), you need to wrap the code in a script step:
script {
    GIT_COMMIT_EMAIL = sh (
        script: 'git --no-pager show -s --format=\'%ae\'',
        returnStdout: true
    ).trim()
    echo "Git committer email: ${GIT_COMMIT_EMAIL}"
}
Current Pipeline version natively supports returnStdout and returnStatus, which make it possible to get output or status from sh/bat steps.
An example:
def ret = sh(script: 'uname', returnStdout: true)
println ret
See the official documentation.
The quick answer is this:
sh "ls -l > commandResult"
result = readFile('commandResult').trim()
I think there exists a feature request to be able to get the result of the sh step, but as far as I know there is currently no other option.
EDIT: JENKINS-26133
EDIT2: Not quite sure since which version, but the sh/bat steps can now return the standard output. Simply:
def output = sh returnStdout: true, script: 'ls -l'
If you want to get the stdout AND know whether the command succeeded or not, just use returnStdout and wrap it in an exception handler:
scripted pipeline
try {
    // Fails with non-zero exit if dir1 does not exist
    def dir1 = sh(script: 'ls -la dir1', returnStdout: true).trim()
} catch (Exception ex) {
    println("Unable to read dir1: ${ex}")
}
output:
[Pipeline] sh
[Test-Pipeline] Running shell script
+ ls -la dir1
ls: cannot access dir1: No such file or directory
[Pipeline] echo
Unable to read dir1: hudson.AbortException: script returned exit code 2
Unfortunately, hudson.AbortException is missing any useful method to obtain that exit status, so if the actual value is required you'd need to parse it out of the message (ugh!).
Contrary to the Javadoc https://javadoc.jenkins-ci.org/hudson/AbortException.html the build is not failed when this exception is caught. It fails when it's not caught!
Update:
If you also want the STDERR output from the shell command, Jenkins unfortunately fails to properly support that common use-case. A 2017 ticket JENKINS-44930 is stuck in a state of opinionated ping-pong whilst making no progress towards a solution - please consider adding your upvote to it.
As to a solution now, there could be a couple of possible approaches:
a) Redirect STDERR to STDOUT with 2>&1 – but it's then up to you to separate it from the main output, and you won't get any output at all if the command failed, because you're in the exception handler.
b) Redirect STDERR to a temporary file (the name of which you prepare earlier) with 2>filename (but remember to clean up the file afterwards), i.e. the main code becomes:
def stderrfile = 'stderr.out'
try {
    def dir1 = sh(script: "ls -la dir1 2>${stderrfile}", returnStdout: true).trim()
} catch (Exception ex) {
    def errmsg = readFile(stderrfile)
    println("Unable to read dir1: ${ex} - ${errmsg}")
}
c) Go the other way: set returnStatus=true instead, dispense with the exception handler, and always capture output to a file, i.e.:
def outfile = 'stdout.out'
def status = sh(script: "ls -la dir1 >${outfile} 2>&1", returnStatus: true)
def output = readFile(outfile).trim()
if (status == 0) {
    // output is the directory listing from stdout
} else {
    // output is the error message from stderr
}
Caveat: the above code is Unix/Linux-specific - Windows requires completely different shell commands.
Here is a sample case, which I believe will make the behaviour clear:
node('master') {
    stage('stage1') {
        def commit = sh(returnStdout: true, script: '''echo hi
echo bye | grep -o "e"
date
echo lol''').split()
        echo "${commit[-1]} "
    }
}
For those who need to use the output in subsequent shell commands rather than in Groovy, something like this example could be done:
stage('Show Files') {
    environment {
        MY_FILES = sh(script: 'cd mydir && ls -l', returnStdout: true)
    }
    steps {
        sh '''
            echo "$MY_FILES"
        '''
    }
}
I found the examples on code maven to be quite useful.
All the above methods will work, but to use the variable as an environment variable inside your code you need to export it first.
script {
    sh " 'shell command here' > command"
    command_var = readFile('command').trim()
    sh "export command_var=$command_var"
}
Replace the shell command with the command of your choice. Now, if you are using Python code, you can just call os.getenv("command_var"), which will return the output of the shell command executed previously.
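For illustration, a minimal sketch of the consuming side (the script name read_command.py is hypothetical; note that an export only survives within the sh step that performs it, since each sh step starts a fresh shell, so the Python process must be launched from that same step):
import os

# read_command.py (hypothetical): reads the variable exported by the pipeline.
# This only sees command_var if this process was started from the same sh
# step that ran `export command_var=...`.
command_var = os.getenv("command_var", "")
print("output of the earlier shell command: " + command_var)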
How to read a shell variable in Groovy / how to assign a shell command's output to a Groovy variable.
Requirement: open a text file, read the lines using shell, store the values in Groovy, and get the parameters from each line.
Here ',' is the delimiter.
Ex: releaseModule.txt
./APP_TSBASE/app/team/i-home/deployments/ip-cc.war/cs_workflowReport.jar,configurable-wf-report,94,23crb1,artifact
./APP_TSBASE/app/team/i-home/deployments/ip.war/cs_workflowReport.jar,configurable-temppweb-report,394,rvu3crb1,artifact
========================
Here we want to get the module name (2nd parameter, configurable-wf-report), the build number (3rd parameter, 94), and the commit id (4th parameter, 23crb1).
def module = sh(script: """awk -F',' '{ print \$2 "," \$3 "," \$4 }' releaseModule.txt | sort -u """, returnStdout: true).trim()
echo module
List lines = module.split( '\n' ).findAll { !it.startsWith( ',' ) }
def buildid
def Modname
lines.each {
    List det1 = it.split(',')
    buildid = det1[1].trim()
    Modname = det1[0].trim()
    tag = det1[2].trim()
    echo Modname
    echo buildid
    echo tag
}
If you don't have a single sh command but a block of sh commands, returnStdout won't work then.
I had a similar issue, and I applied something which is not a clean way of doing this, but it eventually worked and served the purpose.
Solution:
In the shell block, echo the value and write it into some file.
Outside the shell block and inside the script block, read this file, trim it, and assign it to any local/params/environment variable.
Example:
steps {
    script {
        sh '''
            # Using '>' so a new file is created every time, giving the newest value of PATH
            echo $PATH > path.txt
        '''
        path = readFile(file: 'path.txt')
        path = path.trim()     // local groovy variable assignment
        // One can assign these values to env and params as below -
        env.PATH = path        // if you want to assign it to an env var
        params.PATH = path     // if you want to assign it to a params var
    }
}
The easiest way is this:
my_var=`echo 2`
echo $my_var
Output:
2
Note that this is not a simple single quote; it is a backquote ( ` ).
I set a variable whose value is a string with spaces when starting a new bash:
VAR='my variable with spaces' /bin/bash
And now if I want to start a new bash with the same environment, I would do something like:
ENV=$(cat /proc/self/environ | xargs -0 | grep =)
env -i - $ENV /bin/bash
But the thing is, in /proc/self/environ this variable appears without quotes, so the last command throws: env: variable: No such file or directory
How can I work around this limitation?
PS: this is a simplified version of the following issue: https://github.com/jpetazzo/nsenter/issues/62
I think the answer here is to not use a shell script to set things up. Using a higher-level language makes it much easier to parse /proc/<PID>/environ into something useful. Here's a short example:
#!/usr/bin/python

import os
import argparse

def parse_args():
    p = argparse.ArgumentParser()
    p.add_argument('pid')
    p.add_argument('command', nargs=argparse.REMAINDER)
    return p.parse_args()

def main():
    args = parse_args()

    # /proc/<pid>/environ is a NUL-delimited list of VAR=value entries.
    env = {}
    with open('/proc/%s/environ' % args.pid) as fd:
        for envspec in fd.read().split('\000'):
            if not envspec:
                continue
            varname, varval = envspec.split('=', 1)
            env[varname] = varval

    print(env)
    os.execvpe(args.command[0], args.command, env)

if __name__ == '__main__':
    main()
Put this in a file called env-from, make it executable, and then you can run:
env-from <pid> bash
And you'll get a shell using the environment variables from the specified process.
Just add -L1 to xargs (max non-blank input lines per command line):
xargs -0 -L1 -a /proc/self/environ
This will give you each variable on a separate line, which makes it easier to process. Or simply use
strings /proc/self/environ
I've written a simple shell script "sample.sh" as below:
#!/bin/bash
PARAM1="Parameter1"
PARAM2=\"\"
echo "param1-->[$PARAM1] - param2-->[$PARAM2]"
# sample is a compiled binary that just prints its command line arguments.
./sample -param1 $PARAM1 -param2 $PARAM2
The script is run with the -x option as:
bash -x sample.sh
The output I got is:
[tspot#raspberrypi : ~/src/sample]$ bash -x sample.sh
+ PARAM1=Parameter1
+ PARAM2='""'
+ echo 'param1-->[Parameter1] - param2-->[""]'
param1-->[Parameter1] - param2-->[""]
+ ./sample -param1 Parameter1 -param2 '""'
arg[0] - [./sample]
arg[1] - [Parameter1]
arg[2] - [""]
[tspot#raspberrypi : ~/src/sample]$
My question is: why do we get single quotes surrounding the empty string for -param2 in the line below?
+ ./sample -param1 Parameter1 -param2 '""'
I would need the line to be
+ ./sample -param1 Parameter1 -param2 ""
Thanks in Advance. Someone please help me out.
That's just the way bash -x formats things in its debug output. It adds the extra ' ' to indicate that the string is literally "" and not an empty string. If you look a few lines below, you can see that ./sample does have the expected output: arg[2] - [""].
The problem is that your variable $PARAM2 is the string "" (literally two double quotes).
For what you want, I think you just need to initialize it with:
PARAM2=""
# Or
PARAM2=''
Both are equivalent here.
FYI, the difference between single ' and double " quotes in Bash is that a variable like $foo inside double quotes will be expanded, whereas inside single quotes the $ stays literal:
$ foo=bar
$ echo ">$foo<"
>bar<
$ echo '>$foo<'
>$foo<