groovy sed error: sed: -e expression #1, char 1: unknown command: `''

The Groovy code is:
def cmd = "sed -i \'1 i <?xml version=\"1.1\"?>\' test.xml"
println cmd
println cmd.execute().err.text
Output:
sed -i '1 i <?xml version="1.1"?>' test.xml
sed: -e expression #1, char 1: unknown command: `''
Here is the actual command, sed -i '1 i <?xml version="1.1"?>' test.xml, which runs fine from the command line. But Groovy's execute() does not handle it correctly. How can I fix this?
Update 1:
I also tried the commands below, but they still show the same error:
def cmd = /sed -i '1 i <?xml version="1.1"?>' test.xml/
def cmd = "sed -i '1 i <?xml version=\"1.1\"?>' test.xml"
Update note:
To check and update an XML file:
def insertversion(String filename)
{
    def lines = new File(filename).readLines()
    if (!(lines.get(0)).contains('xml version'))
    {
        def cmd = ['sed', '-i', '1 i <?xml version="1.1"?>', filename]
        cmd.execute()
    }
}

In this case, execute the shell command as a list of command and parameters instead of executing the command as a String:
def cmd = ['sed', '-i', '1 i <?xml version="1.1"?>', 'test.xml']
println cmd
println cmd.execute().err.text
After running the script like this, the <?xml version="1.1"?> part gets duplicated in the test.xml file (the same behavior as when running the given command from the command line).
What is the difference between List.execute() and String.execute()?
If you execute a shell command as a String, the java.lang.Runtime.exec() method will use java.util.StringTokenizer to split your input String into an array. In your case the tokenizer will create 7 tokens; you can check it by running the following script:
def cmd = "sed -i \'1 i <?xml version=\"1.1\"?>\' test.xml"
def tokenizer = new StringTokenizer(cmd)
def tokens = []
while (tokenizer.hasMoreTokens()) {
    tokens << tokenizer.nextToken()
}
tokens.each { println it }
It outputs:
sed
-i
'1
i
<?xml
version="1.1"?>'
test.xml
You can also verify it by running a debugger with a breakpoint set in the java.lang.Runtime class at line 96.
This is of course incorrect. When using a list to execute the shell command, we get the correct array of command-line parameters.
The general rule of thumb is that if your shell command contains characters that may confuse java.util.StringTokenizer, it's better to use a list to define the correct set of command-line parameters.
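For completeness, a minimal sketch (not part of the original answer; it assumes sed is on the PATH and test.xml exists) that also waits for the process and inspects its exit code and stderr:
// The list form keeps '1 i <?xml version="1.1"?>' as a single argument
def cmd = ['sed', '-i', '1 i <?xml version="1.1"?>', 'test.xml']
def proc = cmd.execute()
proc.waitFor()                            // wait for sed to finish
println "exit code: ${proc.exitValue()}"  // 0 on success
println "stderr: ${proc.err.text}"        // empty on success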

Related

syntax error: unexpected end of file (expecting "fi")

So I have a simple script running inside an if statement, and I always get:
syntax error: unexpected end of file (expecting "fi")
I am wondering what a possible solution for this could be.
def call(Map config) {
    withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: 'JENKINS_AWS'],
                     sshUserPrivateKey(credentialsId: 'JENKINS-SSH', keyFileVariable: 'SSH_KEY')]) {
        sh """
        #!/bin/bash
        source add_ssh_key "${SSH_KEY}"
        source init_env "${TARGET_STAGE}"
        source create-bastion-pod "${PROMETHEUS_PUSHGATEWAY}" "${PROMETHEUS_PUSHGATEWAY_PORT}"
        if [ ${TARGET_STAGE} == 'dev' ]; then
          cat <<-EOF | curl --data-binary #- \${BASTION_URL}/metrics/job/sone_job
          # TYPE some_metric counter
          some_metric{label="val1"} 42
          EOF
        fi
        delete-bastion-pod
        """
    }
}
<<- only strips tabs from the here-document; your closing delimiter appears to be indented (according to what Groovy actually presents to the shell) with a couple of spaces. Try something like
sh """
#!/bin/bash
if [ ${TARGET_STAGE} == 'dev' ]; then
cat <<EOF | curl --data-binary #- \${BASTION_URL}/metrics/job/some_job
# TYPE some_metric counter
some_metric{label="val1"} 42
EOF
fi
"""
Note that as far as the shell executing the script is concerned, the here-document and the closing EOF aren't indented at all.
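If you would rather keep the body indented, here is a hedged sketch (not from the original answer; it assumes the surrounding pipeline stays unchanged and that curl's @- was intended, i.e. read the POST body from stdin). <<- strips leading tab characters only, so the heredoc lines, including the closing EOF, are indented with real tabs, written as \t inside the Groovy string:
sh """
#!/bin/bash
if [ ${TARGET_STAGE} == 'dev' ]; then
\tcat <<-EOF | curl --data-binary @- \${BASTION_URL}/metrics/job/some_job
\t# TYPE some_metric counter
\tsome_metric{label="val1"} 42
\tEOF
fi
"""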

How to get the file extension in Jenkins Pipeline

Using bash I can get the file extension by simply:
FILE_BASENAME="abcd.001.xyz.txt"
FILE_EXTENSION="${FILE_BASENAME##*.}"
echo $FILE_EXTENSION
I can also get the same result using slightly different syntax:
FILE_BASENAME="abcd.001.xyz.txt"
RESULT=$(echo "${FILE_BASENAME##*.}")
echo $RESULT
Regardless of which way is used, both approaches produce the string txt.
Unfortunately, the same syntax in a Jenkins pipeline results in an empty string:
FILE_EXTENSION = sh(script: '$(echo "${FILE_BASENAME##*.}")', returnStdout: true).trim()
I have also tried a variation of this command with
FILE_EXTENSION = sh(script: 'echo $(echo "${FILE_BASENAME##*.}")', returnStdout: true).trim()
which doesn't work either.
How to get the file extension in Jenkins?
Designed to support three-character file extensions:
FILE_BASENAME="abcd.001.xyz.txt"
FILE_NAME = FILE_BASENAME.replaceAll(".[a-zA-Z]{3}\$", "")
FILE_EXT = FILE_BASENAME.replaceAll(FILE_NAME, "")
echo ("FILE_NAME: ${FILE_NAME} FILE_EXT: ${FILE_EXT}")
Output:
FILE_NAME: abcd.001.xyz FILE_EXT: .txt
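If extensions of other lengths need to be handled as well, an alternative sketch (not from the original answer; plain Groovy, runnable inside a pipeline script block) based on lastIndexOf:
def FILE_BASENAME = "abcd.001.xyz.txt"
def idx = FILE_BASENAME.lastIndexOf('.')   // position of the last dot, -1 if there is none
def FILE_NAME = idx >= 0 ? FILE_BASENAME.substring(0, idx) : FILE_BASENAME
def FILE_EXT  = idx >= 0 ? FILE_BASENAME.substring(idx) : ''
echo("FILE_NAME: ${FILE_NAME} FILE_EXT: ${FILE_EXT}")
Like the original answer, FILE_EXT keeps the leading dot (.txt); use substring(idx + 1) if you only want txt.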

How do I get the output of a shell command executed from a Jenkinsfile (Groovy) into a variable?

I have something like this in a Jenkinsfile (Groovy) and I want to record the stdout and the exit code in variables in order to use the information later.
sh "ls -l"
How can I do this, especially as it seems that you cannot really run any kind of groovy code inside the Jenkinsfile?
The latest version of the pipeline sh step allows you to do the following:
// Git committer email
GIT_COMMIT_EMAIL = sh(
    script: 'git --no-pager show -s --format=\'%ae\'',
    returnStdout: true
).trim()
echo "Git committer email: ${GIT_COMMIT_EMAIL}"
Another feature is the returnStatus option.
// Test commit message for flags
BUILD_FULL = sh(
    script: "git log -1 --pretty=%B | grep '\\[jenkins-full]'",
    returnStatus: true
) == 0
echo "Build full flag: ${BUILD_FULL}"
These options were added based on this issue.
See official documentation for the sh command.
For declarative pipelines (see comments), you need to wrap the code in a script step:
script {
    GIT_COMMIT_EMAIL = sh(
        script: 'git --no-pager show -s --format=\'%ae\'',
        returnStdout: true
    ).trim()
    echo "Git committer email: ${GIT_COMMIT_EMAIL}"
}
Current Pipeline version natively supports returnStdout and returnStatus, which make it possible to get output or status from sh/bat steps.
An example:
def ret = sh(script: 'uname', returnStdout: true)
println ret
The official documentation.
The quick answer is this:
sh "ls -l > commandResult"
result = readFile('commandResult').trim()
I think there is a feature request to be able to get the result of the sh step, but as far as I know, there is currently no other option.
EDIT: JENKINS-26133
EDIT 2: Not quite sure since which version, but the sh/bat steps can now return the standard output; simply:
def output = sh returnStdout: true, script: 'ls -l'
If you want to get the stdout AND know whether the command succeeded or not, just use returnStdout and wrap it in an exception handler:
scripted pipeline
try {
    // Fails with non-zero exit if dir1 does not exist
    def dir1 = sh(script: 'ls -la dir1', returnStdout: true).trim()
} catch (Exception ex) {
    println("Unable to read dir1: ${ex}")
}
output:
[Pipeline] sh
[Test-Pipeline] Running shell script
+ ls -la dir1
ls: cannot access dir1: No such file or directory
[Pipeline] echo
unable to read dir1: hudson.AbortException: script returned exit code 2
Unfortunately hudson.AbortException is missing any useful method to obtain that exit status, so if the actual value is required you'd need to parse it out of the message (ugh!)
Contrary to the Javadoc https://javadoc.jenkins-ci.org/hudson/AbortException.html the build is not failed when this exception is caught. It fails when it's not caught!
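If you do need the numeric value, here is a hedged sketch (it assumes the message keeps the "script returned exit code N" format shown in the output above) that parses it out of the exception message:
try {
    sh(script: 'ls -la dir1', returnStdout: true)
} catch (Exception ex) {
    // e.g. "script returned exit code 2"
    def m = (ex.message =~ /exit code (\d+)/)
    def exitCode = m.find() ? (m.group(1) as int) : -1   // -1 if the message carries no exit code
    echo "ls failed with exit code ${exitCode}"
}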
Update:
If you also want the STDERR output from the shell command, Jenkins unfortunately fails to properly support that common use-case. A 2017 ticket JENKINS-44930 is stuck in a state of opinionated ping-pong whilst making no progress towards a solution - please consider adding your upvote to it.
As to a solution now, there could be a couple of possible approaches:
a) Redirect STDERR to STDOUT with 2>&1
- but it's then up to you to parse that out of the main output, and you won't get the output if the command failed, because you're in the exception handler.
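A minimal sketch of approach (a), not from the original answer:
try {
    // stderr is merged into stdout, so warnings end up in the captured text,
    // but if the command fails you land in the catch block with no output at all
    def combined = sh(script: 'ls -la dir1 2>&1', returnStdout: true).trim()
    echo combined
} catch (Exception ex) {
    echo "Unable to read dir1: ${ex}"
}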
b) Redirect STDERR to a temporary file (the name of which you prepare earlier) with 2>filename (but remember to clean up the file afterwards) - i.e. the main code becomes:
def stderrfile = 'stderr.out'
try {
    def dir1 = sh(script: "ls -la dir1 2>${stderrfile}", returnStdout: true).trim()
} catch (Exception ex) {
    def errmsg = readFile(stderrfile)
    println("Unable to read dir1: ${ex} - ${errmsg}")
}
c) Go the other way: set returnStatus=true instead, dispense with the exception handler, and always capture output to a file, i.e.:
def outfile = 'stdout.out'
def status = sh(script:"ls -la dir1 >${outfile} 2>&1", returnStatus:true)
def output = readFile(outfile).trim()
if (status == 0) {
    // output is the directory listing from stdout
} else {
    // output is the error message from stderr
}
Caveat: the above code is Unix/Linux-specific - Windows requires completely different shell commands.
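For reference, a hedged sketch of the same idea on Windows with the bat step (assuming cmd.exe redirection syntax), not from the original answer:
def outfile = 'stdout.out'
// cmd.exe also understands 2>&1, so stderr is captured in the same file
def status = bat(script: "dir dir1 > ${outfile} 2>&1", returnStatus: true)
def output = readFile(outfile).trim()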
This is a sample case which I believe will make sense: the combined output of a multi-line script is captured, split on whitespace, and the last token is echoed.
node('master') {
    stage('stage1') {
        def commit = sh(returnStdout: true, script: '''echo hi
echo bye | grep -o "e"
date
echo lol''').split()
        echo "${commit[-1]} "
    }
}
For those who need to use the output in subsequent shell commands rather than in Groovy, something like this example could be used:
stage('Show Files') {
    environment {
        MY_FILES = sh(script: 'cd mydir && ls -l', returnStdout: true)
    }
    steps {
        sh '''
            echo "$MY_FILES"
        '''
    }
}
I found the examples on code maven to be quite useful.
All the above methods will work, but to use the variable as an environment variable inside your code you need to export the variable first.
script {
    sh " 'shell command here' > command"
    command_var = readFile('command').trim()
    sh "export command_var=$command_var"
}
Replace the shell command with the command of your choice. Now if you are using Python code, you can just call os.getenv("command_var"), which will return the output of the shell command executed previously.
How to read a shell variable in Groovy / how to assign a shell return value to a Groovy variable.
Requirement: open a text file, read the lines using the shell, store the values in Groovy, and get the parameters from each line.
Here , (comma) is the delimiter.
Ex: releaseModule.txt
./APP_TSBASE/app/team/i-home/deployments/ip-cc.war/cs_workflowReport.jar,configurable-wf-report,94,23crb1,artifact
./APP_TSBASE/app/team/i-home/deployments/ip.war/cs_workflowReport.jar,configurable-temppweb-report,394,rvu3crb1,artifact
========================
Here we want to get the module name from the 2nd parameter (configurable-wf-report), the build number from the 3rd parameter (94), and the commit id from the 4th parameter (23crb1).
def module = sh(script: """awk -F',' '{ print \$2 "," \$3 "," \$4 }' releaseModules.txt | sort -u """, returnStdout: true).trim()
echo module
List lines = module.split( '\n' ).findAll { !it.startsWith( ',' ) }
def buildid
def Modname
lines.each {
    List det1 = it.split(',')
    buildid = det1[1].trim()
    Modname = det1[0].trim()
    tag = det1[2].trim()
    echo Modname
    echo buildid
    echo tag
}
If you don't have a single sh command but a block of sh commands, returnStdout won't work.
I had a similar issue and applied something which is not a clean way of doing this, but it eventually worked and served the purpose.
Solution:
In the shell block, echo the value and add it into some file.
Outside the shell block and inside the script block, read this file, trim it, and assign it to any local/params/environment variable.
Example:
steps {
    script {
        sh '''
            echo $PATH > path.txt
            # I am using '>' because I want to create a new file every time to get the newest value of PATH
        '''
        path = readFile(file: 'path.txt')
        path = path.trim() // local Groovy variable assignment
        // One can assign these values to env and params as below:
        env.PATH = path    // if you want to assign it to an env var
        params.PATH = path // if you want to assign it to a params var
    }
}
The easiest way is to use backquotes:
my_var=`echo 2`
echo $my_var
Output:
2
Note that it is not a simple single quote but a backquote ( ` ).

Environment variable with spaces in a string - How to use them from /proc/pid/environ

I set a variable whose value is a string with spaces when starting a new bash:
VAR='my variable with spaces' /bin/bash
And now if I want to start a new bash with the same environment, I would do something like:
ENV=$(cat /proc/self/environ | xargs -0 | grep =)
env -i - $ENV /bin/bash
But the thing is, in /proc/self/environ this variable is stored without quotes, so the last command throws: env: variable: No such file or directory
How can I work around this limitation?
PS: this is a simplified version of the following issue: https://github.com/jpetazzo/nsenter/issues/62
I think the answer here is to not use a shell script to set things up. Using a higher-level language makes it much easier to parse /proc/<PID>/environ into something useful. Here's a short example:
#!/usr/bin/python

import os
import sys
import argparse


def parse_args():
    p = argparse.ArgumentParser()
    p.add_argument('pid')
    p.add_argument('command', nargs=argparse.REMAINDER)
    return p.parse_args()


def main():
    args = parse_args()
    env = {}

    with open('/proc/%s/environ' % args.pid) as fd:
        for envspec in fd.read().split('\000'):
            if not envspec:
                continue
            varname, varval = envspec.split('=', 1)
            env[varname] = varval

    print env
    os.execvpe(args.command[0], args.command, env)


if __name__ == '__main__':
    main()
Put this in a file called env-from, make it executable, and then you can run:
env-from <pid> bash
And you'll get a shell using the environment variables from the specified process.
Just add -L1 to xargs (max non-blank input lines per command line):
xargs -0 -L1 -a /proc/self/environ
This will give you each variable on a separate line, which makes it easier to process. Or simply use
strings /proc/self/environ

groovy execute with parameters containing spaces

How do I provide arguments containing spaces to the execute method of strings in Groovy? Just adding quotes like one would in a shell does not help:
println 'ls "/tmp/folder with spaces"'.execute().text
This would give three broken arguments to the ls call.
The trick was to use a list:
println(['ls', '/tmp/folder with spaces'].execute().text)
Sorry man, none of the tricks above worked for me.
This piece of horrible code is the only thing that went through:
def command = 'bash ~my_app/bin/job-runner.sh -n " MyJob today_date=20130202 " '
File file = new File("hello.sh")
file.delete()
file << ("#!/bin/bash\n")
file << (command)
def proc = "bash hello.sh".execute() // Call *execute* on the file
One weird trick for people who need regular quote processing, pipes, etc.: use bash -c:
['bash','-c',
'''
docker container ls --format="{{.ID}}" | xargs -n1 docker container inspect --format='{{.ID}} {{.State.StartedAt}}' | sort -k2,1
'''].execute().text
Using a List feels a bit clunky to me.
This would do the job:
def exec(act) {
    def cmd = []
    act.split('"').each {
        if (it.trim() != "") { cmd += it.trim() }
    }
    return cmd.execute().text
}
println exec('ls "/tmp/folder with spaces"')
More complex example:
println exec('mysql "-uroot" "--execute=CREATE DATABASE TESTDB; USE TESTDB; \\. test.sql"')
The only downside is the need to put quotes around all your args, but I can live with that!
Did you try escaping the spaces?
println 'ls /tmp/folder\ with\ spaces'.execute().text
