I am trying to execute a bash script from my Grails app. The script has an ssh call in it, and I am using keys to connect between the servers.
If I run the script via the CLI it works as expected, but when I run it via the Grails app only the part that deals with the local machine runs; I do not see any output from the remote script.
Any help pointing me in the right direction would be appreciated.
Bash Script:
echo "Starting Data pushed..."
ssh -t -i /path/to/.ssh/id_rsa_remote_user remote_user@remote_server 'sudo /opt/remote_user/script.sh'
echo "Starting Data pushed... Done!"
Grails/Groovy code:
class CommonService {
    def executeScript(String script, Integer waitTimeInMinutes = 1) {
        def sout = new StringBuilder()
        def serr = new StringBuilder()
        Integer exitCode = -1
        def proc
        try {
            proc = script.execute()
            proc.consumeProcessOutput(sout, serr)
            proc.waitForOrKill(waitTimeInMinutes * 60 * 1000)
            exitCode = proc.exitValue()
        }
        catch (e) {
            log.error("Script execution failed", e)
            serr << e.toString()
        }
        log.info "Executed Script: ${sout}"
        log.info "Executed Script Extended Output: \n${serr}"
        return [output: sout, error: serr, exitCode: exitCode]
    }
}
Map outcome = commonService.executeScript('/path/to/local/scripts/pushToProduction.sh', waitTime)
I am having issues with the Apache NiFi ExecuteScript processor.
Following the ExecuteScript Cookbook tutorial
https://community.cloudera.com/t5/Community-Articles/ExecuteScript-Cookbook-part-2/ta-p/249018, I was able to write a Groovy script that writes to the output stream.
I am writing a JSON string to the output stream.
However, on execution in NiFi, I get an error as depicted at the following link:
https://imgur.com/jYgH8EY.png
Below is the code:
import groovy.json.JsonBuilder
import org.apache.commons.io.IOUtils
import java.nio.charset.StandardCharsets
import org.apache.nifi.processor.io.StreamCallback

flowFile = session.get()
if (flowFile == null) {
    return
}
def incomingFlowFileName = flowFile.getAttribute('filename')
def pathToIngestionScript = pathtobashscript.value
def command = '''
docker ps | grep 'visallo/dev' | awk '{print $1}'
'''
def containerId = ['bash', '-c', command].execute().in.text
if (containerId.replaceAll("\\s", "").length() != 0) {
    /* "docker exec -i " + container_id + " bash < " + path_to_bash_script */
    "docker exec -i ${containerId} bash < ${pathToIngestionScript}".execute()
} else {
    /* ingest data like a savage */
}
def result = ["fileId": incomingFlowFileName.tokenize('*')[1], "status": "2"]
flowFile = session.write(flowFile, { outputStream ->
    outputStream.write(new JsonBuilder(result).toPrettyString().getBytes(StandardCharsets.UTF_8))
} as StreamCallBack)
session.transfer(flowFile, REL_SUCCESS)
Doesn't org.apache.nifi.processor.io.StreamCallback exist in the script execution space?
I am running NiFi 1.9.2.
You have written as StreamCallBack, but the class is StreamCallback (without the capital B).
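The coercion itself is standard Groovy closure-to-interface casting, so once the name is spelled correctly it resolves fine. As a minimal, NiFi-free sketch of the same pattern (using Runnable only because it is on every classpath):

```groovy
// Groovy coerces a closure to a single-method interface with 'as' --
// the same pattern as '{ ... } as StreamCallback' in the NiFi script
def messages = []
Runnable task = { messages << 'ran' } as Runnable
task.run()
assert messages == ['ran']
// A misspelled interface name (e.g. StreamCallBack) never gets this far:
// the script fails to compile with "unable to resolve class", which is
// the error the typo triggers inside ExecuteScript
```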
I am running the Groovy script below to fetch dynamic values from an AWS S3 bucket. The script works fine and fetches all objects, as shown in the output below.
Current output:
test-bucket-name/test/folde1/1.war
test-bucket-name/test/folder2/2.war
test-bucket-name/test/folder3/3.txt
Whereas I want to display only the *.war files from "test-bucket-name", like below:
1.war
2.war
My Script:
def command = 'aws s3api list-objects-v2 --bucket=test-bucket-name --output=text'
def proc = command.execute()
proc.waitFor()
def output = proc.in.text
def exitcode = proc.exitValue()
def error = proc.err.text
if (error) {
println "Std Err: ${error}"
println "Process exit code: ${exitcode}"
return exitcode
}
return output.split()
Please let me know how to extract/display only the .war files from the test-bucket-name folder.
First, filter the entries that end with .war. Then split each entry once again (on /) and pick the last element:
def input = '''test-bucket-name/test/folde1/1.war
test-bucket-name/test/folder2/2.war
test-bucket-name/test/folder3/3.txt'''
input
.split()
.findAll { it.endsWith('.war') }
.collect { it.split('/') }
.collect { it[-1] }
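Wrapped as a small helper (the method name warNames is just an illustration), the same pipeline produces the two expected names:

```groovy
// Hypothetical helper around the filter/split pipeline above
List<String> warNames(String listing) {
    listing.split()                            // whitespace-split the listing into keys
           .findAll { it.endsWith('.war') }    // keep only .war entries
           .collect { it.split('/')[-1] }      // take the file name after the last '/'
}

def listing = '''test-bucket-name/test/folde1/1.war
test-bucket-name/test/folder2/2.war
test-bucket-name/test/folder3/3.txt'''

assert warNames(listing) == ['1.war', '2.war']
```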
I'm trying to configure a Jenkins build trigger from a Jira post-function Groovy script.
Here is my Groovy code:
import com.atlassian.jira.component.ComponentAccessor
import com.atlassian.jira.issue.CustomFieldManager
import com.atlassian.jira.issue.Issue
import com.atlassian.jira.issue.fields.CustomField
import com.onresolve.scriptrunner.runner.util.UserMessageUtil

def WANITOPUSHField = ComponentAccessor.getCustomFieldManager().getCustomFieldObject(10802) // custom field id
def WANITOPUSHValue = issue.getCustomFieldValue(WANITOPUSHField)
def SelectVersionField = ComponentAccessor.getCustomFieldManager().getCustomFieldObject(10805) // custom field id
def SelectVersionValue = issue.getCustomFieldValue(SelectVersionField)
if (WANITOPUSHField != null) {
    if (WANITOPUSHValue.toString() == 'Yes') {
        'curl --user USERNAME:PASSWORD "http://JENKINS_URL/job/deploy-dev-test/buildWithParameters?token=MYTOCKEN&ENV=1"'.execute()
        UserMessageUtil.success("Jenkins Build started")
    } else {
        UserMessageUtil.success("Condition not successful: " + WANITOPUSHValue.toString())
    }
}
Here I have used a curl command to trigger a Jenkins build when the Jira ticket status changes, but the curl command is not working.
It shows this output in the alert box:
java.lang.UNIXProcess@4d0c79da
I don't know what this means, or whether the command executed successfully. Can anyone help me with this, or suggest a different Groovy approach to achieve it?
"something".execute() returns an instance of the UNIXProcess Java class. Since its toString() method is not overridden, you see something like java.lang.UNIXProcess@4d0c79da.
Here is some code that will help you get the shell command's output:
def command = 'curl --user USERNAME:PASSWORD "http://JENKINS_URL/job/deploy-dev-test/buildWithParameters?token=MYTOCKEN&ENV=1"'
def proc = command.execute()
proc.waitFor()
println "Process exit code: ${proc.exitValue()}"
println "Std Err: ${proc.err.text}"
println "Std Out: ${proc.in.text}"
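One caution as a general Groovy note, not anything Jenkins-specific: reading proc.in.text only after waitFor() can deadlock if the process writes more output than the pipe buffer holds. waitForProcessOutput consumes both streams while waiting; a sketch (echo stands in for the real curl invocation):

```groovy
def command = ['echo', 'build triggered']  // stand-in for the real curl command
def proc = command.execute()
def sout = new StringBuilder()
def serr = new StringBuilder()
// Consumes stdout and stderr while waiting, so the child process
// can never block on a full output buffer
proc.waitForProcessOutput(sout, serr)
println "Process exit code: ${proc.exitValue()}"
println "Std Out: ${sout}"
println "Std Err: ${serr}"
```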
I have an issue executing shell commands on a remote server.
I have tried various solutions, and I have one that works but is hard to maintain: I use a batch file that launches PuTTY, which connects to the remote server and sends the command.
i.e. in Groovy:
def batchFile = "C:\\Validation\\Tests_Auto\\Scripts\\remote_process\\setOldDate.bat"
Runtime.runtime.exec(batchFile)
and in my batch file:
c:
cd C:\Validation\Tests_Auto\Scripts\remote_process\
putty.exe -ssh root@xx.xx.xx.xx -pw **** -m "C:\Validation\Tests_Auto\Scripts\remote_process\setOldDate.txt"
setOldDate.txt contains the command date -s @1522018800
This works. However, I'd like to launch it in a cleaner way, either avoiding the text file for the command or, better, avoiding PuTTY altogether.
I tried several other ways to do the same thing, but they don't work. I think I'm not far off, but I need a little help.
I tried to launch a direct command via ssh:
Runtime.getRuntime().exec('"c:\\Program Files\\OpenSSH\\bin\\ssh.exe" root:****@xx.xx.xx.xx date -s @1522018800')
I'd be grateful if anyone could help.
Thanks.
@Grab(group='com.jcraft', module='jsch', version='0.1.54')
def ant = new AntBuilder()
ant.sshexec( host:"somehost", username:"dude", password:"yo", command:"touch somefile" )
For the other sshexec and scp task parameters, see the docs:
https://ant.apache.org/manual/Tasks/sshexec.html
https://ant.apache.org/manual/Tasks/scp.html
For SoapUI:
This method uses Apache Ant + jsch-0.1.54.jar, and it is the only way I know of for SoapUI:
Download the following libraries and put them into the soapui\bin\endorsed directory (create the endorsed directory):
https://central.maven.org/maven2/org/apache/ant/ant/1.9.11/ant-1.9.11.jar
https://central.maven.org/maven2/org/apache/ant/ant-launcher/1.9.11/ant-launcher-1.9.11.jar
https://central.maven.org/maven2/com/jcraft/jsch/0.1.54/jsch-0.1.54.jar
Edit soapui\bin\soapui.bat and add the following line where the other JAVA_OPTS are defined:
set JAVA_OPTS=%JAVA_OPTS% -Djava.endorsed.dirs="%SOAPUI_HOME%endorsed"
That's because the Ant libs must be loaded before Groovy.
Then the code above should work in SoapUI (except the @Grab).
Alternatively, you can download only jsch-XXX.jar into the existing soapui\bin\ext directory and use the jsch library directly from Groovy.
See the examples: http://www.jcraft.com/jsch/examples/
or search for Groovy jsch examples.
Finally, compiling my various research and struggling to fit my environment constraints (Groovy in SoapUI), I ended up with the following solution, which works for me:
download jsch-0.1.54.jar and put it in C:\Program Files\SmartBear\ReadyAPI-2.3.0\bin\ext
then use the following Groovy script:
import java.util.Properties
import com.jcraft.jsch.ChannelExec
import com.jcraft.jsch.JSch
import com.jcraft.jsch.Session

def ip = context.expand( '${#Project#projectEndpoint}' )
Session session = null
try {
    JSch jsch = new JSch()
    session = jsch.getSession("root", "$ip", 22)
    session.setPassword("****")
    // Avoid asking for key confirmation
    Properties prop = new Properties()
    prop.put("StrictHostKeyChecking", "no")
    session.setConfig(prop)
    session.connect()
    // SSH channel
    ChannelExec channelssh = (ChannelExec) session.openChannel("exec")
    // Execute command
    //channelssh.setCommand("date -s @1520018000") // change date
    channelssh.setCommand("ntpdate -u pool.ntp.org") // restore date
    channelssh.connect()
    channelssh.disconnect()
}
catch (Exception e) {
    log.info "exception : " + e
}
finally {
    session?.disconnect()
}
UPGRADE
Here is a generalization I've made as my needs evolved. The following script, still using jsch, allows sending any command.
It also performs host checking, which eliminates the risk introduced by disabling it.
The user and password are passed as parameters.
import java.util.Properties
import com.jcraft.jsch.ChannelExec
import com.jcraft.jsch.JSch
import com.jcraft.jsch.Session
import java.util.regex.Pattern

def ip = context.expand( '${get endpoint#endpoint}' )
ip = ip.replaceFirst("http[s]?://", "")
def user = context.expand( '${#Project#ssh_user}' )
def password = context.expand( '${#Project#ssh_password}' )
def command = context.expand( '${#TestCase#command}' )
def timeout = context.expand( '${#TestCase#timeout_ms}' )
if (timeout == "")
    timeout = 1000 // default timeout 1s
else
    timeout = timeout.toInteger()
log.info "command = " + command

Session session = null
try {
    JSch jsch = new JSch()
    session = jsch.getSession(user, ip, 22)
    session.setPassword(password)
    jsch.setKnownHosts(System.getProperty("user.home") + "/.ssh/known_hosts")
    session.connect()
    // SSH channel
    ChannelExec channelssh = (ChannelExec) session.openChannel("exec")
    // Execute command
    channelssh.setCommand(command)
    InputStream commandOutput = channelssh.getInputStream()
    channelssh.connect()
    int readByte = commandOutput.read()
    outputBuffer = []
    // timeout to avoid an infinite loop
    while ((readByte != -1) && (timeout > 0)) {
        outputBuffer.add(readByte)
        readByte = commandOutput.read()
        timeout = timeout - 1
    }
    // convert the collected bytes into a string
    outputBuffer = outputBuffer as byte[]
    output = new String(outputBuffer, "UTF-8")
    sleep(3000)
    channelssh.disconnect()
    testRunner.testCase.setPropertyValue("cmd_output", output)
}
catch (Exception e) {
    msg = "exception : " + e
    log.error msg
    testRunner.fail(msg)
}
finally {
    session?.disconnect()
}
In Groovy I am trying to run two calls to git using Process:
def fireCommand(String command) {
    def proc = command.execute()
    proc.waitFor()
    println "Process exit code: ${proc.exitValue()}"
    def result = proc.in.text
    println "Std Err: ${proc.err.text}"
    println "Std Out: ${proc.in.text}"
}

def executeOnShell(String command) {
    fireCommand("git --version")
    fireCommand("git status")
}
But only the first call works. The second call throws:
java.io.IOException: Stream closed
From what I understand I am NOT reusing the same process, so why the error?
When you use inputStream.text (i.e. the inputStream.getText() method), the input stream is closed before the method returns its result: the content of the stream as a String. So when you called
println "Std Out: ${proc.in.text}"
it tried to read from the same stream, which had already been closed.
println "Std Out: $result"
will be OK.
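Putting the fix together, a corrected fireCommand reads each stream exactly once and keeps the result (echo is used in the demo call below; the git commands from the question work the same way):

```groovy
def fireCommand(String command) {
    def proc = command.execute()
    def result = proc.in.text   // read stdout once and keep it
    def err = proc.err.text     // likewise read stderr only once
    proc.waitFor()
    println "Process exit code: ${proc.exitValue()}"
    println "Std Err: $err"
    println "Std Out: $result"
    return result
}

// fireCommand('git --version')
// fireCommand('git status')
fireCommand('echo hello')
```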