I have a Jenkins job which connects to a remote host via SSH and runs a bash command.
I can run the same bash command from a terminal after logging into the remote Linux machine. But if I do so, I see that the environment variables are different when I print them with printenv.
And if I run the same bash command from Jenkins over ssh execCommand, I see yet another slightly different set of environment variables.
Is there any way to make that bash command run with the same environment variables, no matter where I run it from?
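One approach (not from the question, so treat it as a sketch): keep the variables in a single file and point BASH_ENV at it. Non-interactive bash, which is what Jenkins' ssh execCommand and plain `ssh host cmd` typically start, sources the file named by BASH_ENV automatically, so every launch method sees the same variables. The file name and variable below are made-up examples:

```shell
#!/bin/bash
# Sketch: one shared env file, sourced via BASH_ENV by every
# non-interactive bash, so Jenkins, ssh, and cron all agree.
envfile=$(mktemp)
echo 'export BUILD_FLAVOR=release' > "$envfile"

# env -i simulates the stripped-down environment a remote command gets;
# BASH_ENV makes bash read the shared file before running the command.
flavor=$(env -i BASH_ENV="$envfile" bash -c 'echo "$BUILD_FLAVOR"')
echo "$flavor"

rm -f "$envfile"
```

For interactive logins you would source the same file from ~/.bash_profile, so all three launch paths converge on one definition.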
My computer runs Windows 10 Enterprise.
I found this repo for creating a Node.js server for a chatbot. As you can see, there are options for starting the server. I tried to execute this command:
node app.js DF_PROJECT_ID="agent-human-handoff-sampl-jseo" DF_SERVICE_ACCOUNT_PATH="D:\Docs\TchatBot\clé_account_service_agent_human_operator\agent-human-handoff-sampl-jseo-3349b2f01974.json"
But I got this error: You need to specify a path to a service account keypair in environment variable DF_SERVICE_ACCOUNT_PATH
So what is wrong?
It's basically the same as jfriend00's solution, but I add node app.js at the end. Just follow the sequence below to run the commands.
set "DF_SERVICE_ACCOUNT_PATH=D:\Docs\TchatBot\clé_account_service_agent_human_operator\agent-human-handoff-sampl-jseo-3349b2f01974.json"
set "DF_PROJECT_ID=agent-human-handoff-sampl-jseo"
node app.js
By the way, if you use Linux or macOS, you'll use the following command (all on one line) to start the server; note that the key-file path shown here is a Windows path, so substitute wherever the JSON key actually lives on that system:
DF_SERVICE_ACCOUNT_PATH="D:\Docs\TchatBot\clé_account_service_agent_human_operator\agent-human-handoff-sampl-jseo-3349b2f01974.json" DF_PROJECT_ID="agent-human-handoff-sampl-jseo" node app.js
You can just set these in the environment in a command shell before running nodejs from that command shell:
set "DF_SERVICE_ACCOUNT_PATH=D:\Docs\TchatBot\clé_account_service_agent_human_operator\agent-human-handoff-sampl-jseo-3349b2f01974.json"
set "DF_PROJECT_ID=agent-human-handoff-sampl-jseo"
(Note: in cmd.exe, put the quotes around the whole name=value pair. With set VAR="value" the quotes become part of the stored value, and the quoted path would then fail to open.)
Then, you can run your program, and these variables will be in the environment that your node program inherits. If you want to automate this, you can create a small batch file that sets them and then runs your program. Keep in mind that setting environment variables like this sets them only for programs run from the current command shell, not for other command shells and not for programs run in other ways.
After setting those, your environment is now configured and you would run your program just as always:
node app.js
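The inheritance rule above can be made concrete with a small POSIX sh sketch (cmd.exe's set behaves analogously for the first case). Here `sh -c` stands in for `node app.js`, and DF_OTHER_VAR is a made-up variable, not from the question:

```shell
#!/bin/sh
# Two ways to hand a variable to a child process on Unix.

# 1) export first, then run: every child started afterwards inherits it.
DF_PROJECT_ID="agent-human-handoff-sampl-jseo"
export DF_PROJECT_ID
inherited=$(sh -c 'echo "$DF_PROJECT_ID"')   # stand-in for: node app.js

# 2) one-shot prefix: the variable exists only for that single command.
oneshot=$(DF_OTHER_VAR="demo" sh -c 'echo "$DF_OTHER_VAR"')

echo "$inherited"
echo "$oneshot"
echo "${DF_OTHER_VAR:-unset}"   # the prefix did not leak into this shell
```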
OS: Ubuntu 14.x
An environment variable is defined in /etc/environment:
export MY_VAR="Fom Xong"
my_bash.sh
#!/bin/bash
echo "My_VAR Value: $MY_VAR"
If I run the script using ./my_bash.sh, it works perfectly.
But when it is run from supervisor, it doesn't work:
command = /var/www/myapp/web/bash/my_bash.sh
Any clue?
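A likely explanation, offered here as an assumption rather than a confirmed diagnosis: /etc/environment is read by PAM during login sessions, and supervisord does not start its programs through a login session, so the variable never reaches the script. (Also note that /etc/environment is not a shell script, so the export keyword is not valid syntax there.) Supervisor has its own environment= key for this; a sketch of the program section, with the program name assumed:

```ini
; Hypothetical supervisor section; command path is the one from the question.
[program:my_bash]
command=/var/www/myapp/web/bash/my_bash.sh
environment=MY_VAR="Fom Xong"
```

After editing the config, re-read and restart the program with supervisorctl for the change to take effect.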
I have a bash script (myscript.sh); one of its logical steps is to run an ssh command against a Windows machine running OpenSSH.
When I run the script (myscript.sh) from the shell, everything works fine.
But when I run the same script from Jenkins (CentOS 7.3), it fails to retrieve content via the ssh command: ssh root@windows-server hostname.
Please, I need your help.
The user running the Jenkins process probably does not have the correct execute rights or group membership to do so.
Try
sudo -u "jenkinsuser" myscript.sh
If that fails, you have confirmed the issue.
If so, change the execute rights on your script, or put the server process owner in the right group.
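The execute-bit part of this diagnosis is easy to reproduce and fix locally. A sketch, using a temporary stand-in script rather than the real myscript.sh:

```shell
#!/bin/sh
# Reproduce a missing-execute-bit failure and its fix on a scratch file.
tmp=$(mktemp)
printf '#!/bin/sh\necho ok\n' > "$tmp"

chmod -x "$tmp"
before=$([ -x "$tmp" ] && echo yes || echo no)   # not executable yet

chmod +x "$tmp"                                  # the fix
after=$([ -x "$tmp" ] && echo yes || echo no)
output=$("$tmp")                                 # now runs normally

echo "$before $after $output"
rm -f "$tmp"
```

The same `[ -x path ]` test, run as the Jenkins user via sudo -u, tells you whether that user can execute the real script.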
I am working with Docker on my Windows machine via Git Bash. Since Git Bash does not preserve its state when it closes, I need to set some Docker-related environment variables every time I start a new terminal. The command I would like to run before start-up is:
eval $(docker-machine env)
Or better yet, a bash script including other logic, for example: if the docker machine is not up, start the machine first. Is there a way to automatically run a bash command or script before opening a new Git Bash window?
I would recommend creating a new file under your home folder (~/), namely ~/.bashrc, which is read by your shell when it starts up. Add a function, say myStartUpFunction(), that runs your command as you need.
myStartUpFunction() {
    eval "$(docker-machine env)"
}
myStartUpFunction
This runs eval $(docker-machine env) every time a new terminal session is opened; without the eval, the command would only print the export lines instead of applying them.
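To also cover the "start the machine if it is not up" part of the question, the function can be extended along these lines. This is a sketch: it assumes the usual machine name `default`, and it guards against docker-machine being absent so the snippet is harmless on machines without it:

```shell
# Hypothetical ~/.bashrc snippet; "default" is the assumed machine name.
start_docker_env() {
    command -v docker-machine >/dev/null 2>&1 || return 0  # no-op if absent
    # Start the machine first if it is not already running.
    if [ "$(docker-machine status default 2>/dev/null)" != "Running" ]; then
        docker-machine start default
    fi
    # Import DOCKER_HOST and friends into the current shell.
    eval "$(docker-machine env default)"
}
start_docker_env
```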
I am having some difficulty running jobs over SSH. I have a series of networked machines which all have access to the same home folder (where my executable is installed). While working on one machine, I would like to be able to run my code through ssh using the following sort of command:
ssh -q RemoteMachine ExecutableDir/MyExecutable InputDir/MyInput
If I ssh into any of the machines I wish to run the job on remotely and simply run:
ExecutableDir/MyExecutable InputDir/MyInput
It runs without fail. However, when I run it through SSH, I get an error saying some shared libraries can't be found. Has anyone come across this sort of thing before?
OK, I figured it out myself.
It seems that when you run things through ssh in the way shown above, you don't inherit the environment variables (PATH and the like) that you would if you ssh'd in 'properly'. You can see this by running:
ssh RemoteMachine printenv
and comparing the output to what you would normally get when connected to the remote machine. The solution I then went for was to run something like the following:
ssh -q RemoteMachine 'source ~/.bash_profile && ExecutableDir/MyExecutable InputDir/MyInput'
which sources the remote machine's ~/.bash_profile first, so the executable starts with all the paths and settings it needs. Note the quotes: without them, the local shell would consume the &&, and everything after it would run on the local machine instead of the remote one.
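Why this works can be modeled locally: `source` runs in the same shell that then launches the executable, so variables exported by the profile are visible to it even when the shell started with an empty environment. The profile file below is a stand-in for ~/.bash_profile, and LD_LIBRARY_PATH is the variable typically behind "shared libraries can't be found" errors:

```shell
#!/bin/bash
# Minimal model of `ssh host 'source ~/.bash_profile && cmd'`.
profile=$(mktemp)
echo 'export LD_LIBRARY_PATH="$HOME/lib"' > "$profile"

# env -i plays the role of the bare environment a remote ssh command gets;
# sourcing the profile restores the library path before the command runs.
libpath=$(env -i HOME=/tmp bash -c "source $profile && echo \"\$LD_LIBRARY_PATH\"")
echo "$libpath"

rm -f "$profile"
```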