Jenkins ssh remote hosts and keyfile path differences on slaves - linux

We have a Jenkins server that uses the SSH plugin to configure SSH remote hosts within the global Jenkins configuration. These SSH connections use a public/private key pair for authentication to the remote host.
We then use these configured SSH remote hosts in the build step "Execute shell script on remote host using ssh" (I believe this is also part of the SSH plugin) in a number of jobs.
The problem I'm having is that any job using "Execute shell script on remote host using ssh" must run on a Windows slave, since I haven't found a way to specify some sort of relative path to the keyfile.
On Windows the file would be located at: C:\Users\<username>\.ssh\
On Linux the file would be located at: /home/<username>/.ssh/
I've tried many iterations of using system environment variables, setting environment variables on the node configuration page and using these as part of the keyfile path without any luck.
Am I missing something? Is there a better way to handle it? There must be a way to manage keyfile locations that differ across slaves.

Unfortunately, I believe there isn't a way to specify a relative path — the keyfile path configured must be the same on every build slave. Far from ideal, I know.
I'm not sure how Windows would handle it, but perhaps something like /ssh/whatever.key would work, if you were to place the file at C:\ssh\whatever.key on Windows machines and /ssh/whatever.key on Linux machines (Windows generally resolves a path with a leading slash against the current drive).
In any case, the plugin has since been modified to use the Jenkins Credentials plugin, which allows you to manage username/password or private key-based credentials from the Jenkins UI, without having to place files on disk.
However, although this has been integrated into the SSH plugin, there has not yet been a release containing this functionality; it looks like one should be coming "soon".
So if the workaround doesn't work, you can try to:
Wait for a new release
Post on the jenkinsci-users list to ask about a new release
Download and install a recent build of the plugin
(though I would be careful to back up the existing job config before trying this; or try it on a separate Jenkins instance)

Related

Is it possible to write a shell script that takes input then will ssh into a server and run scripts on my local machine?

I have several scripts on my local machine. These scripts run install and configuration commands to set up my Elasticsearch nodes. I have 15 nodes coming and we definitely do not want to do that by hand.
For now, let's call them Script_A, Script_B, Script_C and Script_D.
Script_A will be the one to initiate the process; it currently contains:
#!/bin/bash
read -p "Enter the hostname of the remote machine: " hostname
echo "Now connecting to $hostname!"
# open an interactive SSH session as root on the remote host
ssh root@"$hostname"
This works fine obviously and I can get into any server I need to. My confusion is running the other scripts remotely. I have read a few other articles/SO questions but I'm just not understanding the methodology.
I will have a directory on my machine as follows:
Elasticsearch_Installation
|
|=> Scripts
      |
      |=> Script_A, Script_B, etc.
Can I run Script_A, which remotes into the server, then come back to my local machine and run Script_B and so on against the remote server, without moving the files over?
Please let me know if any of this needs to be clarified. I'm fairly new to the Linux environment in general, much less running remote installs from scripts over the network.
Yes, you can. Use ssh in non-interactive mode; it will be like launching a command in your local environment.
ssh root@"$hostname" /remote/path/to/script
Nothing will be changed in your local system; you will be back at the point where you launched the ssh command.
NB: this command will ask you for a password. If you want a truly non-interactive flow, set up passwordless login on the host, as explained here:
How to ssh to localhost without password?
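For reference, the usual key-based setup looks something like the sketch below; the hostname is a placeholder, and the commands run on the machine you connect from:
#!/bin/bash
# generate a key pair once on the local machine, if you don't have one already;
# an empty passphrase keeps later connections fully non-interactive
ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa -N ""
# copy the public key into root's authorized_keys on the remote host
# (this asks for the password one last time)
ssh-copy-id root@node01.example.com
# subsequent connections should no longer prompt for a password
ssh root@node01.example.com true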
You have a larger problem than just setting up many nodes: you have to be concerned with ongoing maintenance and administration of all those nodes, too. This is the space in which configuration management systems such as Puppet, Ansible, and others operate. But these have a learning curve to overcome, and they require some infrastructure of their own. You would probably benefit from one of them in the medium-to-long term, but if your new nodes are coming next week(ish) then you probably want a solution that you can use immediately to get up and running.
Certainly you can ssh into the server to run commands there, including non-interactively.
My confusion is running the other scripts remotely.
Of course, if you want to run your own scripts on the remote machine then they have to be present there, first. But this is not a major problem, for if you have ssh then you also have scp, the secure copy program. It can copy files to the remote machine, something like this:
#!/bin/bash
read -p "Enter the hostname of the remote machine: " hostname
echo "Now connecting to $hostname!"
# copy the scripts into root's home directory on the remote host
scp Script_[ABCD] root@"${hostname}":./
# then run the first one there
ssh root@"$hostname" ./Script_A
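From there, running the remaining scripts follows the same pattern; a small sketch, assuming each script is executable and should run in order:
# run the remaining scripts on the remote host, one after another
for script in Script_B Script_C Script_D; do
    ssh root@"$hostname" "./$script"
done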
I also manage Elasticsearch clusters with multiple nodes. A hack that works for me is using the Terminator terminal emulator and splitting it into multiple windows/panes, one for each ES node. Then you can broadcast the commands you type in one window to all the windows.
This way, you run commands and view their results almost interactively across all nodes in parallel. You can also save this window layout in Terminator and restore it quickly with a shortcut.
PS: this approach only works if you have a small number of nodes, and only for small tasks. The only thing that will scale with the number of nodes and the number and variety of tasks you need to perform is probably a config management solution like Puppet or Salt.
Fabric is another interesting project that may be relevant to your use case.

How do I get a Jenkins server to push bash code to a different server?

I have Jenkins installed on a Linux server. It can run builds on itself. I want to create either a Freestyle Project or an External Job that transfers a bash script and runs it on two separate Linux servers. Where in the GUI do I configure the destination server when I create a build? I have added "nodes" in the GUI. I can see the free space of the servers in the Jenkins GUI, so I know the credentials work. But when I create a build, I see no field that would tell Jenkins to push the bash scripts and run them on certain servers.
Are Jenkins nodes just servers that lend computing power to the master server? Or are they the targets of Jenkins builds? I believe that Jenkins "slaves" provide computing power to the Jenkins master server.
Normally Jenkins is used to integrate code. What do you call the servers that Jenkins pushes code into? They would be called Chef clients or Puppet agents if I was using Chef or Puppet for integrating code. I've been doing my own research, but I don't seem to know the specific vocabulary.
I've been working with such tools for several years, and as far as I know there isn't a ubiquitous language for this.
The nodes you can configure in Jenkins itself to add 'computing power' are indeed called build slaves.
Usually, external machines you copy to, deploy to, or otherwise use in jobs are called "target machines", as they are the targets of an action in your job.
Nodes can take several forms. You can use agents, which require a small installation on the node machine and create a running agent service with which Jenkins can communicate.
Another way is to simply allow Jenkins to connect to a machine via ssh and let it execute commands there. Both are called nodes and could be called build slaves, but the first are usually dedicated nodes, while the second can be any kind of machine as long as the ssh user can execute the build.
I also have not found any different terms for these two types.
It's probably not a real answer to your questions, but I do hope it helped.
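That said, for the concrete case of pushing a bash script to two target servers, a plain "Execute shell" build step can do the copying and remote execution itself. A minimal sketch, assuming key-based ssh access from the Jenkins machine to the targets and that deploy.sh sits in the job's workspace; the hostnames and the jenkins user are placeholders:
#!/bin/bash
# list of target machines to push the script to
TARGETS="server1.example.com server2.example.com"
for host in $TARGETS; do
    # copy the script from the workspace to the target...
    scp deploy.sh jenkins@"$host":/tmp/deploy.sh
    # ...and run it there
    ssh jenkins@"$host" "bash /tmp/deploy.sh"
done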

Jenkins publish over SSH - same commands for all servers

I want to use the publish over SSH Jenkins plugin to transfer some files and then run some commands on a target Linux machine. That target machine needs to be specified as a build parameter, though, as we have around 10 potential target machines.
I know I can use parameterized publishing to choose which host to use for the publish over SSH build step, but it seems I have to define every possible server here and duplicate the file set and commands for each one. Is there a way to supply the file set and commands just once and have this apply to all potential target machines?
i.e. can the bit in the red circle in this screenshot be specified just once for both servers? And I don't mean just putting these commands in a script and transferring that to all servers; I'd still need to put the transfer command in each server configuration block.
Thanks,
Sarah

Is there any jsch ChannelSftp function that works like the command 'cp'?

These days I am working with jsch-0.1.41, operating on resources on a remote Linux server via ChannelSftp. I find that there is no function providing functionality similar to the shell command "cp". I want to copy a file from one directory to another, where both directories are remote directories on the Linux server.
If anything in my presentation is wrong, please point it out. Thanks.
The SFTP protocol doesn't offer such a command, so JSch's ChannelSftp doesn't offer it either.
You have basically two choices:
Use a combination of get and put, i.e. download the file and upload it again. You can do this without local storage (simply connect one of the streams to the other), but this still requires moving the data twice through the network (and encrypting/decrypting twice), where it wouldn't really be necessary. Use this only if the other way doesn't work.
Don't use SFTP, but use an exec channel to execute a copy command on the server. On unix servers, this command is usually named cp, on Windows servers likely copy. (This will not work if the server's administrator somehow limited your account to SFTP-only access.)
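The second option boils down to a single server-side command; you can verify what the exec channel needs to run by testing it by hand with plain ssh first (user, host, and paths are placeholders):
# the exact command an exec channel would execute on the server
ssh user@server 'cp /remote/source/file.txt /remote/target/'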

Is there any way to transfer a .war or .jar from one server to another with Hudson?

I am using Hudson and wondering if there is any way to transfer a file (this file is on a Linux server) to another Linux server.
Maybe using the scp command from Linux; I just want to know if somebody has already done this, and maybe point me in the right direction.
Thanks
The SCP plugin should fit your needs.
It lets you choose between using a key or username/password. Destinations are configured in the central Hudson/Jenkins config, then you can choose a destination to upload to in a specific job. And in the job you can specify a pattern matching the desired file(s) to upload.
In your Hudson job? Sure, you could use scp in an "execute shell" build step. To get this to work you probably need to have public-key authentication set up between the two hosts; IIRC scp does not allow you to specify a password.
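A minimal sketch of such a build step; the artifact path, user, host, and destination are placeholders for your own values:
#!/bin/bash
# copy the built artifact from the workspace to the other server;
# relies on public-key authentication already being set up
scp target/myapp.war deploy@otherserver.example.com:/opt/tomcat/webapps/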
There are also several plugins for copying build artifacts to other systems (see http://wiki.hudson-ci.org/display/HUDSON/All+Plugins+by+Topic#AllPluginsbyTopic-Artifactuploaders), for instance there is an scp artifact archiver.
