I am working with Jenkins for CI/CD. I have two Linux machines, machine1 and machine2. I have installed Jenkins on machine1 and am using a Jenkinsfile (Groovy) to copy a file from machine1 to machine2 with an scp sh command, but it is failing because scp prompts for credentials at runtime, and these cannot be entered every time the Jenkinsfile runs. Is there any way to copy a file to machine2 without being prompted for credentials? Thanks in advance.
Jenkins has an existing mechanism for sharing files between different nodes. The stash step lets you put some files into a named stash (you can select them with Ant-style patterns) and then unstash them on a different node. This should solve your issue with credentials, because Jenkins transfers the files over its own agent channel rather than over scp. You can see an example in the Jenkins Pipeline documentation.
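As a minimal scripted-pipeline sketch (the stash name and file path are invented, and machine1 and machine2 are assumed to be labels of agents connected to Jenkins):

node('machine1') {
    // pick up the file produced in this node's workspace
    stash name: 'my-stash', includes: 'output/file.txt'
}
node('machine2') {
    // recreate the stashed file in this node's workspace
    unstash 'my-stash'
}

Note that this only works if machine2 is attached to Jenkins as an agent; for a plain remote host you would still need scp with key-based authentication, as described in the next answer.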
You could use a public/private key pair instead of a password. If you do not set a passphrase when generating the keys, it should work without prompting for credentials.
Have a look at either of the following, which explain in simple steps what commands to run to set up passwordless secure shell access (a short sketch of the idea follows the links):
http://www.phcomp.co.uk/Tutorials/Unix-And-Linux/ssh-passwordless-login.html
http://www.tecmint.com/ssh-passwordless-login-using-ssh-keygen-in-5-easy-steps/
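As a rough sketch of what those tutorials do, run something like this as the Jenkins user on machine1 (the user name and file path are placeholders):

ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa   # generate a key pair with an empty passphrase
ssh-copy-id jenkins@machine2               # install the public key on machine2 (asks for the password once)
scp myfile.txt jenkins@machine2:/tmp/      # from now on this runs without prompting

After that, the sh 'scp ...' step in the Jenkinsfile should go through without any interactive prompt.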
Related
I'm currently working with Linux VMs and I use Jenkins Pipelines to run various jobs written in bash. I have two options for where the code is written and maintained:
In pipelines with sh '#some code' (Git integrated)
In bash scripts placed in the VM with sh './bashscript'
Which one would you suggest?
Use Git to store the scripts or related code: Git is a version control system, and all users who have access can view the files or make changes.
When the Jenkins job runs, a workspace folder is created on the server the job is running on, and the script is copied from Git into that folder.
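As a rough declarative-pipeline sketch of the first option driving the second (the script path is hypothetical):

pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                // 'checkout scm' pulls the repository that contains the Jenkinsfile
                checkout scm
                // run the versioned script from the freshly populated workspace
                sh './scripts/deploy.sh'
            }
        }
    }
}

This way the script lives in Git alongside the Jenkinsfile instead of being maintained by hand on the VM.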
Gerrit trigger returns
"C:\Windows\System32\config\systemprofile.ssh\id_rsa" does not exist.
However, the file does exist, and Jenkins can clone repos using it.
This is on Windows Server 2016.
So first I had to move the key directly under C:\ so the Gerrit trigger could see it...
Second, I had to restart Jenkins, because even if I clicked Save the configuration was not saved...
I am also using it on Ubuntu and never had such problems there.
I use Jenkins to do an automatic weekly deployment to a Tomcat server, and it is fairly simple to do using curl with the Tomcat manager; since I am only uploading a .war file, it is very straightforward.
But when it comes to a backend console application, does anyone have any idea how to use Jenkins to upload an entire set of folders with files onto a Linux box? The project I have is built via Ant and has all the folders inside SVN.
A couple of things come to mind.
Probably the most straightforward thing to do is use the Ant scp task to push the directory or directories up to the server. You'll need the jsch jar on your Ant classpath to make it work, but that's not too bad to deal with; see the Ant documentation for the scp task. If you want to keep your main build script clean, just make another build script that Jenkins can run, named 'deploy.xml' or similar; a sketch follows. This has the added benefit that you can use it from places other than Jenkins.
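A rough deploy.xml along these lines (the host, user, and paths are invented):

<project name="deploy" default="push">
    <!-- requires jsch.jar on Ant's classpath -->
    <target name="push">
        <scp todir="deployer@server.example.com:/opt/app"
             keyfile="${user.home}/.ssh/id_rsa"
             passphrase=""
             trust="true">
            <fileset dir="build/output"/>
        </scp>
    </target>
</project>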
Another idea is to check the files out directly on the server from SVN. Again, Ant can help you with this if you use the sshexec task and run the svn command inside of it; see the sshexec docs.
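Something along these lines, again with the host, key, and repository URL as placeholders:

<sshexec host="server.example.com"
         username="deployer"
         keyfile="${user.home}/.ssh/id_rsa"
         passphrase=""
         trust="true"
         command="svn checkout svn://repo-host/project /opt/app"/>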
Finally, Jenkins has a "Publish Over SSH" plugin you might try out. I've not used it personally, but it looks promising!
I've manually installed Ant on many servers simply by unzipping the Ant files in a location and setting up the users' ~/.bash_profile so their PATH can see it.
I need to automate the setup now on servers which do not have internet connectivity.
We are using Nolio for deployment, but I don't care whether the automation is done via Nolio. If it can be scripted, I can easily just make Nolio call the script.
I don't think editing the users' .bash_profiles is a good way to do the automation.
So, assuming I get Ant on to the servers and unzip it, what's the best way to install it so that all users will have access to it?
You can try using pssh (parallel ssh). It's pretty awesome. Create a file with all your remote hosts, then run:
pssh -h hostsfile "command1 && command2 && command3"
You can use pscp to deliver scripts, then use pssh to execute them. This works very well. Alternatively, you could become a Puppet master and drive everything from Puppet. You can do some cool stuff with it, like automating builds based on a hostname convention: a LAMP build? Name the host web01.blarg.awesome or whatever, set up Puppet to recognize it with a regex, then deliver the appropriate packages.
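For instance, to push Ant out to every host and make it visible to all users via /etc/profile.d, rather than per-user .bash_profile edits (file names, versions, and paths are assumptions; the commands may be named pscp/pssh or parallel-scp/parallel-ssh depending on the distribution):

pscp -h hosts.txt apache-ant-1.10.14-bin.zip /tmp/    # hosts.txt lists one server per line
pssh -h hosts.txt "sudo unzip -o /tmp/apache-ant-1.10.14-bin.zip -d /opt"
pssh -h hosts.txt "echo 'export PATH=/opt/apache-ant-1.10.14/bin:\$PATH' | sudo tee /etc/profile.d/ant.sh"

On most distributions a file dropped into /etc/profile.d is sourced by every user's login shell, which addresses the "all users" requirement without touching individual .bash_profiles.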
GL.
I'm using following command to export my repository to a local path:
svn export --force svn://localhost/repo_name /share/Web/projects/project_name
Is there any reasonably easy way (Linux newbie here) to do the same over the FTP protocol, i.e. to export the repository to a remote server?
As far as I know, the last parameter of svn export has to be a local path, and this command does not support giving paths in the form of URLs like, for example:
ftp://user:pass@server/path/
So, I think some script has to be employed here to do the job.
I have asked some people about this and was advised that the easiest way is to export the repository to a local path, transfer it to the FTP server, and then purge the local path. Unfortunately, I failed right after the first step (export to a local path! :) So the supporting question is: can it be done on the fly, or does it really have to be split into two steps, export + FTP transfer?
Someone also advised me to set up a local SVN client on the remote server and do a simple checkout/update from my repository. But this is a solution only if everything else fails, as I want to extract the pure repository structure, without the .svn files I would get by going that way.
BTW: I'm using a QNAP TS-210, a simple NAS device with a very limited Linux on board, so many command-line commands, as well as any GUI, are not available to me.
EDIT: This is the second question in my "chain". Even if you help me to succeed here, I won't be able to automate this job (as I intend to) without your help on the question "SVN: Force svn daemon to run under different user". Can someone also take a look there, please? Thank you!
Well, if you're using Linux, you should be able to mount an ftpfs. I believe there was a module in the Linux kernel for this. Then I think you would also need FUSE.
Basically, if you can mount an ftpfs, you can write your svn export directly to the mounted folder.
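Assuming curlftpfs (one common FUSE-based ftpfs) happens to be available for the NAS, the idea would look roughly like this, with credentials and paths as placeholders:

mkdir -p /mnt/ftp
curlftpfs ftp://user:pass@server/path /mnt/ftp     # mount the FTP server as a local folder
svn export --force svn://localhost/repo_name /mnt/ftp/project_name
fusermount -u /mnt/ftp                             # unmount when finished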
Not sure about FTP, but SSH would be a lot easier and should have better compression. An example of sending your repo over SSH may look like:
svnadmin dump /path/to/repository | ssh -C username@servername 'svnadmin -q load /path/to/repository/on/server'
The URL where I found that info was on Martin Ankerl's site.
[update]
Based on the comment from @trejder on the question, to do an export over SSH my recommendation would be as follows:
do an svn export to a folder locally, then use the following command:
cd && tar czv src | ssh example.com 'tar xz'
where src is the folder you exported to, and example.com is the server.
This will take the files in the source folder, tar and gzip them, and send them over SSH; on the remote side, they are extracted directly onto the machine.
I wrote this a while back - maybe it would be of some use here: exup