I've manually installed Ant on many servers simply by unzipping the Ant files into a location and setting up ~/.bash_profile to configure the user's PATH to see it.
I need to automate the setup now on servers which do not have internet connectivity.
We are using Nolio for deployment, but I don't care if the automation is done via Nolio. If it can be scripted, I can easily just make Nolio call the script.
I don't think editing the users' .bash_profiles is a good way to do the automation.
So, assuming I get Ant onto the servers and unzip it, what's the best way to install it so that all users will have access to it?
You can try using pssh (parallel ssh). It's pretty awesome. Create a file with all your remote hosts, then run:
pssh -h hosts_file "command1 && command2 && command3"
You can use pscp to deliver scripts, then use pssh to execute them. Works very well. Alternatively, you could become a puppet master and work everything off Puppet. You can do some cool stuff with it, like automating builds based on a hostname convention. LAMP build? Name the host web01.blarg.awesome or whatever, set up Puppet to recognize it based on a regex, then deliver the appropriate packages.
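For example, a rough sketch of that flow (the hosts file and script name are placeholders; on some distributions the commands are installed as parallel-scp/parallel-ssh):
# hosts.txt lists one user@host per line
pscp -h hosts.txt install_ant.sh /tmp/install_ant.sh
pssh -h hosts.txt "chmod +x /tmp/install_ant.sh && /tmp/install_ant.sh"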
GL.
I need to build an RPM for a set of libraries. The problem is that I can compile and build it on the dev server, but I need to deploy it to all the QA, PROD and test servers, which have different user names. So I need to build my RPM so it picks the correct username for the respective server during installation. Can someone help me with this? How can I do this using Puppet? Is that possible?
This cannot be done cleanly with stock RPMs.
As a hack, you can have a %post script chown all the files after detecting which username to use, but that's going to cause errors later when you do things like try to verify the RPM installation.
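A rough sketch of what that %post hack could look like (the hostname patterns, usernames, and install path below are made-up placeholders):
%post
# Hack: pick the file owner based on which environment this host belongs to
case "$(hostname)" in
  qa*)   owner=qauser ;;
  prod*) owner=produser ;;
  *)     owner=testuser ;;
esac
chown -R "$owner:$owner" /opt/mylibs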
I'd recommend (a) having a new username that you use for the RPM/service across all servers or (b) building separate RPMs for each target environment.
I work on a SaaS-based product of my company, which is hosted on a private cloud. Every time a fresh BOM package is made available by the DEV team in the common share folder, we, the testing team, install the build on our application servers (3 multi-node servers, with one being primary and the other two being secondary).
The build installation is done entirely manually on the three app servers (Linux machines), and the steps we follow are as below:
Stop all the app servers
Copy the latest build from a code repository server (copy the .zip build file)
Unzip the contents of the file into a folder on the app server (using the unzip command)
Run a backup of the existing running build on all three servers (the command is something like: ant -f primaryBackup.xml, ant -f secondaryBackup.xml)
Then run the install on all three servers (the command is something like: ant -f primaryInstall.xml, ant -f secondaryInstall.xml)
Then restart all the servers and check whether the latest build has been applied successfully.
Question: I want to automate this entire process, such that I only have to give the latest build number to be installed and the script takes care of the whole installation.
Presently I don't understand how this can be done. Where should I start? Is this feasible? Would a shell script of the entire process be the solution?
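To make the goal concrete: what I have in mind is something that takes the build number and strings the manual steps above together, roughly along these lines (every path, host name, and build-file name below is just a placeholder):
#!/bin/sh
# Rough sketch only: deploy a given build number across the app servers
BUILD=$1
SERVERS="app1 app2 app3"
# Stop all the app servers
for s in $SERVERS; do ssh "$s" "/opt/app/bin/stopserver.sh"; done
# Copy the latest build from the repository server and unzip it
scp "repo.example.com:/builds/build-$BUILD.zip" /tmp/
unzip -o "/tmp/build-$BUILD.zip" -d "/opt/app/builds/$BUILD"
# Back up the running build, then install the new one
ant -f primaryBackup.xml && ant -f secondaryBackup.xml
ant -f primaryInstall.xml && ant -f secondaryInstall.xml
# Restart all the servers
for s in $SERVERS; do ssh "$s" "/opt/app/bin/startserver.sh"; done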
There are many build automation/continuous deployment tools out there that would help you automate your deployment pipeline. Some of the more popular configuration automation tools are Puppet, Chef, Ansible, and SaltStack. I only have experience with Ansible and Chef, but my impression has been that Chef is the more "user-friendly" option. I would start there... (Chef uses the Ruby language and Ansible uses Python).
I can answer specific questions about this, but your original question is really open-ended and broad.
free tutorials: https://learn.chef.io/
EDIT: I do not suggest provisioning your servers/deployments using bash scripts... that is generally messy, and as your automation grows (which it likely will), your code will gradually become unmanageable. Using something like Chef, you could set periodic checks for new code in your repositories and deploy when new code is detected (or upon certain conditions being met). You could write straight bash code within a Ruby block that will remotely stop/start a service like this (example):
bash 'restart_the_service' do
  # Run as this user, from this directory (placeholder values)
  cwd "current/working/directory"
  user 'user_name'
  code <<-EOH
    # Stop the service, wait, then start it again (script names are placeholders)
    nohup ./stopservice.sh &
    sleep 2m
    nohup ./startservice.sh &
    sleep 3m
  EOH
end
To copy code from git, for example... I am assuming GitHub in this example, as I do not know where your code resides:
git "/opt/mysources/couch" do
repository "git://git.apache.org/couchdb.git"
reference "master"
action :sync
ssh_wrapper "/some/path/git_wrapper.sh"
end
Let's say that your code is somewhere else... Bamboo or Jenkins, for example... there is a Ruby/Chef resource for it, or some way to call it using straight Ruby code.
This is something that "you" and your team will have to figure out a strategy for.
You could untar a file with a tar resource like so:
tar_package 'http://pgfoundry.org/frs/download.php/1446/pgpool-3.4.1.tar.gz' do
  # Fetch and build the tarball under /usr/local; skipped if the pgpool binary already exists
  prefix '/usr/local'
  creates '/usr/local/bin/pgpool'
end
or use the generic Linux command like so:
execute 'extract_some_tar' do
  # Plain tar invocation; the guard skips extraction if a file from the tar already exists
  command 'tar xzvf somefile.tar.gz'
  cwd '/directory/of/tar/here'
  not_if { ::File.exist?("/file/contained/in/tar/here") }
end
You can start up the servers in the way that I wrote in the first block of code (assuming they are services... if you need to restart the actual servers, then you can just run init 6 or something).
This is just an example of the flexibility these utilities offer.
I use Jenkins to do an automated deployment to a Tomcat server weekly, and it is fairly simple to do using "curl" with the Tomcat manager. Since I am only uploading a .war file, it's very straightforward.
But when it comes to a backend console application, does anyone have any idea how to use Jenkins to upload an entire set of folders with files onto a Linux box? The project that I have is built via Ant and has all the folders inside SVN.
A couple things come to mind.
Probably the most straightforward thing to do is use the ant scp task to push the directory / directories up to the server. You'll need the jsch jar on your Ant classpath to make it work, but that's not too bad to deal with. See the Ant docs for the scp task here. If you want to keep your main build script clean, just make another build script that Jenkins can run named 'deploy.xml' or similar. This has the added benefit that you can use it from places other than Jenkins.
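For instance, a Jenkins build step could then call that separate script with something like the following (the file name, target, and property names are purely illustrative):
# Hypothetical invocation of a dedicated deploy build file
ant -f deploy.xml -Ddeploy.host=target.example.com -Ddeploy.dir=/opt/myapp deploy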
Another idea is to check them out directly on the server from SVN. Again, Ant can probably help you with this if you use the sshexec task and run the Subversion checkout command inside of that. sshexec docs here
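Under the hood, that remote-checkout idea boils down to running something like this on the target box, whether you drive it from Ant's sshexec or a plain Jenkins shell step (the URL and path are placeholders):
ssh user@target.example.com "svn export --force http://svn.example.com/myproject/trunk /opt/myapp"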
Finally, Jenkins has a "Publish Over SSH" plugin you might try out. I've not used it personally, but it looks promising! Right over here!
I'm using following command to export my repository to a local path:
svn export --force svn://localhost/repo_name /share/Web/projects/project_name
Is there any reasonably easy way (Linux newbie here) to do the same over the FTP protocol, to export the repository to a remote server?
The last parameter of svn export, AFAIK, has to be a local path, and AFAIK this command does not support giving paths in the form of URLs, like for example:
ftp://user:pass@server/path/
So, I think some script has to be employed here to do the job.
I have asked some people about that, and was advised that the easiest way is to export the repository to a local path, transfer it to the FTP server, and then purge the local path. Unfortunately I failed after the first step (export to a local path! :) So, the follow-up question is whether this can be done on the fly, or whether it really has to be split into two steps: export + FTP transfer?
Someone also advised me to set up a local SVN client on the remote server and do a simple checkout/update from my repository. But that solution is only possible if everything else fails, as I want to extract the pure repository structure, without the .svn files I would get by going that way.
BTW: I'm using a QNAP TS-210, a simple NAS device with a very limited Linux on board. So, many command-line commands, as well as GUI tools, are not available to me.
EDIT: This is the second question in my "chain". Even if you help me to succeed here, I won't be able to automate this job (as I intend to) without your help on the question "SVN: Force svn daemon to run under different user". Can someone also take a look there, please? Thank you!
Well, if you're using Linux, you should be able to mount an ftpfs. I believe there was a module in the Linux kernel for this. Then I think you would also need FUSE.
Basically, if you can mount an ftpfs, you can write your svn export directly to the mounted folder.
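As a sketch, assuming the curlftpfs FUSE client is available on your machine (the credentials, remote path, and mount point below are placeholders):
mkdir -p /mnt/ftp
curlftpfs ftp://user:pass@server/path/ /mnt/ftp
svn export --force svn://localhost/repo_name /mnt/ftp/project_name
fusermount -u /mnt/ftp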
Not sure about FTP, but SSH would be a lot easier and should have better compression. An example of sending your repo over SSH may look like:
svnadmin dump /path/to/repository | ssh -C username@servername 'svnadmin -q load /path/to/repository/on/server'
The URL where I found that info was Martin Ankerl's site.
[update]
Based on the comment from @trejder on the question, to do an export over ssh, my recommendation would be as follows:
Do an svn export to a folder locally, then use the following command:
cd && tar czvf - src | ssh example.com 'tar xzf -'
where src is the folder you exported to, and example.com is the server.
This will take the files in the source folder, tar and gzip them, and send them over ssh; on the remote side, it extracts the files directly onto the machine.
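If you want the export, the transfer, and the cleanup in a single step, you could chain them; a rough sketch with placeholder paths and host:
svn export --force svn://localhost/repo_name /tmp/project_export \
  && tar -C /tmp/project_export -czf - . | ssh user@example.com 'tar -xzf - -C /remote/target/dir' \
  && rm -rf /tmp/project_export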
I wrote this a while back - maybe it would be of some use here: exup
I have the need to run a PHP + SQLite (or MySQL...) website from a CD.
For Windows I found a nice solution (XAMPP), but I need to make the CD compatible with OS X and Linux too!
So I basically want to start a webserver (without having to install it) and then automagically open a browser pointing to the local copy of the website.
Thank you in advance
There is XAMPP for Linux, but the best way to run it from a CD is to create an executable shell script that deploys it, sets/unsets parameters, launches the software, and deals with configuration.
Although you could also use HTML5 alternatives: JavaScript + SQLite.
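As a rough illustration of such a launcher script, here is a minimal sketch that assumes a portable PHP binary is bundled on the CD (the ./php/ and ./site/ paths are made-up placeholders) and uses PHP's built-in web server rather than a full XAMPP stack:
#!/bin/sh
# Launcher sketch: start PHP's built-in server from the CD and open a browser.
# Assumes a bundled PHP binary at ./php/php and the site under ./site/ (placeholders).
DIR="$(cd "$(dirname "$0")" && pwd)"
"$DIR/php/php" -S 127.0.0.1:8080 -t "$DIR/site" &
SERVER_PID=$!
sleep 2
# Open the default browser on Linux (xdg-open) or macOS (open)
if command -v xdg-open >/dev/null 2>&1; then
    xdg-open "http://127.0.0.1:8080/"
else
    open "http://127.0.0.1:8080/"
fi
wait "$SERVER_PID"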