Using Jenkins to modify files - Groovy

I've just started to get to grips with Jenkins. It currently performs the following tasks:
Pulls the latest codebase from git
Uploads the codebase via sftp to my environment
Sends a notification email to the testers and the PM to inform them of a completed deployment.
However, for it to be truly useful, I need it to perform two more tasks:
Delete the robots.txt and .htaccess files which exist in the git repo and replace them with predefined versions specific to the server
Go through all the code and remove specific code blocks (perhaps anything in between comments, e.g. /** Dev only **/ Code to be removed goes here /** Dev only **/ or something like that).
Are there any plugins which can accomplish these things, or would I have to read up on writing Groovy scripts for this sort of thing (I don't know anything about those yet)?
On a related note: I'd also love it if it could combine kit and SASS files. I can't see a plugin for those, but I assume I can just install Compass on my build server and then run it via the command line in the build process. Is that correct?

Instead of putting your build tasks directly into the Jenkins job, I recommend writing a build script to accomplish your publishing/deployment tasks.
Jenkins is great for having a single point of automation that is easy to run, can publish build results, and can track successes and failures. In my experience, though, you're better off not putting your individual tasks and configuration steps into the Jenkins job configuration. At some point you'll want to be able to run this job without Jenkins: because you want to test local changes, because you're handling multiple jobs and keeping their configurations in sync is no fun, or because you're moving to another build/deployment system. Also, putting the build steps into a script file lets you keep it in your source control system and track changes.
My advice: choose a scripting language (Python, Ruby, Perl, whatever you're comfortable with) or build system (SCons and Rake are options) and write a build script. In Python, Ruby, and Perl, it's easy to manipulate files (#1), and all have a wide choice of templating systems that will accomplish #2. Then the Jenkins job becomes running your build script on the command line (or executing it through a language-specific builder). And the build script can include running any of the tasks that you decide to put in your build (Compass, etc.).
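As a rough sketch (not a definitive implementation), a build script along these lines could handle both of those tasks in Python. The checkout path, the overrides directory holding the server-specific robots.txt and .htaccess, and the file extensions scanned are all assumptions; only the /** Dev only **/ marker comes from the question:

#!/usr/bin/env python
# Minimal build-script sketch for tasks #1 and #2 above.
import os
import re
import shutil

CHECKOUT = "build/checkout"          # where Jenkins exported the codebase (assumption)
OVERRIDES = "overrides/production"   # server-specific robots.txt/.htaccess (assumption)

# Task 1: swap in the server-specific robots.txt and .htaccess.
for name in ("robots.txt", ".htaccess"):
    target = os.path.join(CHECKOUT, name)
    if os.path.exists(target):
        os.remove(target)
    shutil.copy(os.path.join(OVERRIDES, name), target)

# Task 2: strip everything between paired /** Dev only **/ markers.
dev_block = re.compile(r"/\*\* Dev only \*\*/.*?/\*\* Dev only \*\*/", re.DOTALL)
for root, dirs, files in os.walk(CHECKOUT):
    for fname in files:
        if not fname.endswith((".php", ".js")):   # extensions to scan: an assumption
            continue
        path = os.path.join(root, fname)
        with open(path) as fh:
            source = fh.read()
        stripped = dev_block.sub("", source)
        if stripped != source:
            with open(path, "w") as fh:
                fh.write(stripped)

The same script is also a natural place to shell out to Compass for the SASS compilation mentioned at the end of the question.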

Related

Bamboo plan: Compress the artifact after build and uncompress after deployment to server

This is my first time both learning and implementing automated CI/CD pipelines, in Atlassian Bamboo. I have a NodeJS project whose build and deployment plan I configured after much research on the net.
During deployment, I observed that it takes a lot of time because there are many files to transfer, probably due to node_modules. I would like to compress the artifact generated by the build steps and decompress it on the server side once the transfer is complete.
I tried to find a ZIP task among the tool tasks, but it is not there. My question is: is it possible some other way? Does doing it via the command line work, and is it feasible?
I have a little experience with Linux commands.
Any help would be highly appreciated.
In my company we use an Ant build, including Ivy, to prepare, zip, and publish our projects as artifacts. In the deployment we use an SCP task to copy the artifact onto our server and an SSH task to unzip it.
So our whole build part is implemented in Ant, and the only thing our Bamboo build does is check out a git repository and run the Ant script.
That workflow is used for a lot of different projects, including NodeJS, Python, Java, C++, and pure text-file setups, and it works really well.
But a normal script task for zipping should also do the job, and depending on the scale of your projects, Ant may be overkill.
I think it's possible to use Windows/Linux commands to achieve your requirement. You would need to write a task to compress the files; you can use the shell plugin or any other suitable plugin. Once the artifact is sent to the server, you would need a polling batch program to unzip the artifact at the server end.
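For instance, the compress step can be a script task of only a few lines; here is a hedged sketch in Python (the dist/ directory and the artifact name are assumptions):

#!/usr/bin/env python
# Bamboo script-task sketch: zip the build output before transfer.
import shutil

# Produces artifact.zip containing everything under dist/ (node_modules included),
# so only a single file has to cross the wire.
shutil.make_archive("artifact", "zip", root_dir="dist")

On the server side, an SSH task running something like unzip -o artifact.zip -d /var/www/app (the destination path is an assumption) restores the tree once the transfer completes.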

Hudson: Scheduling a build without a tag and generating a report

How can I schedule a build without a tag on Windows, Linux, and WCE in Hudson, using a shell script, and generate a report that will be sent to a specified server?
The conditions are:
1. How can I create the build without creating a new tag?
2. How is it possible to execute .sh scripts on Windows and WCE (Windows Mobile)? Is it simply by going through Cygwin? Moreover, does having a cross-platform (3 platforms) build mean that I must run the build 3 times?
3. How to generate a report and save it in a directory of a server that I'm authorized to access?
I know that I asked many questions at once; it's because this is my first use of Hudson and these are the kinds of details I'm unsure about. Moreover, I don't want to make a mistake by creating new tags during my tests. The 1st and 3rd questions are the most important; if anyone gives me the right answer to them, I'll accept it as the right answer.
Thanks a lot.
First, people nowadays mostly use Jenkins instead of Hudson (open source, better support).
A build can be started manually in Hudson/Jenkins; just click the green arrow. It will create a new build but won't change your repository (unless the last step of your build is creating a tag; in that case, just remove that step for testing).
Usually, .sh scripts run in shell executables (ash, sh, bash, csh...) which are not supported by the native shell on Windows. You'll have to go through Cygwin or have a platform-specific build command.
This one is not quite clear to me. If you use Jenkins to set up a matrix build (with the matrix axis being your target platform), you'll automatically have a nice report in Jenkins itself (the status of each build). You can keep artifacts (use the post-build action: archive the artifacts) or use another plugin to publish the files you like (example: FTP reporting).
Sorry for not being able to be more precise; that's as far as I understand your questions.

How to monitor a remote read-only Subversion repo for commit changes?

I have a couple of dependencies in my Java project on 3rd-party libs, and some of them are undergoing development that I would like to track.
I would like to be notified (by email, desktop popup, or otherwise) when changes are committed to the remote svn repo so I can examine their impact, etc.
I looked at svnmailer, but it would seem to require the repo to be local (I think??).
I also found some Windows tools that do the job, but I am running a Linux desktop, so no go there.
Worst case, I can write a cron script to poll for remote changes using the command-line tools, but I would prefer some existing tool.
Sounds like a good use for a continuous integration server. Tools like CruiseControl or Hudson are designed for this use case: the whole point of them is to check your source control regularly, retrieve any changes, build the project, and notify someone. In this case, it sounds like you don't even need to build the project, just send an email.
If you don't already have a CI server, this might seem like a little overkill, but I bet once you've got one set up you'll find yourself using it again.
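If you do end up with the cron-script fallback mentioned in the question, a minimal polling sketch in Python could look like this; the repository URL, state file, and mail addresses are assumptions:

#!/usr/bin/env python
# Cron-driven sketch: poll a remote svn repo and send mail when HEAD moves.
import os
import re
import smtplib
import subprocess
from email.mime.text import MIMEText

REPO = "http://svn.example.org/thirdparty/trunk"    # assumption
STATE = os.path.expanduser("~/.svnwatch-lastrev")   # assumption

# svn info works against a remote URL, so no local mirror is needed.
info = subprocess.check_output(["svn", "info", REPO]).decode()
head = re.search(r"^Revision: (\d+)$", info, re.MULTILINE).group(1)

last = open(STATE).read().strip() if os.path.exists(STATE) else None
if head != last:
    start = int(last) + 1 if last else int(head)
    log = subprocess.check_output(
        ["svn", "log", "-r", "%d:%s" % (start, head), REPO]).decode()
    msg = MIMEText(log)
    msg["Subject"] = "New svn commits (now at r%s)" % head
    msg["From"] = "svnwatch@example.org"             # assumption
    msg["To"] = "you@example.org"                    # assumption
    smtplib.SMTP("localhost").send_message(msg)      # local MTA assumed
    open(STATE, "w").write(head)

Run from cron every few minutes, that covers the notification half of the use case without building anything.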

Deployment after CI builds

I'm pretty new to CI, so bear with me here. I have just set up an instance of TeamCity on a local machine, and I can clearly see the benefits.
The one thing we want to understand is how we can manage the deployment aspect of CI. What we really want to achieve are two builds:
1) We check in to our source repository, and the CI server notices the change and compiles the code, runs tests, etc.
2) We manually trigger a build that compiles the code, copies it to a remote server, and updates its IIS mappings.
Now the first build is pretty much wrapped up with TeamCity. But I assume that the deployment aspect is going to involve some scripting (NAnt, MSBuild, Rake, etc.); is this correct?
If this is the case, I can see that transferring files from the build machine to a remote server will be OK, but will we be able to update IIS mappings without being on the same network? For that matter, where is THE correct place to deploy a CI server; should it live on the same network as the apps we deploy?
Finally, we have been (rather unorthodoxly) using IronRuby to run Rake scripts as our build runner. This is simply because we like Rake, but if we were to look at NAnt/MSBuild, do they have any baked-in tasks that would simplify what we are trying to achieve?
Cheers, Chris.
We use MSBuild exclusively, but that's just a choice; I am sure NAnt and the others do things just as well. We only publish to a dev environment (for dev testing) and a stage environment (where QA actually tests). I would not suggest putting the production push on this, as the temptation to force builds might be too great for some people.
We use some of the MSBuild Community Tasks.
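As a heavily hedged sketch of what the manually triggered deploy build (#2 above) might run, assuming the build machine can reach the web server over the network; the solution path, UNC share, and the appcmd line are all assumptions:

#!/usr/bin/env python
# Deploy-build sketch: compile, copy to the web server, note the IIS step.
import subprocess

# Compile the solution in Release mode (msbuild assumed to be on PATH).
subprocess.check_call(["msbuild", r"src\MySite.sln", "/t:Build", "/p:Configuration=Release"])

# Mirror the output to the web server's share (same network assumed).
rc = subprocess.call(["robocopy", r"src\MySite\bin\Release",
                      r"\\webserver\wwwroot\mysite", "/MIR"])
if rc >= 8:  # robocopy exit codes below 8 indicate success
    raise SystemExit("robocopy failed with code %d" % rc)

# Updating IIS mappings requires appcmd on the web server itself, e.g. through
# a remote shell (site and path here are hypothetical):
#   appcmd set vdir "Default Web Site/mysite" /physicalPath:C:\wwwroot\mysite

That last step also bears on the network question: the IIS update has to execute on the web server, which is one reason CI servers usually live on the same network as the machines they deploy to.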

Deploying custom software on Linux?

I write company-internal software in PHP and C++.
What are the best methods of deploying this type of software to Linux machines? Currently we use svn export; are there any other methods?
We use checkinstall. Just write a simple Makefile that copies the files to the target directories on the target machine, then run checkinstall to create an RPM, DEB, or TGZ package, which you can later easily install with the distribution's package management tools.
You can even add shell scripts that are executed before and after the files are copied, so you can do some pre- and post-processing like adding user accounts, crontab entries, etc.
Once you get more advanced, you can add dependencies to these packages so a single command could also pull in and install PHP, MySQL, Apache, GCC libraries, and even required PHP or Apache modules or some external C++ libs you might need.
I think it depends on what you mean by deploy. Typically, a deploy process for web projects involves a configuration scripting step in which you can take the same deploy package and tailor it to specific servers (staging, development, production) by altering simple configuration directives.
In my experience with Linux servers, these systems are often custom-built and tend to use rsync rather than svn export and/or scp alone.
A script might be executed from the command line like so:
$ deploy-site --package=app \
--platform=dev \
--title="Revsion 1.2"
Internally, the system would take whatever was in trunk for the given package from SVN (I'm sure you could adapt this really easily for git too) and generate a new unique tag with the log entry "deploying Revision 1.2".
Then it would patch any configuration scripts with the appropriate changes (URLs, hosts, database passwords, etc.) before rsyncing it to the appropriate destination.
If there are issues with the deployment, it's as easy as running the same command again only this time using one of your auto-generated tags from an earlier deploy:
$ deploy-site --package=app \
--platform=dev \
--title="Reverting to Revision 1.1" \
--tag=20090714200154
If you also have to compile on the other end, you could include a Makefile as part of your configuration patching, and then execute a command via ssh to compile the recently deployed code once the rsync process completes.
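A sketch of what a deploy-site-style wrapper might do internally, in Python; the SVN layout, tag format, and rsync destination are all assumptions:

#!/usr/bin/env python
# Sketch of a deploy-site-style wrapper: tag, export, patch, rsync.
import subprocess
import time

SVN = "http://svn.example.org/app"                     # assumption
DEST = {"dev": "deploy@dev.example.org:/var/www/app"}  # assumption

def deploy(platform, title, tag=None):
    if tag is None:
        # New deploys get a unique timestamp tag, like 20090714200154 above.
        tag = time.strftime("%Y%m%d%H%M%S")
        subprocess.check_call(["svn", "copy", SVN + "/trunk",
                               "%s/tags/%s" % (SVN, tag), "-m", "deploying %s" % title])
    # Export the tagged tree, patch its config, then push it out.
    subprocess.check_call(["svn", "export", "--force",
                           "%s/tags/%s" % (SVN, tag), "build"])
    # ... patch urls/hosts/database passwords in build/ here ...
    subprocess.check_call(["rsync", "-az", "--delete", "build/", DEST[platform]])

deploy("dev", "Revision 1.2")

Passing an existing tag instead of None gives you the revert behaviour from the second command line above.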
There is, in my experience, a tradeoff between security and ease of deployment.
For my deployments, I've never had a problem using scp to move the files from one machine to another. You can write a simple BASH script to take a list of machines (from a text file or STDIN) and push a given directory/application to a given directory on all of the machines. Say you hypothetically pushed to a bin directory; the end user would never know the difference.
The only problem with that arises when you have multiple architectures and OSes, where it has to be compiled on each one individually. In that case, you could just write a script (the first example that pops into my mind is Net::SSH from Ruby) to take that list of servers, cd to the given directory, and run the compilation script. However, if all machines use the same architecture and configuration, you can hypothetically just compile once on the machine you are using to distribute.
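A minimal sketch of that kind of push script, in Python for consistency with the sketches above (hosts.txt and the target directory are assumptions):

#!/usr/bin/env python
# Sketch: push a directory to a list of machines with scp.
import subprocess
import sys

TARGET_DIR = "/usr/local/bin"   # hypothetical bin directory on the machines
app = sys.argv[1]               # the directory/application to push

# One host per line in hosts.txt; key-based ssh auth assumed.
hosts = [line.strip() for line in open("hosts.txt") if line.strip()]
for host in hosts:
    subprocess.check_call(["scp", "-r", app, "%s:%s" % (host, TARGET_DIR)])

For the multi-architecture case, the loop body would instead ssh in and run the compilation script on each machine, as described above.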
