I need to dynamically modify the notification e-mail recipients based on the build, so I'm using a Groovy script. I want this script to be available to all jobs, so I want it to reside on the Jenkins file system and not within each project. It can be used either in the recipients field (using ${SCRIPT,...}) or in the pre-send script. A short (fixed) script that evaluates the master script is also fine, as long as it is the same for all projects.
You should try the Config File Provider plugin. It works together with the Credentials configuration of Jenkins.
Go to Manage Jenkins > Managed files
Click Add a new Config
Select Groovy file
Enter the contents of your script
Your script will now be saved centrally on Jenkins, available to master/slave nodes.
In your Job Configuration:
Under Build Environment
Check Provide Configuration file
Select your configured Groovy file from the dropdown
Set Variable with a name, for example: myscript
Now, anywhere in your job, you can say ${myscript} and it will refer to the absolute location of the file on the filesystem (it will be somewhere in the Jenkins directory).
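If you go the pre-send script route mentioned in the question, the same idea works with a short fixed loader in the email-ext Pre-send Script field. This is only a sketch: the path is an assumption (it could just as well be wherever the managed file ends up), and it relies on the pre-send script exposing build, msg, logger and cancel, as email-ext does.

```groovy
// A minimal fixed loader for the Pre-send Script field: delegate everything to
// one shared script kept on the master. Adjust the path to wherever you store it.
def shared = new File('/var/lib/jenkins/scripts/presend-recipients.groovy')
if (shared.exists()) {
    // Script.evaluate() reuses this script's binding, so the shared script sees
    // the same objects (build, msg, logger, cancel) as the pre-send script itself.
    evaluate(shared)
} else {
    logger.println("Shared pre-send script not found: ${shared}")
}
```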
My impression is that you would probably want to switch completely to Jenkins Pipelines, where the entire job is a Groovy file (Jenkinsfile) in the root of the repository.
Email-Ext already supports this, even if the documentation is a bit sparse:
https://jenkins.io/doc/pipeline/steps/email-ext/
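As a rough sketch (the addresses, the branch check, and the stage contents are placeholders), recipient logic then becomes plain Groovy around the emailext step in the Jenkinsfile:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'build goes here'
            }
        }
    }
    post {
        always {
            script {
                // Compute recipients per build: here a simple branch check
                // (env.BRANCH_NAME assumes a multibranch job), but any Groovy
                // logic or shared-library call works.
                def recipients = env.BRANCH_NAME == 'master' ? 'team@example.com' : 'devs@example.com'
                emailext to: recipients,
                         subject: "${env.JOB_NAME} #${env.BUILD_NUMBER}: ${currentBuild.currentResult}",
                         body: "See ${env.BUILD_URL}"
            }
        }
    }
}
```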
I have a Jenkins job in which, amongst other things, I want to dynamically create a .foo and a .bar file in my Jenkins workspace home directory as the job is running.
How can I do that with bash as a part of the job?
The job will run after a Docker container has been created with the Docker Plugin.
So the order of events I want is:
Job starts:
Docker container gets created
create a .foo file (with foobarbaz as its content)
create a .bar file (with foobarbaz as its content)
How can I achieve this?
One simple solution would be to associate your Jenkins job with a source-controlled repository URL which already includes the files with the right content.
Each job execution would then create a workspace populated with that repo content and launch your Docker container.
If not, you would need to add a script step, for example a Groovy one like the sketch below.
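For instance, a system Groovy build step (from the Groovy plugin) along these lines could write the two files. The step choice and the use of build.workspace are assumptions about a freestyle job; using FilePath keeps it working when the build runs on an agent or inside the Docker container.

```groovy
// A sketch for an "Execute system Groovy script" build step: write the two
// dot-files into the job's workspace with the content from the question.
// build.workspace is a FilePath, so the write happens on whichever node
// (or container) the build is actually running on.
def ws = build.workspace
ws.child('.foo').write('foobarbaz', 'UTF-8')
ws.child('.bar').write('foobarbaz', 'UTF-8')
```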
I am trying to debug a shell script that is executed via a Jenkins job. The first thing the script does is include another script that is in a completely different repo. My instinct is telling me that the user that Jenkins is executing the script from has access to the directory for the other repo through $PATH or some other similar mechanism, but nothing I’m seeing indicates this.
I’ve looked over the variables in http://$host/systemInfo, tried logging on to the Linux box, switched to various users and searched through the command history for each, looked at the $PATH variable for each, and even tried executing a test shell script with the same include as different users. I'm still not seeing anything to indicate how Jenkins is able to include a file from a different repo, and I have not been able to get the include to work in my test script.
My main questions are:
How can I determine what user Jenkins is executing the original shell script as? I would assume user 'jenkins' but I'm not able to get the include to work in my test script executing as this user.
How is Jenkins able to include a script from a different repo?
I'm sure I'm just running into some fundamental Jenkins ignorance on my part but not finding answers. Thanks in advance for any insight.
Finally found the answer and it seems really obvious now that I see it. The Jenkins server that the job runs from has a PATH environment variable defined in the server config in the Jenkins interface. This PATH points to the directory containing the external script.
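If you want to confirm this without clicking through the UI, a short script-console sketch can show both pieces of information. It assumes the PATH was set under the global "Environment variables" node property, which is where settings made in the Jenkins interface end up.

```groovy
// Run from Manage Jenkins > Script Console.
import jenkins.model.Jenkins
import hudson.slaves.EnvironmentVariablesNodeProperty

// The OS user the Jenkins master process runs as (often 'jenkins').
println "Master runs as: ${System.getProperty('user.name')}"

// Globally configured environment variables, including any PATH override.
Jenkins.instance.globalNodeProperties
       .getAll(EnvironmentVariablesNodeProperty)
       .each { prop -> prop.envVars.each { k, v -> println "$k = $v" } }
```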
I've just started to get to grips with Jenkins. It currently performs the following tasks:
Pulls the latest codebase from git
Uploads the codebase via sftp to my environment
Sends a notification email to the testers and the PM to inform them of a completed deployment.
However for it to be truly useful I need it to perform two more tasks:
Delete the robots.txt and .htaccess files which exist in the Git repo and replace them with predefined versions which are specific to the server
Go through all the code and remove specific code blocks (perhaps anything in between marker comments, e.g. /** Dev only **/ code to be removed goes here /** Dev only **/, or something like that).
Are there any plugins which can accomplish these things, or would I have to read up on writing Groovy scripts for this sort of thing (I don't know anything about those yet)?
On a related note: I'd also love it if it could combine kit and SASS files. I can't see a plugin for these things, but I assume I can just install Compass on my build server and then run it via the command line as part of the build. Is that correct?
Instead of putting your build tasks directly into the Jenkins job, I recommend writing a build script to accomplish your publishing/deployment tasks.
Jenkins is great for having a single point of automation that is easy to run, can publish build results, and can track successes and failures. In my experience though, you're better off not putting your individual tasks and configuration steps into the Jenkins job configuration. At some point, you'll want to be able to run this job without Jenkins, either because you want to test local changes, or you want to handle multiple jobs and trying to keep job configurations in sync is not fun, or because you're moving to another build/deployment system. Also, putting the build script into a file allows you to put it into your source control system and track changes.
My advice: choose a scripting language (Python, Ruby, Perl, whatever you're comfortable with) or build system (SCons and Rake are options) and write a build script. In Python, Ruby, and Perl, it's easy to manipulate files (#1), and all have a wide choice of templating systems that will accomplish #2. Then the Jenkins job becomes running your build script on the command line (or executing it through a language-specific builder). And the build script can include running any of the tasks that you decide to put in your build (Compass, etc.).
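To make that concrete, here is a rough sketch of what such a build script might look like for tasks #1 and #2, written in Groovy simply because it is close at hand on a Jenkins box (a Python/Ruby/Perl version would be structurally identical). The config/server directory and the .php extension filter are assumptions about your layout.

```groovy
// deploy-prep.groovy -- a sketch, run as e.g. `groovy deploy-prep.groovy $WORKSPACE`
def workspace = new File(args ? args[0] : '.')

// Task 1: replace the repo's robots.txt / .htaccess with server-specific copies
// kept elsewhere in the repo (config/server/ is an assumed location).
['robots.txt', '.htaccess'].each { name ->
    def serverCopy = new File(workspace, "config/server/${name}")
    if (serverCopy.exists()) {
        new File(workspace, name).text = serverCopy.text
    }
}

// Task 2: strip everything between paired /** Dev only **/ markers.
def devBlock = ~/(?s)\/\*\* Dev only \*\*\/.*?\/\*\* Dev only \*\*\//
workspace.eachFileRecurse { f ->
    if (f.isFile() && f.name.endsWith('.php')) {   // adjust the extension filter to your code base
        f.text = f.text.replaceAll(devBlock, '')
    }
}
```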
I understand how to use the EnvInject plugin to compute variables from string parameters (the parameter name simply is an unbound variable in the groovy script performing the injection).
I want to use a Run Parameter - the parameter that contains the latest successful build of a project (or whichever build the user selects) - but it appears that none of those are available at the time the EnvInject plugin runs.
I guess I need to write Groovy code to inspect the desired project myself and get the right build name directly from the model - except I can't do that, since the model lives on the master and the EnvInject plugin runs on the slave....
EnvInject can run at different times...
At node/master level (on startup, and available always).
At job startup, before SCM step (configured at the top of job configuration).
After SCM step (configured under "build environment" section).
As a build step.
You should use either the last or the second-last option if you want build parameters to be available.
Lastly, there is another option to try: the pre-scm-buildstep plugin. It allows you to run any build step (including EnvInject) just before the SCM checkout. I've done a quick test, and the job parameters and their values are available at this stage.
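For the "inspect the desired project from the model" part, the Groovy that EnvInject evaluates could look roughly like this. The job and variable names are placeholders, and it assumes the script is evaluated where the Jenkins model is reachable (e.g. via the build-step option above or the pre-scm-buildstep route).

```groovy
// EnvInject "Evaluated Groovy script": whatever map the script returns is
// injected as environment variables for the rest of the build.
import jenkins.model.Jenkins

def job = Jenkins.instance.getItemByFullName('UpstreamJob')   // placeholder job name
def run = job?.lastSuccessfulBuild                            // or pick the build the user selected

return [
    UPSTREAM_BUILD_NUMBER: run ? run.number.toString() : '',
    UPSTREAM_BUILD_NAME  : run ? run.displayName       : ''
]
```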
How can the same Jenkins Groovy Postbuild Plugin step be added to all jobs? We have 50+ jobs, so it is too hard to copy-paste the link to the desired Groovy code into every job.
I usually do similar mass changes by updating the config.xml of the affected jobs. Every good editor should have a search-and-replace function that works across files. Use the following workflow:
shut down Jenkins
update job config.xml files
start up Jenkins
There are other possible workflows like the following
1. update job config.xml files
2. reload config
However, with the second option I don't know what the effect is if a job is running while you reload the configuration.
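For step 2 of the second workflow, the reload does not require a restart: it can be triggered from Manage Jenkins > Reload Configuration from Disk, or from the script console with something like this:

```groovy
// Script console equivalent of "Reload Configuration from Disk":
// re-reads all job config.xml files from JENKINS_HOME.
import jenkins.model.Jenkins
Jenkins.instance.reload()
```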
The following script requires the Jenkins Groovy and Groovy Postbuild plugins to work. It will add a Groovy Postbuild publisher to all jobs:
https://gist.github.com/genericpenguin/9ac1b84ed7a145b3b6dd
Change the view from 'All' to some other view for testing. We use it for mass changes and it works fine. The only caveat is that the postbuild Groovy script must reside on the Jenkins master. This can easily be changed in the script if it is short (like a script loader).
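If you want to sanity-check the result afterwards, a small read-only script-console sketch like this lists which jobs in a view already carry a Groovy Postbuild publisher (the 'All' view name and the class-name check are assumptions):

```groovy
import hudson.model.AbstractProject
import jenkins.model.Jenkins

// For every freestyle-style job in the view, report whether a Groovy Postbuild
// publisher is already configured.
Jenkins.instance.getView('All').items
       .findAll { it instanceof AbstractProject }
       .each { job ->
           def hasIt = job.publishersList.any { it.class.name.contains('GroovyPostbuild') }
           println "${job.fullName}: ${hasIt ? 'has' : 'does not have'} a Groovy Postbuild publisher"
       }
```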