Clean workspace in Jenkins 2

I have an issue in Jenkins 2 where a build fails and only cleaning the workspace fixes the problem. In Jenkins 1 I could clean the workspace manually, but in Jenkins 2 I don't see that option.
I've been deleting the remote branches and recreating them to clean the workspace, but I can't do that for master.
How can I clean the workspace without deleting the remote branch?

I found that the problem was a broken symlink in the node_modules directory, so I used the following task to clean it up:
task cleanBrokenSymlinksInNodeModules {
    doLast {
        // File.exists() follows symlinks, so it returns false for a broken
        // link; delete() removes the link itself.
        file("node_modules").eachFileRecurse {
            if (!it.exists()) { it.delete() }
        }
    }
}
I copied this solution from GRADLE-1843 (the deprecated << task syntax is replaced with doLast here).
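On a CI agent where running Gradle just for the cleanup is overkill, GNU find can do the same job (a sketch, assuming a GNU userland):

```shell
# Delete broken symlinks under node_modules.
# -xtype l matches symlinks whose target no longer resolves (GNU find).
find node_modules -xtype l -delete
```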

Related

How to prevent `anchor build` from updating Cargo.lock automatically?

I was following the tutorial with its repo. After running anchor build, I noticed a few files had been added or updated, so I could not pull the repo anymore. I tried to revert the changes by deleting the target folder and removing package.json, yarn.lock, and node_modules.
Then I found that whenever I undo the changes in Cargo.lock (the one at the repo root), anchor build adds them right back, as if it refuses to revert to the repo's committed version!
Why? How can I prevent this?

NPM: How to build a project with git-based dependencies without having to call 'git' command?

I have an NPM project that uses a git dependency
{
  "repository": {
    "type": "git",
    "url": "https://bitbucket.org/my-private-group"
  },
  "dependencies": {
    "my-dependency": "bitbucket:group/lib#version"
  }
}
Now I want to build this project in CI using Docker with node installed.
Problem: npm install tries to call git and fails because git is not there. And even if I install git, it still requires authentication because it is a private repository.
At the moment I see the following options:
Install git in the Docker image and add an SSH key to be able to download the source code.
Pack the dependency's repository into the Docker image and use npm link. But this option still requires knowing how the dependencies are set up in package.json, which makes it complicated.
Set up our own npm registry, publish the artifact there, and stop using git dependencies. This option is unfortunately not achievable in my case.
Question: What is the best way of handling git dependencies in CI? Are there any other options apart from the ones listed? What is the best practice?
Pulling from git without git installed is kinda hard, and installing git is easy. Just list it as a dependency for your project: this project requires Windows/Linux/macOS, Node.js, and git.
You're allowing people to pull from a private repo... at that moment they have access to your source code, so much of the point of keeping the repo private is lost anyway. Anyone who wants to duplicate your code can do so easily the moment it's on their computer, even if it's obfuscated.
So I would take a step back and ask yourself why the repo is private. Is it code that is only distributed under an NDA? If so, you could consider working with SSH key files to log in.
Or you could host your files on a privately hosted Gogs server, where you whitelist the IPs that can pull from the Gogs repository in your firewall/nginx router.
If you want anyone to be able to use your repository in the final distribution of your project, you're better off lifting the private setting on your repository. You might even get some free help fixing bugs.
I believe Bitbucket has something called deployment keys, which give read-only access to repositories. I am using deployment keys to build my private projects and their private dependencies.
The private key is stored in the CI server (Jenkins) and is injected into the appropriate project during its build process.
Another way is to use deployment keys with the private key stored in the project itself, which can then be used during the build process.
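As a sketch of that second approach (the ./deploy_key file name is an assumption, not a Bitbucket convention), git can be pointed at the key through GIT_SSH_COMMAND so that npm's git clones authenticate with it:

```shell
# Hypothetical path to the deploy key stored in (or injected into) the project
export GIT_SSH_COMMAND='ssh -i ./deploy_key -o IdentitiesOnly=yes'
# npm shells out to git, which now uses the deploy key for SSH clones:
# npm install
```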
Update
Assuming a Jenkins Pipeline, the following is an example of how to access SSH keys set in Jenkins using the Credentials Binding Plugin:
stage('Sample') {
    agent {
        docker {
            image 'node:12'
        }
    }
    steps {
        withCredentials([
            sshUserPrivateKey(
                credentialsId: 'ssh-key-name-here',
                keyFileVariable: 'GIT_DEPLOY_KEY_FILE'
            )
        ]) {
            // Single quotes: let the shell expand the secret instead of
            // interpolating it into the Groovy script
            sh 'cat $GIT_DEPLOY_KEY_FILE'
        }
    }
}
Update Sept 16th 2019
I recently came across the Build Enhancements made in Docker 18.09.
I have yet to explore them, but I think they can be used to solve the credential problem.
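For what it's worth, BuildKit's ssh mount looks like the relevant piece: it forwards the host's ssh-agent into a single RUN step, so the key never ends up in an image layer. A sketch, assuming a Bitbucket host and Node 12 (untested):

```dockerfile
# syntax=docker/dockerfile:1
FROM node:12
WORKDIR /app
# Trust the git host before cloning over SSH
RUN mkdir -p -m 0700 ~/.ssh && ssh-keyscan bitbucket.org >> ~/.ssh/known_hosts
COPY package.json ./
# Build with: docker build --ssh default .
RUN --mount=type=ssh npm install
```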

How to get the output build directory in VSTS after a yarn build

I created a build definition using the Yarn custom extension.
The build works fine, but apparently the yarn build task does not generate any output.
What am I missing here to be able to generate a build output that I can deploy to Azure?
Update 1:
I was able to configure the copy, but it's copying the entire folder, node_modules included, to the drop. It should contain only the build folder.
The "Publish Artifact" task publishes the build artifacts that exist on the agent in folder "a".
If you don't copy the output to folder "a", the publish task will not publish anything and you will get the message:
Directory '...\a' is empty. Nothing will be added to build...
Before the Publish Artifact task, add a Copy Files task that copies the Yarn output to folder "a" on the agent; after that, the publish will succeed.
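In YAML form, the pair of tasks might look like this (the build output folder name is an assumption; point SourceFolder at whatever yarn build actually emits):

```yaml
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/build'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```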

Copy files to directory outside project with Grunt

I'm looking for a more efficient way to deploy my WordPress themes. Right now I create the theme and when finished I copy the contents into a new folder without all my Node, Grunt and other dev files. My dev environment runs on DesktopServer, which has an auto-deploy option, but this also copies unwanted dev files.
Could I use Grunt to create a task that when fired copies specific files and folders from /themes/dev-theme/ to /themes/production-ready-theme/ ? This way I have a clean theme that can easily be zipped or uploaded to the production server.
Update: I just thought of a possible solution: running grunt-contrib-copy from my themes directory. This Grunt module would let me control which files to copy. But perhaps there is a cleaner or more efficient method to accomplish this task.
By using the grunt-shell module you could simply add this to your Gruntfile:
grunt.initConfig({
    shell: {
        moveThemes: {
            command: 'mv /themes/dev-theme/* /themes/production-ready-theme/'
        }
    }
});
The grunt-contrib-copy module does have the ability to copy files up the directory tree after all. I just tried it out using these settings in my Gruntfile and it worked.
grunt-shell gets the job done, but if you already have grunt-contrib-copy installed in your project you could just use this:
copy: {
    main: {
        src: ['css/*', 'img/*', 'icons/*', 'js/*', 'lang/*', 'lib/*', '*.php', '*.css'],
        dest: '../production-theme/'
    }
},

Copy userContent folder using the Jenkins workflow plugin

I am trying to build a new pipeline and I need to copy the files that I have in the userContent folder of Jenkins to the workspace.
How can I do this?
Probably the cleanest approach is to install the Git userContent plugin. At that point you can probably do something like (untested):
node {
    git "${env.JENKINS_URL}userContent.git"
}
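If installing a plugin is not an option, the controller also serves that folder over HTTP, so a plain fetch can work too (a sketch; settings.xml is a placeholder file name, and anonymous read access to the controller is assumed):

```groovy
node {
    // Hypothetical file name; userContent/ is served under the Jenkins root URL
    sh "curl -fsSL -o settings.xml '${env.JENKINS_URL}userContent/settings.xml'"
}
```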
