Copy userContent folder using the Jenkins workflow plugin - groovy

I am trying to build a new pipeline and I need to copy the files that I have in the userContent folder of Jenkins to the workspace.
How can I do this?

Probably the cleanest approach is to install the Git userContent plugin. At that point you can probably do something like (untested):
node {
    git "${env.JENKINS_URL}userContent.git"
}
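If the Git userContent plugin route doesn't work out, and the job runs on the Jenkins controller (or the Jenkins home directory is otherwise reachable from the agent's filesystem), a plain shell step can copy the folder directly. A minimal sketch, assuming the default Jenkins home layout:

node {
    // userContent lives under $JENKINS_HOME on the controller, so this
    // only works when the workspace can see that directory
    sh 'cp -r "$JENKINS_HOME/userContent/." .'
}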

Related

Add existing NodeJS (node, Express, mongodb, ...) project to an existing Nx monorepo (Angular)

Does anyone have experience with adding an existing NodeJS API to an existing (Angular) Nx monorepo?
Unfortunately the manual doesn't help me much:
https://nx.dev/migration/manual
The process of migrating a repo into your monorepo requires a few manual steps; I don't think there is a simpler way to do it.
Assuming your node project does not share files with your current monorepo, these should be the steps (a consolidated command sketch follows the list):
On your node repo, create a branch 'to-monorepo', move all the files into folders that match the Nx folder structure, and push the commits.
Remove your package.json file (we will later merge it with the monorepo's one).
Once the folders match the Nx folder structure, it is time to merge into the monorepo. From the monorepo, add the other repo as a remote:
git remote add node-repo <your node repo's git URL>
In the monorepo folder, check out your master branch.
Run a fetch to make the node repo's branches available in the monorepo:
git fetch node-repo
Create a new branch 'merging-node-repo' on your monorepo.
Merge the branch node-repo/to-monorepo into your merging-node-repo branch, preserving the history:
git merge node-repo/to-monorepo --allow-unrelated-histories
Push your new branch (all the code and its history will now be in this new branch).
Remove the node-repo remote from your local monorepo configs:
git remote rm node-repo
Manually merge the node repo's original package.json dependencies into the monorepo's one, and run npm install from the monorepo so that your package-lock.json file is updated. Once you are done, create a commit and push it.
This last step is trickier: you now have to manually update the monorepo's config files so that Nx can start managing the project. This is where the link in your question might help. Once you are done, create a commit and push it.
With these steps done, you can then merge your merging-node-repo branch into master.
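Putting the git steps above together, the sequence looks roughly like this (run from the monorepo root; the remote URL is a placeholder):

# add the node repo as a temporary remote and fetch its branches
git remote add node-repo git@example.com:you/node-repo.git
git checkout master
git fetch node-repo
# merge the prepared branch, keeping its full history
git checkout -b merging-node-repo
git merge node-repo/to-monorepo --allow-unrelated-histories
git push -u origin merging-node-repo
# the temporary remote is no longer needed
git remote rm node-repo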
I recommend creating a separate Nx workspace with a nodejs project in it. This gives you a baseline for all the necessary Nx configurations and dependencies.
You might also want to make sure your project works via nx commands from this separate workspace; that way you have a better chance of getting your monorepo's configuration right.
Hopefully this gets you started.
Here is a solution that I wrote and used to import multiple repos into a single monorepo, under whatever subdirectories are wanted, while maintaining commit history:
https://github.com/marcuswestin/monorepo-merge
I've also found two other scripts that look like they might work, but I haven't tried them:
http://choly.ca/post/git-merge-to-monorepo/
https://github.com/ksindi/monoreaper

How to handle node_modules when pushing a Node.js project to a server using Jenkins?

I use the Publish Over FTP plugin to push the code to the server. The build time is too long because of node_modules.
Should I package a WAR, or run npm install on the server after the FTP transfer?
If possible, I suggest you use git: add a file named .gitignore and add the line below in it.
node_modules/*
and save it. With node_modules ignored, only the source is transferred, and dependencies can be installed where the code runs.
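On the Jenkins side, a minimal scripted-pipeline sketch of that idea (assuming the job checks the project out from git and a package-lock.json is committed):

node {
    checkout scm    // node_modules is ignored, so only source comes down
    sh 'npm ci'     // reproducible install from package-lock.json
}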

Handle build with Node.js/React App - best practice with Jenkins and Docker

Today we've created a Node.js/ReactJS app. We are using Bitbucket (git repo) and Docker containers, along with Jenkins + AWS ECS (Elastic Container Service).
The process we use today: when we are ready to deploy a new version, we go into the /assets directory and run gulp build. This handles the whole build/minification process and in the end gives us the version number. From there we check this into the git repo, and since it has the version, this becomes the tag in the repo. All good, right? :)
From here, in Jenkins, we can simply run the build choosing Prod/Master, for example, and it takes care of grabbing all the npm packages, pushing the Docker image to ECR, and updating the revision within ECS. And then the service is up and running.
It seems to me that we should not be running this gulp build command locally and having to check the output into the git repo. Not to mention this leaves the git repo a bit messy; with other developers it's not a great solution to have the 'compiled' minified files there.
Wouldn't the better practice be to have this gulp build run on Jenkins?
However, we would still like to retain the tagging within the git repo. Is there another way to achieve this?
Has anyone dealt with a similar issue or has a best practice for something like this?
Really curious to hear what you think.
Thanks in advance.
There's no single "best practice", but if you want it to be less messy you can look at using a Jenkinsfile: https://jenkins.io/doc/book/pipeline/jenkinsfile/
It doesn't matter what commands you run; it's simply best practice for Jenkins to do the work, so gulp build should run on Jenkins. The only developer process should be commit and push; Jenkins should handle the rest.
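A minimal Jenkinsfile sketch of that flow; the stage layout, image name, and version scheme are illustrative rather than taken from the original setup, and pushing tags assumes the job has git push credentials:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'npm ci'
                sh 'npx gulp build'   // minification happens on Jenkins, not locally
            }
        }
        stage('Tag') {
            steps {
                // keep version tags in the repo without committing compiled assets
                sh 'git tag -a "v$BUILD_NUMBER" -m "CI build $BUILD_NUMBER"'
                sh 'git push origin "v$BUILD_NUMBER"'
            }
        }
        stage('Image') {
            steps {
                sh 'docker build -t my-app:"$BUILD_NUMBER" .'   // image name is a placeholder
            }
        }
    }
}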

Copy files to directory outside project with Grunt

I'm looking for a more efficient way to deploy my WordPress themes. Right now I create the theme, and when finished I copy the contents into a new folder without all my Node, Grunt and other dev files. My dev environment runs on DesktopServer, which has an auto-deploy option, but this also copies the unwanted dev files.
Could I use Grunt to create a task that, when fired, copies specific files and folders from /themes/dev-theme/ to /themes/production-ready-theme/? This way I would have a clean theme that can easily be zipped or uploaded to the production server.
Update: I just thought of a possible solution: running grunt-contrib-copy from my themes directory. This Grunt module would let me control which files to copy. But perhaps there is a cleaner or more efficient method to accomplish this task.
By using the grunt-shell module you could simply add this to your Gruntfile:
grunt.loadNpmTasks('grunt-shell');

grunt.initConfig({
    shell: {
        moveTemplates: {
            command: 'mv /themes/dev-theme/* /themes/production-ready-theme/'
        }
    }
});
The grunt-contrib-copy module does have the ability to copy files up the directory tree after all. I just tried it out using these settings in my grunt.js file and it worked.
Grunt Shell gets the job done, but if you already have grunt-contrib-copy installed in your project you could just use this:
copy: {
    main: {
        src: ['css/*', 'img/*', 'icons/*', 'js/*', 'lang/*', 'lib/*', '*.php', '*.css'],
        dest: '../production-theme/'
    }
},
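That copy target belongs inside grunt.initConfig; to actually run it, load the module and (optionally) register a task for it. A small sketch, where 'deploy' is just an illustrative task name:

grunt.loadNpmTasks('grunt-contrib-copy');
grunt.registerTask('deploy', ['copy:main']);   // run with: grunt deploy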

How to use remote paths in gitlab ci?

I installed GitlabHQ on one server and GitlabCI on another server. Now I need to integrate GitlabHQ and GitlabCI. When I go to add a new project in GitlabCI, it requests the path to the project's .git, but the project is on the other server, where GitlabHQ is.
I tried to use a remote path, like http://[domain-name]/[user]/[project].git, but it is not accepted.
I researched how GitlabCI resolves the path and found that it does not support remote paths. It uses "Rugged::Repository.new(path)" to open the project locally on the server.
Does anyone know a way to use remote .git paths in GitlabCI?
As illustrated by Issue 36:
Actually the purpose of gitlab-ci implies that you install it on deployment point. You install it where you deploy your project
So you are supposed to use a local non-bare repo.
You could, in your case, clone your remote repo on the gitlab-ci server, and use that local path.
In order to build an integration between gitlab and gitlab-ci (a command sketch follows these steps):
Add the gitlab_ci user to the git group for read access.
Clone your project from /home/git/repositories to somewhere like /home/gitlab_ci/projects/...
Add this project to CI.
Set up GitlabHQ to use the CI service.
That's all.
On each push, GitlabHQ will trigger GitlabCI to run git fetch origin, so the testing repo will always be up to date.
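The first two steps might look like this on the CI server; the user and group names come from the steps above, and 'myproject' is a placeholder:

# give the gitlab_ci user read access to the repositories
sudo usermod -a -G git gitlab_ci
# clone locally, so Rugged::Repository.new can open the path
git clone /home/git/repositories/myproject.git /home/gitlab_ci/projects/myproject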
