CI - Using the right package.json while on another branch - node.js

I have a repo with master and branch1. I'm writing tests on branch1 with mocha/chai, and I've changed the package.json (I forked a repo and made some changes to it), but Travis still seems to build the old one, even though I changed my package.json. I just forked and replaced the version in the package.json with the name of the repo (like every time).
Has anyone experienced something similar? Am I missing the right way to make Travis build against the package.json that is in the pull request I'm working on?

just trying to clarify some things in your question :)
Are you submitting a pull request to the upstream repo that you forked from, and wanting the upstream repo's Travis integration to build your code?
If so, it may be that the upstream repo's maintainer doesn't have the "Build PR" setting turned on in Travis for their repo. You could ask them. See this question for more details. Or maybe your PR can't be merged, as described in the Travis docs.
Or are you working on a branch within your own repo, which you forked from upstream?
If so, you need to set up Travis integration yourself on your own forked repo.
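In case it's the latter, a minimal .travis.yml for a Node project like this looks something like the sketch below. Travis defaults to npm install and npm test for node_js projects, so whichever branch or PR it checks out determines which package.json gets built:

language: node_js
node_js:
  - "node"
# install/script default to npm install / npm test,
# so the mocha/chai suite runs against the checked-out package.json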

I solved my problem by specifying the commit ID of the node module in the package.json. Probably not the best way to accomplish this, but the tests pass now.
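For anyone after the exact syntax: pinning a forked module to a commit in package.json looks something like this (the repo name and commit SHA below are placeholders):

{
  "dependencies": {
    "some-module": "github:username/some-module#<commit-sha>"
  }
}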

Related

Add existing NodeJS (node, Express, mongodb, ...) project to an existing Nx monorepo (Angular)

Has anyone worked out how to add an existing NodeJS API to an existing (Angular) Nx monorepo?
Unfortunately the manual doesn't help me much:
https://nx.dev/migration/manual
Migrating a repo into your monorepo requires a few manual steps; I don't think there is a simpler way to do it.
Assuming your node project does not share files with your current monorepo, these should be the steps:
1. On your node repo, create a branch 'to-monorepo', and in it move all the files into folders that match the Nx folder structure, then push the commits.
2. Remove your package.json file (we will later merge it into the monorepo's one).
3. Once the folders match the Nx folder structure, it's time to merge into the monorepo. From the monorepo, add the other repo as a remote:
git remote add node-repo <your node repo's git URL>
4. In the monorepo folder, check out your master.
5. Run a fetch to make the node repo's branches available in the monorepo:
git fetch node-repo
6. Create a new branch 'merging-node-repo' in your monorepo.
7. Merge the branch node-repo/to-monorepo into your merging-node-repo branch, preserving the history:
git merge node-repo/to-monorepo --allow-unrelated-histories
8. Push your new branch (all the code and its history will now be listed in this new branch).
9. Remove the node-repo remote from your local monorepo config:
git remote rm node-repo
10. Manually merge the node repo's original package.json dependencies into the monorepo's one, and run npm install from the monorepo so that your package-lock.json file gets updated. Once you are done, create a commit and push it.
11. This last step is trickier: you now have to manually update the monorepo's config files so that Nx can start managing the project. This is where the link in your question might help. Once you are done, create a commit and push it.
With these steps you can then merge your merging-node-repo branch into master.
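Putting the git commands from those steps together, the whole sequence run from the monorepo looks roughly like this:

# from the monorepo, with master checked out
git remote add node-repo <your node repo's git URL>
git fetch node-repo
git checkout -b merging-node-repo
git merge node-repo/to-monorepo --allow-unrelated-histories
git push -u origin merging-node-repo
git remote rm node-repo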
I recommend creating a separate Nx workspace with a Node.js project in it. That gives you a baseline for all the necessary Nx configuration and dependencies.
You might also want to make sure your project works via Nx commands from this separate workspace; that way you will have a better chance of getting your monorepo's configuration right.
Hopefully this gets you started.
Here is a solution that I wrote and used to import multiple repos into a single monorepo, under whatever subdirectories are wanted, while maintaining commit history:
https://github.com/marcuswestin/monorepo-merge
I've also found two other scripts that look like they might work, but I haven't tried them:
http://choly.ca/post/git-merge-to-monorepo/
https://github.com/ksindi/monoreaper

Make Bitbucket pipeline use new package.json version after running npm version patch

I have a Bitbucket pipeline for a StencilJS project, with a first step where I bump the version number in package.json using npm version patch. This works fine, and I get it pushed back to the repository and all, without problems. The next step in the pipeline is where I build the StencilJS project. The problem is that the project is built using the old version number, not the one I bumped it to. So if the original version in package.json is 1.0.3, step one bumps the version to 1.0.4 and pushes it to the repository. I want the next step to build the components using version 1.0.4, but it doesn't; it still uses 1.0.3 when building.
Does anyone know how I can make the build come out with version 1.0.4?
Kind regards,
Lars
I made it work by putting npm version patch into the same step where I do the build.
The problem is probably that your pipeline keeps running on the commit prior to you tagging and committing back. You could set up another pipeline that does the Stencil build and triggers when a new tag is created.
To avoid an infinite loop I had to add [skip ci] to the version-patch commit message.
My pipeline looks like this:
pipelines:
  branches:
    master:
      - step:
          name: Patch version
          script:
            - 'v=$(npm version -m "%s [skip ci]" patch)'
            - 'git push origin ${v}'
            - 'git push'
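If you go the tag-triggered route mentioned above instead, a minimal sketch could look like this (the v* glob and the build commands are assumptions):

pipelines:
  tags:
    'v*':
      - step:
          name: Build Stencil project
          script:
            # runs only for pushed tags matching v*,
            # i.e. after npm version has created the new version tag
            - npm ci
            - npm run build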

Editing an NPM package but keeping it up to date

What is the best way to edit an npm package so that it contains my own code but still stays up to date with the upstream package (not overwritten, kind of like git merge)?
Attempt 1:
So right now I've cloned the Git repo to a folder, and in my package.json I declared the dependency to require that folder:
{
  "dependency": "file:lib/dependency"
}
I also added my personal Git repo as origin and the dependency's original repo as upstream. The idea here is that when the maintainers update the original repo I can pull from upstream, and when I want to push my personally changed code I can push to origin.
Please let me know what you think of this approach and whether there's anything I'm doing wrong or should do differently. Thanks in advance.
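For what it's worth, the remote setup described above amounts to something like this (the URLs are placeholders):

# inside lib/dependency
git remote add origin <your fork's git URL>
git remote add upstream <original repo's git URL>

# pull in the maintainers' updates and merge them with your changes
git fetch upstream
git merge upstream/master

# publish your own changes to your fork
git push origin master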

Handle build with Node.js/React App - best practice with Jenkins and Docker

We have created a Node.js/ReactJS app. We are using Bitbucket (git repo) and Docker containers, along with Jenkins + AWS ECS (Elastic Container Service).
The process we use today: when we are ready to deploy and have a new version ready, we go into the /assets directory and run gulp build. This handles the whole build/minification process and in the end gives us the version number. From there we check the output into the git repo, and since it carries the version, this becomes the tag in the repo. All good, right? :)
From here, in Jenkins we can simply run the build, choosing Prod/Master for example, and it takes care of grabbing all the npm packages, pushing the Docker image to ECR, and updating the revision within ECS. And then the service is up and running.
It seems to me that we should not be running this gulp build command locally and having to check the output into the git repo. Not to mention this leaves the git repo a bit messy, and with other developers it's not a great solution to have the 'compiled' minified files there.
Wouldn't the better practice be to have this gulp build run over on Jenkins?
However, I believe we would still like to retain the tagging within the git repo? Is there another way to achieve this?
Has anyone dealt with a similar issue or has a best practice for something like this?
Really curious to hear what you think.
Thanks in advance.
There's no "best practice" but if you want it to be less messy you can look at using a Jenkinsfile: https://jenkins.io/doc/book/pipeline/jenkinsfile/
It doesn't matter what commands you have; it's just best practice for Jenkins to do it, so gulp build should run on Jenkins. The only local process should be commit and push. Jenkins should handle the rest.
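A minimal declarative Jenkinsfile along those lines might look like this (a sketch; the stage layout and the gulp invocation are assumptions based on the setup described in the question):

pipeline {
    agent any
    stages {
        stage('Install') {
            steps {
                // grab all the npm packages on Jenkins, not locally
                sh 'npm ci'
            }
        }
        stage('Build') {
            steps {
                // the build/minification step, moved off developer machines
                sh 'npx gulp build'
            }
        }
        // building the Docker image, pushing it to ECR and updating the
        // ECS revision would follow here as further stages
    }
}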

Heroku does not update node.js > package.json GitHub tarball dependencies

I maintain a dependency on GitHub which I use in my project. I placed the dependency as a tarball link (viz. https://github.com/username/dependecy/tarball/master) in the package.json and it works fine locally, as expected. When I update the package on GitHub I can run an npm install and all dependencies, including the GitHub tarballs, get updated. However, this is not the case on Heroku: tarball-linked dependencies do not get updated. Any ideas?
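For reference, the entry described above looks something like this in package.json (reusing the question's placeholder URL):

{
  "dependencies": {
    "dependecy": "https://github.com/username/dependecy/tarball/master"
  }
}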
I had a similar problem. My app had the dependency with caret versioning, like this:
"dependency": "^0.6",
So every time the dependency got a patch version update, I wanted Heroku to pick up the updated dependency without any commits/pushes to my app. For that, just in case, I set:
heroku config:set NODE_MODULES_CACHE=false
And when the new patch version became available I did a manual redeploy of the already-deployed app from the Heroku Dashboard.
Can you try the same in your case? Possibly this will help you.
Since no one has answered this yet, I will share what I have learned. The trick is getting Heroku to think the tarball is different or new so that it downloads it again. As @celalo suggested, you can remove it or change the path, commit, push, change it back, commit and push again. This is messy, but it works.
What I ended up doing was making a master1 branch. I keep the branch in sync with master and then alternate the tarball URL between master and master1 when I need it to update.
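In other words (reusing the question's placeholder URL), each time an update should be picked up, the entry in package.json flips from

"dependecy": "https://github.com/username/dependecy/tarball/master"

to

"dependecy": "https://github.com/username/dependecy/tarball/master1"

and back again the next time.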
