I am a bit new to Azure, and today I am trying to create a pipeline that publishes an npm package to Azure Artifacts.
The issue is that after the pipeline builds successfully, I can see the published package in the artifacts. However, the published package is almost empty:
there is only package.json and README.md, and no dist folder at all.
Here is my Pipeline:
# Node.js
# Build a general Node.js project with npm.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/javascript
trigger:
- master

pool:
  vmImage: ubuntu-latest

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm run build
    npm publish
  displayName: 'npm install and build and publish'
Also, when I build the project locally and run npm publish, the package is published as it should be, with all files in place.
Is there something I am doing wrong?
Finally I found the issue.
The pipeline definition was actually right, apart from one little thing:

versionSpec: '10.x'

The Node version was wrong: a really old one. The definition was originally copied from one of the official Azure manuals, so the version came from some really old year. Changing it to:
versionSpec: '14.x'
made the build succeed, with all files in place.
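For reference, here is the corrected NodeTool step in full; only the versionSpec value changes from the original pipeline:

- task: NodeTool@0
  inputs:
    versionSpec: '14.x'
  displayName: 'Install Node.js'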
Hope this is helpful for somebody here.
When publishing packages to npm, you need to authenticate with your credentials. You can run it successfully locally because of the .npmrc file saved on your computer. When npm publish runs on CI, that file doesn't exist, which results in an error. Please try the following steps:
Generate an automation access token from this URL: https://www.npmjs.com/
Go to your repo and add a file named ".npmrc" with the content //registry.npmjs.org/:_authToken={your-token-value}
Notice:
It is recommended to set the access token as a secret variable in the pipeline library rather than committing it in plain text (see the sketch below).
Use lowercase words for the package name in package.json; otherwise you will receive a 400 error.
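As a sketch of the environment-variable approach, assuming the token is stored in the pipeline library as a secret variable named NPM_TOKEN (the variable name is an assumption), the .npmrc can reference it via npm's environment-variable expansion:

//registry.npmjs.org/:_authToken=${NPM_TOKEN}

and the publish step must map the secret into the environment explicitly, since secret variables are not exposed to scripts by default:

- script: npm publish
  displayName: 'npm publish'
  env:
    NPM_TOKEN: $(NPM_TOKEN)  # secret variable from the pipeline library (assumed name)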
The agent is not able to build and generate the build folder for a React project when using the Windows platform (with both Microsoft-hosted and self-hosted agents), but it works fine with Ubuntu.
This is the Linux YAML file for the Microsoft-hosted agent:
https://gist.github.com/yogeswaran-gnrgy/0354d455e6c85d387281eb75d1a326f1
This is the Windows YAML file for the Microsoft-hosted agent:
https://gist.github.com/yogeswaran-gnrgy/816b9f06dbe0039c07ad1293d2fce141
This is the log generated during the build step using the Microsoft-hosted agent:
https://gist.github.com/yogeswaran-gnrgy/acbc3c2a268ea3b514cc423726b0a751
The self-hosted agent has both npm and Node installed. What can be the problem?
In Linux:

- script: |
    echo Executing install
    npm install
    echo Executing build
    npm run build
  displayName: 'Building the project'
When run as a script, npm install and npm run build work fine only on Linux. For Windows, it must look like this:
- task: Npm@1
  displayName: 'Installing dependencies'
  inputs:
    verbose: false
    customCommand: install

- task: Npm@1
  displayName: 'Building the project'
  inputs:
    command: custom
    verbose: false
    customCommand: run build
Apologies in advance that I can't share source for this one:
I've got a client that has to use Azure DevOps Pipelines to build a GitHub Enterprise hosted project.
It is a perfectly regular Node.js project with jest specified as a devDependency in package.json.
When npm install runs on an Azure Pipeline, jest doesn't get installed. I created a local x64 Linux agent on Ubuntu 18 on my desktop, and it doesn't get installed there either, but when I manually run npm install inside the /s/ directory everything is okay.
What is Azure Devops doing to the script that this is the result?
I tested running npm install both in an Azure Pipeline and on a local machine, and they behave the same: jest is installed in the node_modules folder.
Here are my steps; you can refer to them.
The file structure: I also added jest to the devDependencies in package.json.
"devDependencies": {
"enzyme": "^3.3.0",
"enzyme-adapter-react-16": "^1.1.1",
"enzyme-to-json": "^3.3.3",
"jest": "^22.4.3"
}
Use the npm install task in the Azure Pipeline:
- task: Npm@1
  displayName: 'npm install'
  inputs:
    workingDir: package
    verbose: false
Run the pipeline, and jest will be installed in the node_modules folder.
By the way, I tested the same steps on the Microsoft-hosted agent ubuntu-18.04 and it worked fine.
Updates
For running npm install with a script, here is an example:

steps:
- script: |
    cd $(build.sourcesdirectory)/package
    npm install
  displayName: 'Command Line Script'
The first line navigates to the source folder (the one containing the package.json file); npm install then installs jest along with the other devDependencies.
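Alternatively, a minimal sketch using the script step's built-in workingDirectory input instead of an explicit cd (assuming the same folder layout as above):

steps:
- script: npm install
  workingDirectory: '$(Build.SourcesDirectory)/package'  # folder containing package.json
  displayName: 'npm install'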
I tried to set up an Azure Build Pipeline that uses Bazel (0.26.0).
My pipeline YAML definition file looks like this:
trigger:
- master

pool:
  vmImage: 'windows-2019'

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'

- script: |
    bazel version
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
Currently, I am only trying to find out which Bazel version is installed by calling bazel version, but Azure DevOps reports:
'bazel' is not recognized as an internal or external command,
operable program or batch file.
Cmd.exe exited with code '9009'.
I wonder how I can install and run Bazel in an Azure pipeline. Any hints on this?
It seems that this project got it working, but I do not understand how.
Update in case anyone comes across this thread as of 2022:
Bazel is now installed on Microsoft-hosted agents for macOS-latest, ubuntu-latest, and windows-latest.
Try running this pipeline; you'll be able to see which version of Bazel is installed on each OS.
strategy:
  matrix:
    linux:
      imageName: ubuntu-latest
    mac:
      imageName: macOS-latest
    windows:
      imageName: windows-latest

pool:
  vmImage: $(imageName)

steps:
- script: bazel version
  displayName: Show bazel version
You got this error because you are using a Microsoft-hosted agent, and Bazel is not installed on those agents. In the example you provided, they use a self-hosted (private) agent and install Bazel on the agent machine. You have two options:
1) Install a self-hosted agent on your private machine and install Bazel on that machine.
2) Install Bazel during the build pipeline with choco (a simple script task):
choco install bazel
After you install it, you can use it.
P.S. I tried installing via choco and got an error, but Bazel was indeed installed, and in the next step bazel version gave results; so set continueOnError: true on the installation task. (The error is in the Python step; if your project doesn't use Python, it's fine.)
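A minimal sketch of that suggestion (continueOnError lets the pipeline carry on even though choco reports a failure):

- script: choco install bazel
  displayName: 'Install Bazel'
  continueOnError: true  # choco may report an error even though bazel is installed

- script: bazel version
  displayName: 'Show bazel version'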
This is how I install Bazel in an Azure Pipeline on a windows-2019 image:
steps:
- script: |
    echo 'Install Bazel via Choco'
    choco install bazel
  displayName: 'Install dependencies'

- script: |
    bazel version
  displayName: 'Show bazel version'
In the meantime, Bazel and Bazelisk come preinstalled. It seems Bazel is taking over the world.
Tested and works with: macOS-10.15, macOS-11, macOS-12, ubuntu-20.04, ubuntu-22.04, windows-2019, windows-2022
I am new to Node.js. I was trying to deploy a Node.js project via GitLab CI, but after spotting the build error in the pipeline I realized I had added the node_modules folder to .gitignore, so I am not pushing node_modules to GitLab. The node_modules folder is 889 MB locally, so there is no way I will push it. What approach should I use to get the node_modules folder from somewhere else?
That is, the node_modules path is always present and accessible on the remote server! Do I need to include that path in package.json?
Can node_modules be maintained by using Docker? If so, how would I keep it up to date for every project?
You are right not to check in the node_modules folder; it is automatically populated when you run npm install.
This should be part of your build pipeline in GitLab CI. The pipeline allows multiple stages and the ability to pass artifacts through to the next stage. In your case you want to save the node_modules folder created by running npm install; you can then use the dependencies for tests or deployment (see the test-job sketch after the pipeline below).
Since npm v5 there is a lockfile (package-lock.json) to make sure what you run locally is the same as what runs on the server.
You can also use something like Renovate to automatically update your dependencies if you want to pin them and automatically manage security updates. (Renovate is open source, so it can be run on GitLab.)
A really simple GitLab CI pipeline could be:

# .gitlab-ci.yml
stages:
  - build
  - deploy

build:
  stage: build
  script:
    - npm install
  artifacts:
    name: "${CI_BUILD_REF}"
    expire_in: 10 mins
    paths:
      - node_modules

deploy:
  stage: deploy
  script:
    - some deploy command
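As a sketch of how a later stage could reuse those dependencies (the test stage, job name, and npm test command are assumptions, not part of the pipeline above):

test:
  stage: test            # remember to add 'test' to the stages list
  dependencies:
    - build              # download the node_modules artifact from the build job
  script:
    - npm test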
I just set up GitLab on DigitalOcean to keep track of versions of some projects, but now that I've read a little about GitLab, I wonder if you can set up GitLab CI so that each commit automatically triggers a build of the application and, if the build is successful, a deploy to OpenShift.
I think my .gitlab-ci.yml should look something like this:
stages:
  - build
  - deploy

before_script:
  - npm install

job_build:
  stage: build
  script:
    - grunt build

job_deploy:
  stage: deploy
But I really do not know if this is valid, nor how to tell GitLab CI that it only has to do a git push to the OpenShift repository.
After much reading and searching I finally found documentation about this [1]. In the end I solved it with the following .gitlab-ci.yml:
stages:
  - build
  - deploy

job_build:
  stage: build
  script:
    - npm install -g grunt-cli
    - npm rebuild node-sass
    - npm install
    - grunt build

job_deploy:
  stage: deploy
  script:
    - apt-get update -yq
    - apt-get install -y ruby-dev rubygems
    - gem install dpl
    - dpl --provider=openshift --user=$OPENSHIFT_USER --password=$OPENSHIFT_PASS --domain=mydomain --app=example
  only:
    - master
The magic happens with a Travis CI library called dpl [2], which supports a lot of providers [3].
[1] http://doc.gitlab.com/ce/ci/deployment/README.html
[2] https://github.com/travis-ci/dpl
[3] https://github.com/travis-ci/dpl#supported-providers