What is the general flow of the build-pipeline -> release-pipeline?
build-pipeline:
npm install
npm run build (builds the app)
copy files (artifact)
publish artifacts
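For reference, here is a sketch of those build steps as Azure Pipelines YAML (the Npm@1, CopyFiles@2, and PublishBuildArtifacts@1 tasks are the standard ones; the dist output folder and artifact name are assumptions):
steps:
- task: Npm@1
  inputs:
    command: 'install'
- task: Npm@1
  inputs:
    command: 'custom'
    customCommand: 'run build'   # builds the app
- task: CopyFiles@2
  inputs:
    SourceFolder: 'dist'         # assumed build output folder
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'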
release-pipeline?
What am I supposed to do here?
Create a new Docker image? (The Docker image build itself also runs a build.)
Do I need to create a new Docker image in the previous build stage and make an artifact out of it, and then push that new image to Heroku in the release-pipeline?
Related
We need to build Docker images using a self-hosted Linux agent which is deployed as a Docker container (in Azure Container Instances).
As of now, the agent is an Ubuntu image; however, to enable building images inside this container I thought of using the Kaniko image. I haven't figured out how to run the Kaniko image without executing Kaniko itself right away (we need to run the DevOps agent primarily and run Kaniko on demand).
Any hints? Or better ideas for how to build Docker images inside a running Docker container?
Solved with the following code; however, Kaniko does not work as expected when running inside my container (I tested the same parameters with Kaniko inside my container and in the default container; in my container it cannot authenticate to ACR).
Might end up with the VMSS DevOps agent...
FROM whatever-base-image
...
# Copy the Kaniko executor binary out of the official image so it can be
# invoked on demand rather than as the container's entrypoint
COPY --from=gcr.io/kaniko-project/executor /kaniko/executor /kaniko/executor
Ref: https://github.com/GoogleContainerTools/kaniko/issues/2058#issuecomment-1104666901
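With the executor binary baked into the agent image, a pipeline step can invoke it on demand instead of at container start. A minimal sketch (the registry, image name, and step layout are assumptions; --context, --dockerfile, and --destination are Kaniko's standard flags):
steps:
- script: |
    /kaniko/executor \
      --context "$(Build.SourcesDirectory)" \
      --dockerfile Dockerfile \
      --destination myregistry.azurecr.io/my-app:$(Build.BuildId)
  displayName: 'Build and push image with Kaniko'
Note that Kaniko still needs registry credentials (e.g. a Docker config.json at /kaniko/.docker/config.json), which is where the ACR authentication issue above comes in.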
I would like to do a Java build into a pipeline artifact and then put it into a Docker image.
I can't figure out a way to add the artifact to a Dockerfile in one pipeline.
I would like one pipeline to build the Java code and build a Docker image with the JAR.
Azure DevOps Pipeline - build Docker with artifacts
You could set the target path to $(Build.ArtifactStagingDirectory) for the task that builds the Java code.
Then, add a Docker task above the publish task, configured to "Build", and set the "Build Context" in the task to $(Build.ArtifactStagingDirectory). That's the root path Docker will use for commands like COPY in a Dockerfile.
Then set the Dockerfile path in the task to match its location:
FROM mcr.microsoft.com/dotnet/core/aspnet:2.2
WORKDIR /app
COPY . .
ENTRYPOINT ["dotnet", "myAppNameHere.dll"]
Since you've set the Docker build context to $(Build.ArtifactStagingDirectory), where your app has been published, the COPY command will use that as its current working directory. The COPY line translates to: "copy everything in $(Build.ArtifactStagingDirectory) to the /app folder inside the container."
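Putting it together, a minimal YAML sketch of such a pipeline (Maven@3, CopyFiles@2, and Docker@2 are the standard tasks; the repository name, the Dockerfile location, and the JAR landing in target/ are assumptions):
steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)'
    Contents: |
      target/*.jar
      Dockerfile
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
    flattenFolders: true
- task: Docker@2
  inputs:
    command: 'build'
    repository: 'my-java-app'
    Dockerfile: '$(Build.ArtifactStagingDirectory)/Dockerfile'
    buildContext: '$(Build.ArtifactStagingDirectory)'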
I have a Node.js website that runs fine locally with node server.js. I added a Dockerfile:
FROM node:carbon
VOLUME ["/root"]
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./
RUN npm install
# If you are building your code for production
# RUN npm install --only=production
# Bundle app source
COPY . .
EXPOSE 8080
CMD [ "npm", "start" ]
If I deploy my app with gcloud app deploy, I can access it online via a URL. I believe my project is an 'App Engine' project? If I run subsequent gcloud app deploy commands, my new code gets pushed to the online site. But I can't get GitHub master commits to trigger and publish a new build.
I tried adding a trigger so that every time new code gets added to the master branch of my public GitHub repo, it gets deployed to my production URL.
Full Trigger: [screenshot of the Cloud Build trigger configuration]
So I merge a PR into the master branch of my GitHub repo. I look in my build history and see there is a new build; clicking the commit takes me to the PR I just merged into the master branch.
But if I access my website URL, the new code is not there. If I run gcloud app deploy again, it eventually appears. My trigger seems to be working fine from the logs, so why is my build not getting published?
I think the problem might be that you're using a Dockerfile instead of a Cloud Build configuration file... unless there's something else I'm not seeing.
Look here under the fourth step, fifth bullet, for the solution. It says:
Under Build configuration, select Cloud Build configuration file.
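A trigger that only runs docker build produces an image but never publishes it to App Engine; the build itself has to run the deploy. A minimal cloudbuild.yaml sketch (gcr.io/cloud-builders/gcloud is the standard builder image; the timeout value and the app.yaml in the repo root are assumptions):
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['app', 'deploy']   # assumes an app.yaml in the repo root
timeout: '1600s'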
I'm currently using Azure DevOps Server 2019 (on-premises) to deploy an ASP.NET app (CI/CD).
Is it possible to deploy this app to run via a Docker container on a Windows VM?
I'm currently following the examples in this link on how to run an ASP.NET app in a Docker container:
https://learn.microsoft.com/en-us/aspnet/core/host-and-deploy/docker/building-net-docker-images?view=aspnetcore-3.1
How could I do the same utilising Azure DevOps Server 2019?
Basically, most if not all of the resources/guides/how-tos I saw point to deploying to the Azure cloud or Docker Hub.
Is it possible to deploy this app to run via a docker container to a Windows VM?
Yes, it is possible. You will need to create a self-hosted agent on the Windows VM to which you deploy your app. You can just use a PowerShell task to run docker build and docker run on the self-hosted agent, without the need to upload the image to ACR/Docker Hub.
Of course, you can also upload the built image to ACR/Docker Hub as @Aravind mentioned, and have a PowerShell task that pulls the image.
The main idea is to use a PowerShell task to run docker commands on the agent hosted on the Windows VM. You can refer to the steps below.
1. Create a self-hosted agent. Please check the detailed steps here.
2. Create a build pipeline.
Here is an example of creating a YAML pipeline.
Here is an example of creating a classic UI pipeline.
3. Customize your build pipeline. Use a single PowerShell task to run the docker build and docker run commands as described in the tutorial. (You can also use a Docker task to build and push the image to ACR/Docker Hub, and then use a PowerShell task to pull and run the image, as @Aravind mentioned.)
steps:
- powershell: |
    docker build -t aspnetapp .
    # run detached (-d) rather than -it: a pipeline step has no interactive TTY
    docker run -d --rm -p 5000:80 --name aspnetcore_sample aspnetapp
  displayName: 'PowerShell Script'
Note: please make sure Docker is installed on the Windows VM (the PowerShell task will invoke the Docker CLI installed on the VM), and run your pipeline on the self-hosted agent by choosing the agent pool where it resides (the pool that includes the agent is decided when the agent is created).
I'm working on a Node.js application for which my current Dockerfile looks like this:
# Stage 0
# =======
FROM node:10-alpine as build-stage
WORKDIR /app
COPY package.json yarn.lock ./
RUN yarn install
COPY . ./
RUN yarn build
# Stage 1
# =======
FROM nginx:mainline-alpine
COPY --from=build-stage /app/build /usr/share/nginx/html
I'd like to integrate this into a GitLab CI pipeline, but I'm not sure if I've got the basic idea. So far I know that I need to create a .gitlab-ci.yml file, which will later be picked up by GitLab.
My basic idea is:
I push my code changes to GitLab.
GitLab builds a new Docker image based on my Dockerfile.
GitLab pushes this newly created image to a "production" server (later).
So, my question is:
My .gitlab-ci.yml should then contain something like a build job which triggers... what? The docker build command? Or do I need to "copy" the Dockerfile content to the CI file?
GitLab CI executes the pipeline in Runners that need to be registered with the project using generated tokens (Settings > CI/CD > Runners). You can also use Shared Runners across multiple projects. The pipeline is configured with the .gitlab-ci.yml file, and you can build, test, push, and deploy Docker images from that file whenever something happens in the repo (a push to a branch, a merge request, etc.).
It's also useful when your application already has the Dockerfile that can be used to create and test an image.
So basically you need to install the runner, register it with the token of your project (or use Shared Runners), and configure your CI YAML file. The recommended approach is Docker-in-Docker, but it is up to you. You can also check this basic example. Finally, you can deploy your container directly to Kubernetes, Heroku, or Rancher. Remember to safely configure your credentials and secrets in Settings > Variables.
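A minimal .gitlab-ci.yml sketch along those lines, assuming the Docker-in-Docker service and the registry variables GitLab predefines for every project (the image versions, tag scheme, and job name are assumptions):
image: docker:24
services:
  - docker:24-dind

build-image:
  stage: build
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"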
Conclusion
GitLab CI is awesome, but I recommend that you first think about the Git workflow you want to use in order to set the stages in the .gitlab-ci.yml file. This will allow you to configure your Node project as a pipeline, and it would then be easy to port to other tools such as Jenkins pipelines or Travis, for example.
Build job trigger:
Option 1:
Add when: manual to the job, and you can run the job manually under CI/CD > Pipelines.
Option 2:
only:
  - <branchname>
In this case the job starts when you push to the defined branch.
(This is my personal suggestion.)
Option 3:
Add nothing, and the job will run every time you push code.
Of course, you can combine the options above; see the sketch below.
In addition, you may start the job with a web request by using a trigger token.
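For example, a sketch combining options 1 and 2 (the job name, branch, and image tag are placeholders):
build-image:
  stage: build
  only:
    - master
  when: manual            # option 1: run manually from CI/CD > Pipelines
  script:
    - docker build -t my-app .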
The docker build command will work in the pipeline, in the script section.
Requirement: a Docker engine on the gitlab-runner that picks up the job.
Or do I need to "copy" the Dockerfile content to the CI file?
No.