I have a dev repository with build & deploy stages. I want to create a production repository with a deploy stage only; I don't want to build there. How can I configure this production repository to deploy microservices without a build stage? The deploy stage should take images from the dev repo and deploy them to the production Kubernetes cluster. How do I pass images from the dev repo to the prod repo?
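One common pattern (a sketch only; the registry path, variable names and deployment name below are assumptions) is to let the dev pipeline push images to its container registry, and have the production pipeline pull those images by tag and roll them out, so no build stage is needed:

```yaml
# .gitlab-ci.yml in the production repository (hypothetical names)
stages:
  - deploy

deploy:
  stage: deploy
  image: bitnami/kubectl:latest
  script:
    # IMAGE_TAG is supplied manually or via a pipeline trigger from the dev repo
    - kubectl set image deployment/my-service my-service=registry.gitlab.com/my-group/dev-repo/my-service:$IMAGE_TAG
```

The production cluster also needs pull access to the dev repository's registry, e.g. via a deploy token configured as an image pull secret.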
Related: what is the general flow of build-pipeline -> release-pipeline?
build-pipeline:
- npm install
- npm run build (builds the app)
- copy files (artifact)
- publish artifacts
release-pipeline: what am I supposed to do here?
Create a new Docker image? (The Docker image build itself also runs a build.)
Do I need to create a new Docker image in the previous build stage, make an artifact out of it, and then in the release pipeline push that image to Heroku?
I am currently trying to deploy to multiple countries with GitLab. My current .gitlab-ci.yml:
```yaml
action-deploy:
  stage: deploy
  script:
    - ls $PROJECT_URL
  environment:
    name: DE/$CI_COMMIT_BRANCH
    url: https://$PROJECT_URL
```
General problem:
Here DE (Germany) is a single country. How do I gather all available XXXXXXX/$CI_COMMIT_BRANCH environments from the GitLab project settings and deploy to all of them?
Main question:
Is there a way in GitLab CI to get all available environments?
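There is no CI keyword that enumerates environments, but the GitLab REST API exposes them via GET /projects/:id/environments. A sketch of a job that lists them (the API token variable name is an assumption; CI_API_V4_URL and CI_PROJECT_ID are predefined GitLab variables):

```yaml
list-environments:
  stage: deploy
  image: alpine:latest
  script:
    - apk add --no-cache curl jq
    # GITLAB_API_TOKEN would be a project access token with read_api scope,
    # stored in Settings > CI/CD > Variables
    - >
      curl --header "PRIVATE-TOKEN: $GITLAB_API_TOKEN"
      "$CI_API_V4_URL/projects/$CI_PROJECT_ID/environments" | jq -r '.[].name'
```

Note that the endpoint is paginated, so a project with many environments needs to follow the pagination headers to collect them all.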
I am trying to build a CI/CD pipeline in GitLab for our Bitbucket repository. There are two projects in a single Bitbucket repository: a frontend (Angular) and a backend (.NET Core). I am trying to build a GitLab pipeline for that project, but how do I write gitlab-ci.yaml to build, test and deploy both projects? There is no option to write a separate .gitlab-ci.yaml for each project. The repository hierarchy is given below:
there are two folders: web (frontend project, Angular) and backend (backend project, .NET Core)
You could work with parent-child pipelines: in your pipeline you build the frontend part and then trigger a separate pipeline for the backend part.
```yaml
stages:
  - frontend
  - backend

frontend:
  stage: frontend
  script:
    - echo "build frontend"

backend:
  stage: backend
  trigger:
    include:
      - project: 'tutorsinc'
        file: '.build-backend.yml'
    strategy: depend
```
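The referenced .build-backend.yml would then hold the backend jobs. A minimal sketch of what that child configuration could look like (the image and commands are assumptions):

```yaml
# .build-backend.yml in the 'tutorsinc' project (hypothetical contents)
build-backend:
  image: mcr.microsoft.com/dotnet/sdk:6.0
  script:
    - cd backend
    - dotnet build
```

With strategy: depend, the parent's backend job waits for and mirrors the status of this child pipeline.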
I have a GitLab CI pipeline which builds a few artifacts. For example:
```yaml
train:job:
  stage: train
  script: python script.py
  artifacts:
    paths:
      - artifact.csv
    expire_in: 1 week
```
Now I deploy the repository to OpenShift using the following step in my GitLab pipeline. This will pull my GitLab repo into OpenShift, but it does not include the artifacts created in the earlier stages.
```yaml
deploy:app:
  stage: deploy
  image: ayufan/openshift-cli
  before_script:
    - oc login $OPENSHIFT_DOMAIN --token=$OPENSHIFT_TOKEN
  script:
    - oc start-build my_app
```
How can I let OpenShift use this repository, plus the artifacts created in my pipeline?
In general OpenShift build pipelines rely on the s2i build process to build applications.
The best practice for reusing artifacts between s2i builds would either be through using incremental builds or chaining multiple BuildConfig definitions (the output image of one BuildConfig being fed as source image into another BuildConfig) together via the spec.source.images or spec.source.git configuration in the BuildConfig definition.
In your case, since you are using a GitLab pipeline to generate your artifacts instead of the OpenShift build process, you really only need to combine your artifacts with your source code and the runtime container image.
To do this you might create a builder container image that pulls those artifacts down from an external source during the assemble phase (via curl, wget, etc) of the s2i workflow. You could then configure your BuildConfig to point at your source repository. At build time the BuildConfig will pull down your source code and the assemble script will pull down your artifacts.
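As a sketch, such an assemble override could live in .s2i/bin/assemble in the application repository; the artifact URL, file name and the builder's assemble path below are assumptions:

```shell
#!/bin/bash
# .s2i/bin/assemble (hypothetical): fetch pre-built artifacts, then hand off
# to the original assemble script shipped with the builder image
set -e
curl -fSL -o ./artifact.csv "https://example.com/ci-artifacts/artifact.csv"
exec /usr/libexec/s2i/assemble
```

Alternatively, the GitLab job could download the artifacts itself and push everything as a binary build with oc start-build my_app --from-dir=.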
I'm working on a Node.js application for which my current Dockerfile looks like this:
```dockerfile
# Stage 0
# =======
FROM node:10-alpine as build-stage
WORKDIR /app
COPY package.json yarn.lock ./
RUN yarn install
COPY . ./
RUN yarn build

# Stage 1
# =======
FROM nginx:mainline-alpine
COPY --from=build-stage /app/build /usr/share/nginx/html
```
I'd like to integrate this into a GitLab CI pipeline, but I'm not sure if I've got the basic idea right. So far I know that I need to create a .gitlab-ci.yml file which will later be picked up by GitLab.
My basic idea is:
I push my code changes to GitLab.
GitLab builds a new Docker image based on my Dockerfile.
GitLab pushes this newly created image to a "production" server (later).
So, my question is:
My .gitlab-ci.yml should then contain something like a build job which triggers... what? The docker build command? Or do I need to "copy" the Dockerfile content into the CI file?
GitLab CI executes the pipeline on runners, which need to be registered with the project using generated tokens (Settings > CI/CD > Runners). You can also use shared runners across multiple projects. The pipeline is configured with the .gitlab-ci.yml file, and from that file you can build, test, push and deploy Docker images whenever something happens in the repo (a push to a branch, a merge request, etc.).
It's also useful when your application already has the Dockerfile that can be used to create and test an image.
So basically you need to install the runner, register it with the token of your project (or use shared runners) and configure your CI YAML file. The recommended approach is Docker-in-Docker, but it is up to you. You can also check this basic example. Finally, you can deploy your container directly to Kubernetes, Heroku or Rancher. Remember to configure your credentials and secrets safely in Settings > CI/CD > Variables.
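A minimal Docker-in-Docker build job for this could look like the sketch below (the tag scheme is an assumption; the CI_REGISTRY_* variables are GitLab's predefined registry variables):

```yaml
build:
  stage: build
  image: docker:24.0
  services:
    - docker:24.0-dind
  script:
    # authenticate against the project's container registry
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    # build the multi-stage Dockerfile and push the resulting image
    - docker build -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA
```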
Conclusion
GitLab CI is awesome, but I recommend that you first think about the Git workflow you want to use, in order to set the stages in the .gitlab-ci.yml file. This will let you configure your Node project as a pipeline, and it would then be easy to port it to other tools such as Jenkins pipelines or Travis.
build job trigger:

Option 1: add `when: manual` to the job, and you can then run it by hand under CI/CD > Pipelines.

Option 2:

```yaml
only:
  - <branchname>
```

In this case the job starts when you push to the defined branch (this is my personal suggestion).

Option 3: add nothing, and the job will run every time you push code.

Of course you can combine the options above. In addition, you can start the job with a web request by using the job token.
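For example, options 1 and 2 can be combined in one job (the branch name is an assumption):

```yaml
deploy:
  stage: deploy
  script:
    - echo "deploying"
  when: manual   # option 1: started by hand from CI/CD > Pipelines
  only:
    - master     # option 2: only offered for pushes to this branch
```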
The docker build command will work in the pipeline, in the script section. The requirement is a Docker engine on the GitLab runner that picks up the job.
Or do I need to "copy" the Dockerfile content to the CI file?
No.