I'm starting to use the GitLab CI/CD pipeline, but I have some doubts regarding the output of the build process. Suppose I have a single project (repo) in which the frontend and backend are separated by the project structure, e.g.:
CarProject
|__ .gitlab-ci.yml
|__ FrontEndCarProject
|__ BackendCarProject
Let's say that every time I change something in the frontend I need to build it and deploy it to S3, but there is no need to build the backend (a Java application) and deploy it to Elastic Beanstalk (and vice versa for when I change the backend). Is there a way to check where the changes have been made (FrontEndCarProject/BackendCarProject) using GitLab, and to point the .gitlab-ci.yml at a script file depending on whether I have to deploy to S3 or Elastic Beanstalk?
Note: another way is just to manually change the YAML file depending on where I want to deploy, but is there a way to detect this automatically and automate it?
.gitlab-ci.yml
Just to get the idea, here's an example that would run linearly. But how can I conditionally build/deploy depending on whether my frontend or backend changed? Should I keep them in different repos for simplicity? Is that good practice?
variables:
  ARTIFACT_NAME: cars-api-v$CI_PIPELINE_IID.jar
  APP_NAME: cars-api

stages:
  - build
  - deploy

# ONLY build when front (FrontEndCarProject) is changed
build_front:
  stage: build
  image: node:latest
  script:
    - npm install
  artifacts:
    paths:
      - ./dist

# ONLY build when backend (BackendCarProject) is changed
build_back:
  stage: build
  image: openjdk:12-alpine
  script:
    - ./gradlew build
  artifacts:
    paths:
      - ./build/libs/

# ONLY deploy when front (FrontEndCarProject) is changed
deploy_s3:
  stage: deploy
  image:
    name: python:latest
  script:
    - aws configure set region us-east-1
    - aws s3 cp ./dist s3://$S3_BUCKET/ --recursive

# ONLY deploy when backend (BackendCarProject) is changed
deploy_back_end:
  stage: deploy
  image:
    name: banst/awscli
  script:
    - aws configure set region us-east-1
    - aws s3 cp ./build/libs/$ARTIFACT_NAME s3://$S3_BUCKET/$ARTIFACT_NAME
    - aws elasticbeanstalk create-application-version --application-name $APP_NAME --version-label $CI_PIPELINE_IID --source-bundle S3Bucket=$S3_BUCKET,S3Key=$ARTIFACT_NAME
    - aws elasticbeanstalk update-environment --application-name $APP_NAME --environment-name "production" --version-label=$CI_PIPELINE_IID
If your frontend and backend can be built and deployed separately, then you can use rules:changes on both the build and deploy jobs so they only run when the respective directory changed, plus needs with optional: true so each deploy job consumes its build job's artifacts (and the pipeline still validates when that build job was not created).
variables:
  ARTIFACT_NAME: cars-api-v$CI_PIPELINE_IID.jar
  APP_NAME: cars-api

stages:
  - build
  - deploy

# ONLY build when front (FrontEndCarProject) is changed
build_front:
  stage: build
  image: node:latest
  script:
    - npm install
  rules:
    - changes:
        - FrontEndCarProject/**/*
  artifacts:
    paths:
      - ./dist

# ONLY build when backend (BackendCarProject) is changed
build_back:
  stage: build
  image: openjdk:12-alpine
  script:
    - ./gradlew build
  rules:
    - changes:
        - BackendCarProject/**/*
  artifacts:
    paths:
      - ./build/libs/

# ONLY deploy when front (FrontEndCarProject) is changed
deploy_s3:
  stage: deploy
  image:
    name: python:latest
  script:
    - aws configure set region us-east-1
    - aws s3 cp ./dist s3://$S3_BUCKET/ --recursive
  rules:
    - changes:
        - FrontEndCarProject/**/*
  needs:
    - job: build_front
      artifacts: true
      optional: true

# ONLY deploy when backend (BackendCarProject) is changed
deploy_back_end:
  stage: deploy
  image:
    name: banst/awscli
  script:
    - aws configure set region us-east-1
    - aws s3 cp ./build/libs/$ARTIFACT_NAME s3://$S3_BUCKET/$ARTIFACT_NAME
    - aws elasticbeanstalk create-application-version --application-name $APP_NAME --version-label $CI_PIPELINE_IID --source-bundle S3Bucket=$S3_BUCKET,S3Key=$ARTIFACT_NAME
    - aws elasticbeanstalk update-environment --application-name $APP_NAME --environment-name "production" --version-label=$CI_PIPELINE_IID
  rules:
    - changes:
        - BackendCarProject/**/*
  needs:
    - job: build_back
      artifacts: true
      optional: true
I'm using a GitHub Action to build and deploy a Vue Azure Static Web App. When using the default template, my staticwebapp.config.json file, which is at the root of the Vue app, gets applied correctly, and I see "Copying 'staticwebapp.config.json' to build output" logged.
When using a customized GitHub workflow (shown below) to separate the build and deploy steps, which has skip_app_build set to true, the artifact that gets uploaded/downloaded does not contain the staticwebapp.config.json file.
How can I modify the GitHub action to make sure the staticwebapp.config.json file gets copied to the output directory so that it gets deployed?
jobs:
  build:
    if: github.event_name == 'push'
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repo
        uses: actions/checkout@v2
      - name: Setup Node.js
        uses: actions/setup-node@v3
      - name: npm install and run build
        run: npm install && npm run build
      - name: Upload artifact
        uses: actions/upload-artifact@v3.1.0
        with:
          name: app
          path: dist/
  deploy:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Download artifact
        uses: actions/download-artifact@v3.0.0
        with:
          name: app
      - name: Deploy to Azure
        id: deploy
        uses: Azure/static-web-apps-deploy@v1
        with:
          azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN_BLUE_STONE_0BAB0F910 }}
          repo_token: ${{ secrets.GITHUB_TOKEN }} # Used for GitHub integrations (i.e. PR comments)
          action: "upload"
          ###### Repository/Build Configurations ######
          app_location: "" # App source code path relative to repository root
          api_location: "" # Api source code path relative to repository root - optional
          skip_app_build: true
          ###### End of Repository/Build Configurations ######
I was able to solve this by moving the staticwebapp.config.json file to the public directory of the Vue app, which put the file in the published artifact.
After doing that, I saw "Using staticwebapp.config.json file for configuration information" in the logs during the static-web-apps-deploy step.
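For reference, this works because the Vue CLI copies everything in public/ verbatim into the build output, so the file ships with the artifact. A minimal sketch of the change, assuming a default Vue CLI layout and npm scripts:

mv staticwebapp.config.json public/
npm run build
ls dist/staticwebapp.config.json   # now present in the uploaded artifact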
I have 3 branches, dev, qa, and prod, and 3 corresponding environments, development, testing, and production. When code is merged into a specific branch, I want to build only that branch and deploy to the corresponding environment:
dev -> development
qa -> testing
prod -> production
dev branch is the default branch for the repository.
Using the only attribute, I was able to deploy to a specific environment based on which branch the code is merged into.
But in the build stage, I am not able to figure out how to tell GitLab to pull the specific branch where the code was checked in.
When the CI/CD pipeline pulls the code onto the runner, is it always going to pull from the default branch, or from the branch where the code was checked in?
Here is my current YAML
default:
  image: node:14
  tags:
    - my-runner

stages:
  - build
  - deploy

build-job:
  stage: build
  script:
    - npm install
    - npm run build:prod
    - echo "Compile complete."
  artifacts:
    paths:
      - deploy/build.zip

deploy-dev:
  image: docker.xxxx/awscli
  stage: deploy
  environment:
    name: development
  script:
    - aws s3 cp deploy/build.zip s3://dev-bucket
    - aws lambda update-function-code --function-name dev-lambda --s3-bucket dev-bucket --s3-key build.zip --region us-west-2
  only:
    - dev

deploy-testing:
  image: docker.xxxx/awscli
  stage: deploy
  environment:
    name: testing
  script:
    - aws s3 cp deploy/build.zip s3://qa-bucket
    - aws lambda update-function-code --function-name qa-lambda --s3-bucket qa-bucket --s3-key build.zip --region us-west-2
  only:
    - qa

deploy-production:
  image: docker.xxxx/awscli
  stage: deploy
  environment:
    name: production
  script:
    - aws s3 cp deploy/build.zip s3://production-bucket
    - aws lambda update-function-code --function-name production-lambda --s3-bucket production-bucket --s3-key build.zip --region us-west-2
  only:
    - prod
Generally speaking, you don't. Pipelines inherently belong to a specific git branch/ref and the GitLab runner clones the appropriate ref automatically before the build begins.
For example, if you push a single commit to the master branch, this triggers a pipeline belonging to the HEAD (the new commit) of the master branch for that push event. When jobs in this pipeline run, the GitLab runner will check out this specific ref automatically for each job. You can see this clearly in the pipeline UI: each pipeline is associated with a specific branch/ref.
In other words, it is this association between the ref and pipeline that determines how the runner checks out the relevant code from the repository. You do not need to specify it in your .gitlab-ci.yml.
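If you want to verify this, a minimal sketch is a job that prints GitLab's predefined variables for the ref the runner checked out:

verify-ref:
  stage: build
  script:
    - echo "Checked out $CI_COMMIT_REF_NAME at $CI_COMMIT_SHA"

Push to dev, qa, or prod and the output will name the branch that triggered the pipeline.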
Hello, I have a TypeScript server with a build script that looks like this:

"build": "rm -rf build && tsc && cp package*.json build && cp Dockerfile build && npm ci --prefix build --production"
This creates a new build directory and copies the Dockerfile into it, so the deployed application should be run from the build directory.
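So the build directory ends up looking roughly like this (a sketch, assuming tsc's outDir points at build):

build
|__ Dockerfile
|__ package.json
|__ package-lock.json
|__ node_modules (production deps from npm ci --production)
|__ compiled .js output from tsc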
I want to automate deployment to Cloud Run using GitHub workflows, so I created a .yaml file, but in the run portion I am confused about how I can build and push the Docker image from my build directory.
- name: Enable the necessary APIs and enable docker auth
  run: |-
    gcloud services enable containerregistry.googleapis.com
    gcloud services enable run.googleapis.com
    gcloud --quiet auth configure-docker

- name: Build and tag image
  run: |-
    docker build . --tag "gcr.io/$CLOUD_RUN_PROJECT_ID/$REPO_NAME:$GITHUB_SHA"

- name: Push image to GCR
  run: |-
    docker push gcr.io/$CLOUD_RUN_PROJECT_ID/$REPO_NAME:$GITHUB_SHA
My question is how can I insure to run the docker commands from the build directory ?
On the docker build command, replace the . with build/.
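In other words, the build step becomes (same tag as in your workflow):

docker build build/ --tag "gcr.io/$CLOUD_RUN_PROJECT_ID/$REPO_NAME:$GITHUB_SHA"

Since your build script copies the Dockerfile into build, Docker finds it at the root of that context.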
Here's a full reference of an example workflow, including the step to deploy the image to Cloud Run.
on:
  push:
    branches:
      - example-build-deploy

name: Build and Deploy a Container

env:
  PROJECT_ID: ${{ secrets.GCP_PROJECT }}
  SERVICE: hello-cloud-run
  REGION: us-central1

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2

      - name: Setup Cloud SDK
        uses: google-github-actions/setup-gcloud@v0
        with:
          project_id: ${{ env.PROJECT_ID }}
          service_account_key: ${{ secrets.GCP_SA_KEY }}
          export_default_credentials: true # Set to true to authenticate the Cloud Run action

      - name: Authorize Docker push
        run: gcloud auth configure-docker

      - name: Build and Push Container
        run: |-
          docker build -t gcr.io/$CLOUD_RUN_PROJECT_ID/$REPO_NAME:$GITHUB_SHA build/
          docker push gcr.io/$CLOUD_RUN_PROJECT_ID/$REPO_NAME:$GITHUB_SHA

      - name: Deploy to Cloud Run
        id: deploy
        uses: google-github-actions/deploy-cloudrun@v0
        with:
          service: ${{ env.SERVICE }}
          image: gcr.io/$CLOUD_RUN_PROJECT_ID/$REPO_NAME:$GITHUB_SHA
          region: ${{ env.REGION }}

      - name: Show Output
        run: echo ${{ steps.deploy.outputs.url }}
You may also check the full GitHub repository sample here.
I'm trying to find out if there is a way to exclude certain files from being sent over GitHub Actions. For example, I have a server and a client in the same repository. Right now, both the server (Node.js) and the client (a React.js application) are hosted together on Azure App Services; once / is hit, it serves up the index.html file from the build folder.
However, I am finding that hosting these two things together is taking its toll on the overall application; for example, it sometimes takes up to 10 seconds for the server to respond and return the index file to the client. I remember in my training some of my more senior devs didn't like to host the server and client together, and I'm starting to see why.
So I will likely need to split these up to improve performance, but before I go through the daunting task of splitting the repositories up: is there a way to specify in a GitHub Actions workflow to ignore certain files/folders, etc.?
The only modification I've made is an added action that zips the application for faster upload to Azure.
Here is my workflow:
# Docs for the Azure Web Apps Deploy action: https://github.com/Azure/webapps-deploy
# More GitHub Actions for Azure: https://github.com/Azure/actions

name: Build and deploy Node.js app to Azure Web App

on:
  push:
    branches:
      - main
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      - name: Set up Node.js version
        uses: actions/setup-node@v1
        with:
          node-version: '14.x'

      - name: npm install, build, and test
        run: |
          npm install
          npm run build --if-present
          npm run test --if-present

      - name: Zip artifact for deployment
        run: zip release.zip ./* -r

      - name: Upload artifact for deployment job
        uses: actions/upload-artifact@v2
        with:
          name: node-app
          path: release.zip

  deploy:
    runs-on: ubuntu-latest
    needs: build
    environment:
      name: 'Production'
      url: ${{ steps.deploy-to-webapp.outputs.webapp-url }}
    steps:
      - name: Download artifact from build job
        uses: actions/download-artifact@v2
        with:
          name: node-app

      - name: unzip artifact for deployment
        run: unzip release.zip

      - name: 'Deploy to Azure Web App'
        id: deploy-to-webapp
        uses: azure/webapps-deploy@v2
        with:
          app-name: 'Omitted'
          slot-name: 'Production'
          publish-profile: ${{SECRET}}
          package: .
You could create a shell script that excludes the files you don't want.
In .github, create a new folder named scripts. Inside the scripts folder, create a new file named exclude.sh.
In exclude.sh, add the following:

zip -r [file_name.zip] [files/folder to zip] -x [file path/name to exclude]

In your workflow:

- name: unzip artifact for deployment
  run: unzip file_name.zip
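For example, to leave the client folder out of the server artifact (a sketch; client/ is a hypothetical path, substitute whatever you want excluded), exclude.sh could contain:

# zip the repository contents, skipping the client app
zip -r release.zip ./* -x "client/*"

and the zip step in the build job would then call the script instead of zipping everything:

- name: Zip artifact for deployment
  run: bash .github/scripts/exclude.sh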
I have an issue with my Bitbucket CI/CD pipeline. The pipeline itself runs fine, but the application it deploys (a React App Engine Node.js application) is broken when I access the site. This is the error I receive in Google Logging: "Static file referenced by handler not found: build/index.html".
If I deploy the application manually, I have no issues and the application works fine. This application error only occurs if the deployment happens in the bitbucket pipeline.
Here is the app.yaml
runtime: nodejs12

handlers:
  # Serve all static files with url ending with a file extension
  - url: /(.*\..+)$
    static_files: build/\1
    upload: build/(.*\..+)$

  # Catch all handler to index.html
  - url: /.*
    static_files: build/index.html
    upload: build/index.html
Here is the bitbucket-pipelines.yml
pipelines:
  branches:
    master:
      - step:
          name: NPM Install and Build
          image: node:14.15.1
          script:
            - npm install
            - unset CI
            - npm run build
      - step:
          name: Deploy to App Engine
          image: google/cloud-sdk
          script:
            - gcloud config set project $GCLOUD_PROJECT
            - 'echo "$GOOGLE_APPLICATION_CREDENTIALS" > google_application_credentials.json'
            - gcloud auth activate-service-account --key-file google_application_credentials.json
            - gcloud app deploy app.yaml
Any help would be greatly appreciated. Thank you so much.
Bitbucket Pipelines does not carry files between steps by default; each step starts from a fresh clone. You need to declare an artifacts config in the build step so the build output can be referenced in the deploy step. Something like this:
pipelines:
  branches:
    master:
      - step:
          name: NPM Install and Build
          image: node:14.15.1
          script:
            - npm install
            - unset CI
            - npm run build
          artifacts: # Declare artifacts here for later steps
            - build/**
      - step:
          name: Deploy to App Engine
          image: google/cloud-sdk
          script:
            - gcloud config set project $GCLOUD_PROJECT
            - 'echo "$GOOGLE_APPLICATION_CREDENTIALS" > google_application_credentials.json'
            - gcloud auth activate-service-account --key-file google_application_credentials.json
            - gcloud app deploy app.yaml
See here for more details: https://support.atlassian.com/bitbucket-cloud/docs/use-artifacts-in-steps/
Note that I have not tested this.
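If the deploy still can't find build/index.html after this change, a quick sanity check (again untested) is a step between build and deploy that lists what actually carried over:

- step:
    name: Inspect artifacts
    script:
      - ls -la build/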