I'm using GitLab pipelines with Vercel. I followed the Vercel CLI template from:
https://vercel.com/guides/how-can-i-use-gitlab-pipelines-with-vercel
This is my .gitlab-ci.yml:
When GitLab runs the CI/CD pipeline, it fails with an error.
We have a Blazor client-side application that we want to deploy as an Azure Static Web App.
Unfortunately, Static Web Apps officially supports only GitHub Actions and Azure DevOps as deployment options, but we are currently using Bitbucket Pipelines, and switching to GitHub or Azure DevOps is not an option at the moment.
That is why I'm trying to deploy my Blazor app to Azure using the Bitbucket pipe microsoft/azure-static-web-apps-deploy, but for some reason it gets stuck on the "Publishing to directory /bin/staticsites/ss-oryx/app..." step, and then the pipeline fails with a timeout error.
Although "dotnet publish -c Release" completes in seconds locally, I tried extending the timeout to one hour with the BUILD_TIMEOUT_IN_MINUTES parameter, but after one hour it timed out again.
Here is my bitbucket-pipelines.yml:
pipelines:
  branches:
    master:
      - step:
          name: Deploy to production
          deployment: production
          script:
            - pipe: microsoft/azure-static-web-apps-deploy:main
              variables:
                APP_LOCATION: '$BITBUCKET_CLONE_DIR/App'
                API_TOKEN: $deployment_token
                OUTPUT_LOCATION: '$BITBUCKET_CLONE_DIR/wwwroot'
I would be thankful for any hints or suggestions on how to solve this.
I have a dev repository with build & deploy stages, and I want to create a production repository with a deploy stage only; I don't want to build there. How do I configure this production repository to deploy microservices without building them (no build stage)? The deploy stage should take images from the dev repo and deploy them to the production k8s cluster. How can I pass images from the dev repo to the prod repo?
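One common pattern for this (a sketch only; the registry URL, deployment name, and variable names are assumptions, not from the original setup) is to have the dev pipeline push images to a shared container registry, and give the prod repository a deploy-only pipeline that references those images by tag:

```yaml
# .gitlab-ci.yml in the prod repo — deploy-only, no build stage (hypothetical names)
stages:
  - deploy

deploy:
  stage: deploy
  image: bitnami/kubectl:latest
  script:
    # IMAGE_TAG would be supplied by the dev pipeline, e.g. via a pipeline trigger
    - kubectl set image deployment/my-service app=registry.example.com/dev-repo/my-service:$IMAGE_TAG
```

The dev pipeline can pass IMAGE_TAG along when it triggers this pipeline, so the prod repo never rebuilds the images, it only rolls out tags that already exist in the registry.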
I am trying to build a CI/CD pipeline in GitLab for our Bitbucket repository. There are two projects in a single Bitbucket repository: a frontend (Angular) and a backend (.NET Core). How do I write the gitlab-ci.yaml to build, test, and deploy both projects in one CI/CD pipeline? There is no option to write a separate .gitlab-ci.yaml for each project. The repository hierarchy is given below:
there are two folders: web (frontend project, Angular) and backend (backend project, .NET Core)
You could work with parent-child pipelines: in the parent pipeline you build the frontend part, then trigger a child pipeline for the backend part.
stages:
  - frontend
  - backend

frontend:
  stage: frontend
  script:
    - echo "build frontend"

backend:
  stage: backend
  trigger:
    include:
      - project: 'tutorsinc'
        file: '.build-backend.yml'
    strategy: depend
When trying to deploy a cloud function on GCP using the Serverless Framework (sls), I receive the following exception:
{"ResourceType":"gcp-types/cloudfunctions-v1:projects.locations.functions","ResourceErrorCode":"400","ResourceErrorMessage":"Build failed: Build error details not available."}
The solution was to pin the exact version of the image the CI runs in by specifying image: node:12-alpine at the top of .gitlab-ci.yml.
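As a sketch of that fix (the deploy job and its command are assumptions, not from the original post; only the image pin is from the answer), the top of .gitlab-ci.yml would look like:

```yaml
# Pin the runner image so the build uses a known Node version
image: node:12-alpine

deploy:
  script:
    # Hypothetical deploy step for a Serverless Framework project
    - npx serverless deploy
```

Pinning the tag (node:12-alpine rather than node:latest) keeps the CI environment from drifting to a Node version the function's build does not support.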
I have a GitLab CI pipeline which builds a few artifacts. For example:
train:job:
  stage: train
  script: python script.py
  artifacts:
    paths:
      - artifact.csv
    expire_in: 1 week
Now I deploy the repository to OpenShift using the following step in my GitLab pipeline. This pulls my GitLab repo into OpenShift, but it does not include the artifacts from the 'testing' stage.
deploy:app:
  stage: deploy
  image: ayufan/openshift-cli
  before_script:
    - oc login $OPENSHIFT_DOMAIN --token=$OPENSHIFT_TOKEN
  script:
    - oc start-build my_app
How can I let OpenShift use this repository, plus the artifacts created in my pipeline?
In general, OpenShift build pipelines rely on the s2i (source-to-image) build process to build applications.
The best practice for reusing artifacts between s2i builds is either to use incremental builds or to chain multiple BuildConfig definitions together (the output image of one BuildConfig being fed as the source image of another) via the spec.source.images or spec.source.git configuration in the BuildConfig definition.
In your case, since you are using a GitLab pipeline to generate your artifacts instead of the OpenShift build process, you really only need to combine your artifacts with your source code and the runtime container image.
To do this you might create a builder container image that pulls those artifacts down from an external source during the assemble phase (via curl, wget, etc) of the s2i workflow. You could then configure your BuildConfig to point at your source repository. At build time the BuildConfig will pull down your source code and the assemble script will pull down your artifacts.
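As a rough sketch of that setup (all names, the repo URL, and the builder image are hypothetical assumptions; the artifact-fetching itself would live in the builder image's assemble script), the BuildConfig pointing at the source repository might look like:

```yaml
# Hypothetical BuildConfig: s2i build from the Git repo using a custom builder
# image whose assemble script downloads the CI artifacts (via curl/wget).
apiVersion: build.openshift.io/v1
kind: BuildConfig
metadata:
  name: my-app
spec:
  source:
    type: Git
    git:
      uri: https://gitlab.example.com/group/my-app.git
  strategy:
    type: Source
    sourceStrategy:
      from:
        kind: ImageStreamTag
        name: my-builder:latest
  output:
    to:
      kind: ImageStreamTag
      name: my-app:latest
```

With this in place, oc start-build my-app clones the source, and the builder's assemble script is where the pipeline artifacts would be fetched before the image is assembled.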