GCP Gitlab CI Cloud Function NodeJs Deployment failed: RESOURCE_ERROR 400 - node.js

When trying to deploy a Cloud Function on GCP using the Serverless Framework (sls), I receive the following exception:
{"ResourceType":"gcp-types/cloudfunctions-v1:projects.locations.functions","ResourceErrorCode":"400","ResourceErrorMessage":"Build failed: Build error details not available."}

The solution was to pin, in .gitlab-ci.yml, the specific version of the image the CI runs in, by adding the key/value image: node:12-alpine at the top of the .gitlab-ci.yml.
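A minimal .gitlab-ci.yml sketch of this fix (the job name and deploy commands are illustrative assumptions, not from the original post):

```yaml
# Pin the Node version of the CI image so it matches what the
# Cloud Functions runtime expects
image: node:12-alpine

stages:
  - deploy

deploy-function:              # hypothetical job name
  stage: deploy
  script:
    - npm ci
    - npx serverless deploy   # assumes a configured serverless.yml
```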

Related

Gitlab CI/CD elastic beanstalk

A few months ago I did a proof of concept for a CI/CD pipeline on GitLab for my .NET Core API deployed on AWS Elastic Beanstalk. At the time I got it to work.
I'm now attempting to actually implement the solution properly, but it's no longer working. The relevant code in my yml file is as follows:
stages:
  - deploy_testing

deploy-testing-job:
  stage: deploy_testing
  image: mcr.microsoft.com/dotnet/sdk:6.0
  before_script:
    - dotnet tool install -g Amazon.ElasticBeanstalk.Tools
    - export PATH="$PATH:/root/.dotnet/tools"
    - apt update
    - apt-get --assume-yes install zip
  script:
    - dotnet restore
    - dotnet eb deploy-environment --profile my_profile --configuration Release --framework net6.0-windows --project-location "MyProject.Api/" --self-contained true --region $AWS_REGION --application $APP_NAME --environment $APP_ENV_NAME --solution-stack "64bit Windows Server Core 2019 v2.11.0 running IIS 10.0"
Now I started getting an error:
Error creating Elastic Beanstalk application: User: arn:aws:iam::1234:user/gitlab is not authorized to perform: elasticbeanstalk:CreateApplication on resource: arn:aws:elasticbeanstalk:my-region:1234:application/my-application
I found this strange since I confirmed that I correctly identified both an existing application and environment.
Just to see what would happen, I temporarily attached Create Application permission to the appropriate gitlab-elastic-beanstalk policy and now I get the error:
Error creating Elastic Beanstalk application: Application my-application already exists.
Why is eb deploy-environment attempting to recreate the whole application / environment?
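For illustration, a hedged sketch of the kind of IAM policy statement involved (the exact action list is an assumption; the deploy tooling typically describes the application first to decide whether it must be created, so read permissions such as elasticbeanstalk:DescribeApplications usually need to accompany the deploy actions):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "GitlabEbDeploy",
      "Effect": "Allow",
      "Action": [
        "elasticbeanstalk:DescribeApplications",
        "elasticbeanstalk:DescribeEnvironments",
        "elasticbeanstalk:CreateApplicationVersion",
        "elasticbeanstalk:UpdateEnvironment"
      ],
      "Resource": "*"
    }
  ]
}
```

If the tool cannot even read the application, it may conclude the application does not exist and attempt to create it, which would explain seeing both errors.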

Can't use GitLab Pipelines with Vercel?

I use GitLab pipelines with Vercel, following the Vercel CLI template:
https://vercel.com/guides/how-can-i-use-gitlab-pipelines-with-vercel
This is my .gitlab-ci.yml (contents not shown). After running the CI/CD pipeline on GitLab, it gives me an error (output not shown).
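For reference, the deploy job from the linked Vercel guide looks roughly like this (the branch name and Node image are assumptions; VERCEL_TOKEN, VERCEL_ORG_ID, and VERCEL_PROJECT_ID must be set as GitLab CI/CD variables):

```yaml
deploy_production:
  image: node:16
  stage: deploy
  only:
    - main                    # adjust to your production branch
  script:
    - npm install --global vercel
    - vercel pull --yes --environment=production --token=$VERCEL_TOKEN
    - vercel build --prod --token=$VERCEL_TOKEN
    - vercel deploy --prebuilt --prod --token=$VERCEL_TOKEN
```

Since the actual file and error output are not shown, comparing your file against this template is a reasonable first step.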

Python Azure Function with CiCd Pipeline : ModuleNotFound Error

Hello,
I got an error while deploying an Azure Python function using a CI/CD pipeline. The error is ModuleNotFoundError: No module named 'azure.servicebus'. But the azure-servicebus module is installed while building the package in the CI/CD pipeline.
When deploying directly using VS Code or the Azure CLI, the function app works fine without any error.
working environment and function app python version - 3.7
azure function app version - 3.x
cicd pipeline agent specification - ubuntu-latest
This error occurs when a function is executed. The function app was deployed using the CI/CD pipeline, and all dependencies are included in requirements.txt (contents not shown). The pipeline build bash script is likewise not shown.
Could anyone please help?
I am able to reproduce your issue, and I can fix it.
I know what you are doing: create a virtual environment, activate it, install the Python packages into it...
And you mentioned that deploying from VS Code or the CLI causes no problem.
Yes, as you observed, everything installs successfully into the virtual environment.
So why does it fail?
The reason is that neither the VS Code deploy nor the CLI deploy has any relationship with your virtual environment; these deployment methods only care about the requirements.txt file and check nothing else.
Also, the operations you performed only affect the current agent machine; the Azure Function app you deploy to is a different situation: it is a new machine, a new environment.
So you just need to design the pipeline like this:
trigger:
- none

pool: VMAS

steps:
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'testbowman_in_AAD'
    ScriptType: 'InlineScript'
    Inline: 'func azure functionapp publish <Your FunctionApp Name> --build remote'
    azurePowerShellVersion: 'LatestVersion'
Working principle of VS Code and CLI
https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python?tabs=asgi%2Capplication-level#remote-build
By default, Azure Functions Core Tools requests a remote build when you use the following func azure functionapp publish command to publish your Python project to Azure. Replace <APP_NAME> with the name of your function app in Azure.

func azure functionapp publish <APP_NAME>

The Azure Functions extension for Visual Studio Code also requests a remote build by default.

Setup CICD using Google cloud run and GITlab

I am very new to CI/CD.
I have to set up a pipeline connecting a GitLab repo to Cloud Run.
I currently host my website on Cloud Run, with the code in GitLab, deployed using manual commands.
I have tried to find many documents and videos, but they are not very clear, or I am not able to understand them. If anyone can provide good documents or guide me, I'd really appreciate it.
Here's my solution for your problem:
You have to configure your Google Cloud project:
Enable the Cloud Run API and Cloud Build API services.
Create a Google service account with the correct permissions (Cloud Build Service Agent, Service Account User, Cloud Run Admin, and Viewer).
Generate a credential file from your service account; it will output a JSON key.
Set up the GitLab CI/CD variables GCP_PROJECT_ID (your project ID) and GCP_SERVICE_ACCOUNT (the content of the JSON generated in the previous step).
Set up your .gitlab-ci.yml like this:
variables:
  SERVICE_NAME: 'your-service-id'

image: google/cloud-sdk:latest

before_script:
  - apt-get --assume-yes install npm
  - npm install
  - npm run build

deploy:
  stage: deploy
  only:
    - master
  script:
    - echo $GCP_SERVICE_ACCOUNT > gcloud-service-key.json
    - gcloud auth activate-service-account --key-file gcloud-service-key.json
    - gcloud auth configure-docker
    - gcloud config set project $GCP_PROJECT_ID
    - gcloud config set run/region europe-west3
    - gcloud run deploy $SERVICE_NAME --source . --allow-unauthenticated
If you have worked with GitLab CI/CD (.gitlab-ci.yml) and Cloud Run before, you will understand the steps easily.
This example assumes you have a Node.js project.

Unable to push Docker Container to Azure Kubernetes Service from Jenkins job build

I am new to Azure and Kubernetes and was trying out the following tutorial: https://learn.microsoft.com/en-us/azure/developer/jenkins/deploy-from-github-to-aks#create-a-jenkins-project. However, at the last part, deploying the Docker image to AKS, I was unable to do so and ran into errors. I am not familiar with the kubectl set image command and have been searching the web for solutions, but to no avail. I would appreciate it if you could share your knowledge if you have experienced this issue before.
The following is the configuration (NOTE: the Docker image is able to push to ACR successfully):
The following is the error from the Jenkins build job:
Most probably you missed, in the article you linked, the steps where they deploy the app before using Jenkins.
Look: first of all, they deploy the azure-vote-front application to AKS:
containers:
- name: azure-vote-front
  image: microsoft/azure-vote-front:v1
Only then will Jenkins see this deployment when running kubectl set image deployment/azure-vote-front azure-vote-front=$WEB_IMAGE_NAME --kubeconfig /var/lib/jenkins/config.
So please create the deployment first, as #mmking and common sense suggest.
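For context, a minimal sketch of the Deployment manifest the tutorial applies before the Jenkins job runs (trimmed to the fields relevant here; names follow the azure-vote sample):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: azure-vote-front
spec:
  replicas: 1
  selector:
    matchLabels:
      app: azure-vote-front
  template:
    metadata:
      labels:
        app: azure-vote-front
    spec:
      containers:
      - name: azure-vote-front
        image: microsoft/azure-vote-front:v1
        ports:
        - containerPort: 80
```

Once this Deployment exists in the cluster, kubectl set image can swap in the image built by Jenkins.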
