Is there any equivalent for GitLab CI/CD's environment file in GCP Cloud Build?

I'm using GitLab's CI/CD for building and deploying my application. As my deployment environment is GCP, I'm trying to use GCP Cloud Build to build and deploy my application to the cloud. In GitLab, I was using its environment file feature to store certain critical files. Is there any alternative for that in GCP's Cloud Build?
I'm storing some configuration files and some other files as environment files. How can I completely replace GitLab CI/CD with GCP Cloud Build?

You can in fact use Cloud Build for CI/CD and GitHub for source code management, much as you currently do with GitLab; that way you can build and deploy your apps on GCP and still manage your source code.
The guide Similarities and differences between GitLab CI and Cloud Build explains how to build and deploy your code with Cloud Build. In a few words, the steps are:
Connect Cloud Build to your source repository
Authenticate and select the repository that contains your source code
Add a trigger to automate the builds using a Dockerfile or Cloud Build config file in YAML format
Also, a brief explanation of some important fields (see the sketch after this list):
The steps field in the config file specifies a set of steps that you want Cloud Build to perform.
The id field sets a unique identifier for a build step.
The name field of a build step specifies a cloud builder, which is a container image running common tools. You use a builder in a build step to execute your tasks.
The args field of a build step takes a list of arguments and passes them to the builder referenced by the name field.
The entrypoint field of a build step lets you override the builder's default entrypoint.
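As a minimal sketch, assuming a hypothetical Python app and image name, a cloudbuild.yaml using these fields could look like this:

```yaml
# cloudbuild.yaml -- minimal sketch; app and image names are hypothetical
steps:
  - id: 'run-tests'
    name: 'python'             # a builder image with common tools
    entrypoint: 'python'       # override the image's default entrypoint
    args: ['-m', 'pytest', 'tests/']
  - id: 'build-image'
    name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app', '.']
```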
To store images and other build outputs, you can refer to Storing images in Container Registry:
You can configure Cloud Build to store your built image in one of the following ways:
using the images field, which stores the image in the Container Registry after your build completes.
using the docker push command, which stores the image in the Container Registry as part of your build flow.
If your build produces artifacts like binaries or tarballs, as mentioned in Storing build artifacts, you can store them in Cloud Storage.
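Both options, plus Cloud Storage artifacts, are declared at the top level of the config; here is a sketch with a hypothetical bucket and tarball:

```yaml
# Sketch: push the built image and store other build outputs
images:
  - 'gcr.io/$PROJECT_ID/my-app'            # pushed to Container Registry after the build
artifacts:
  objects:
    location: 'gs://my-artifacts-bucket/'  # hypothetical Cloud Storage bucket
    paths: ['dist/my-app.tar.gz']          # hypothetical tarball produced by a step
```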
In case you want to use variables, I would refer to the site mentioned in a previous comment.
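For what it's worth, Cloud Build's user-defined substitutions cover simple variables; a hedged sketch (the key and value below are made up):

```yaml
# Sketch: user-defined substitutions must start with an underscore
substitutions:
  _DEPLOY_ENV: 'staging'
steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    args: ['app', 'deploy', '--version', '${_DEPLOY_ENV}']
```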

Related

How to get Docker images from Azure Container Registry and run them on on-premise servers using docker-compose?

I am using Azure DevOps for CI/CD pipelines, and one thing I couldn't figure out is how to get images from Azure Container Registry and run those images on my on-premise Debian server. (I have already built and pushed the images to ACR.)
In the release pipeline section:
I set Azure Container Registry as the source type and selected the latest artifact from there
then created a dev stage
Added a Command Line tool to execute some scripts on the server
--> This is the step where I'm lost <-- I need to somehow pull these images from the registry and run a compose up command on the server.
Extra info: you cannot use your docker-compose.yml (mine is inside the Azure repo in the same project) because you are using ACR as the source type, not the build itself or the source code.
*In my humble opinion this is a good case to document, because there is a lack of docs and videos on the internet.
thanks,

Use Terraform for Google Cloud Run

I'm looking into using Terraform to automate setting up an environment for demos.
It works for a VM instance and can be fully automated, but management prefers to use Cloud Run with Docker containers.
When I read this article, it starts with manually building and registering a Docker container. I don't get that step: why can't that be automated with Terraform as well?
Terraform is a deployment tool. More or less, it invokes APIs to create, update, or delete things. So what do you want to do here? Take a container and deploy it on Cloud Run. Building sources, uploading files, and performing a git clone aren't actions Terraform is designed for.
It's not surprising to have a CI pipeline that builds things and, at the end, calls a CD tool for the deployment.
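As a sketch of that split (the image name is hypothetical), Cloud Build can own the build-and-push half, and Terraform's google_cloud_run_service resource then only has to reference the finished image:

```yaml
# cloudbuild.yaml sketch: CI builds and pushes the image; Terraform deploys it later
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/demo-app:latest', '.']
images:
  - 'gcr.io/$PROJECT_ID/demo-app:latest'   # Terraform points Cloud Run at this image
```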

How to maintain many Azure resources and deployments in one git repo?

I have a project that consists of an Azure webapp, a PostgreSQL on Azure, and multiple Azure functions for background ETL workflows. I also have a local Python package that I need to access from both the webapp and the Azure functions.
How can I structure configuration and script deployment for those resources from a single git repo?
Any suggestions or pointers to good examples or tutorials would be very appreciated.
All the Azure tutorials that I've seen are only for small and simple projects.
For now, I've hand-written an admin.py script that does e.g. the webapp and function deployments by creating a Python package, creating ZIP files for each resource, and doing ZIP deployments. This is getting messy, and now I want QA and PROD versions, and I need to pass secrets so that the DB is reachable, so it's getting more complex. Is there either a nice way to structure this packaging and deployment, or a tool to help with it? For me, putting everything in Kubernetes is not the solution; for one thing, the DB already exists. Also, Azure DevOps is not an option, as we are using GitLab CI, so eventually I want a solution that can run on CI/CD there.
Not sure if this will help completely, but here we go.
Instead of using a hand-written admin.py script, try using a YAML pipeline flow. For GitLab, there is https://docs.gitlab.com/ee/ci/yaml/ to get you started. From what you've indicated, I would recommend having several job steps in your YAML pipeline that build and package your web and function apps. For deployment, you can make use of environments. Have a look at https://docs.gitlab.com/ee/ci/multi_project_pipelines.html as well, which illustrates how you can create downstream pipelines.
From a deployment standpoint, the current integration I've found between Azure and GitLab leaves me with two recommendations:
Leverage the script keyword in the YAML to keep zipping your artifacts, and use the Azure CLI (I would assume you can install the tools during the pipeline) to zip deploy; see the sketch after this list.
Keep your code inside the GitLab repo and utilize Azure Pipelines to handle the CI/CD for you.
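A minimal .gitlab-ci.yml sketch of the first option, assuming a hypothetical resource group, app name, and a service principal stored in CI variables:

```yaml
# .gitlab-ci.yml sketch: zip the app, then zip-deploy it with the Azure CLI
stages: [build, deploy]

build_webapp:
  stage: build
  image: python:3.9
  script:
    - apt-get update -qq && apt-get install -y -qq zip
    - zip -r webapp.zip webapp/            # hypothetical source folder
  artifacts:
    paths: [webapp.zip]

deploy_webapp:
  stage: deploy
  image: mcr.microsoft.com/azure-cli       # ships with the az tool preinstalled
  environment: qa                          # GitLab environments give you QA/PROD variants
  script:
    - az login --service-principal -u "$AZ_SP_ID" -p "$AZ_SP_SECRET" --tenant "$AZ_TENANT"
    - az webapp deployment source config-zip -g my-rg -n my-webapp --src webapp.zip
```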
I hope you find this helpful.

How to deploy a Python Flask application on a Linux web app service through the Azure portal?

I am trying to deploy my Flask application on a Linux web app.
I want to set up an Azure pipeline for my code, which is pushed to an Azure repository.
I have made all the configuration changes in my Python code and created a web app with a runtime stack of Python 3.7.
As soon as I go to the Deployment Center to deploy my code, after selecting the Azure repository as the source of my code, I am redirected to an Azure Pipelines option where I have to configure my build settings.
But the build does not give any option for Python; it just gives me four build options: Node, Ruby, ASP.NET, and PHP.
I cannot use:
- Docker
- Git
With such limitations I have found no suitable tutorial to do the same.
Can someone tell me a way to set up the pipeline for my Python project?
Azure DevOps CI/CD works with any language, platform, and cloud. For a Python application, you may need to add some additional steps to achieve the deployment from Azure DevOps CI/CD.
CI
Since Python is an interpreted language, no compilation is needed. If there are no other steps, like tests, you just need two tasks in the CI pipeline: the Archive Files task and the Publish Build Artifacts task.
The Archive Files task is used to pack the Python application source folder into a zip package for use in CD, and the Publish Build Artifacts task publishes this zip package to the release pipeline.
But if your project contains and needs tests, add another Command Line task to run them with pytest.
Note that in Azure DevOps you need to configure the Python environment with some tasks if you want to use Python components like pytest.
For details, please refer to this blog.
Note: since the stack you are using is Python 3.7, specify the Python version as 3.x in the Use Python Version task.
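Translated into a hedged azure-pipelines.yml (the test path is hypothetical; the classic editor uses the same task names):

```yaml
# azure-pipelines.yml sketch: package a Python app without compilation
trigger: [master]

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.x'                   # matches the Python 3.7 runtime stack
  - script: |
      pip install pytest
      pytest tests/                        # optional test step, hypothetical path
    displayName: 'Run tests'
  - task: ArchiveFiles@2
    inputs:
      rootFolderOrFile: '$(Build.SourcesDirectory)'
      includeRootFolder: false
      archiveType: 'zip'
      archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: '$(Build.ArtifactStagingDirectory)'
      artifactName: 'drop'
```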
CD
Since you have already created the app service in the Azure portal, just skip step 4 (Add Azure CLI task) in Exercise 3: Configure Release pipeline of this blog, because step 4 is only used to create new Azure resources.
1. To deploy the Python application, first add the Azure App Service Manage task to install the corresponding Python version site extension in the release pipeline.
It installs the set of tools needed to manage your app service.
2. Next, use the Azure App Service Deploy task to deploy the zip package created in the build pipeline to the app service you configured in the Azure portal.
After you specify the subscription in this task, the app service will automatically appear in the App Service name drop-down list.
Then specify the path you configured in the publish task of the build pipeline: replace $(Build.ArtifactStagingDirectory) with $(System.DefaultWorkingDirectory), and replace $(Build.BuildId) with * so the zip package is located by fuzzy search.
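In YAML form, that deploy step might look like the following sketch (the service connection and app name are hypothetical):

```yaml
# Release sketch: deploy the zip produced by the build pipeline
steps:
  - task: AzureRmWebAppDeployment@4
    inputs:
      ConnectionType: 'AzureRM'
      azureSubscription: 'my-subscription'  # hypothetical service connection name
      appType: 'webAppLinux'
      WebAppName: 'my-flask-app'            # hypothetical app service name
      packageForLinux: '$(System.DefaultWorkingDirectory)/**/*.zip'  # fuzzy search
```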

Upload custom exe to Azure DevOps pipeline

I'd like to execute a command line app that I created as part of my CI builds.
The command line app is not related to the repository on which the build is running. (Basically, it's a tool that is supposed to send some project-related metadata to a company web application to gather some metrics.)
It seems I can execute it using a Command Line task, as explained here:
How to execute exe in VSTS Release pipeline empty process
The question is, however - how do I upload my custom tool into Azure Devops?
I know it's possible to create custom tasks (https://learn.microsoft.com/en-us/azure/devops/extend/develop/add-build-task?view=azure-devops), but that seems like quite a lot of effort, especially given that I have a console tool that already does what I need.
I do this by including a separate deployment folder in the git repo with my source code. This folder contains all the extra stuff (exes) that I need to call to deploy.
You could, if you wanted, keep these files in a different repo and/or a different artifact, since a pipeline can consume any number of artifacts.
Then I include the deployment folder when publishing artifacts. In your build step you pull down the artifacts, and they include your exe.
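A rough YAML sketch of that publish step (the folder name is hypothetical):

```yaml
# Sketch: ship the deployment folder (with its exes) as its own artifact
steps:
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: '$(Build.SourcesDirectory)/deployment'  # folder holding the exes
      artifactName: 'deployment-tools'
```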
You're meant to be able to use things like NPM to install helper libraries on the fly, but none of my required libraries were ever supported.
You can also use a self-hosted agent, which is your own host (often an Azure VM). You install everything you need on it, then install the DevOps self-hosted agent, which lets build pipelines use it.
Create a build of your exe
Upload your exe to blob storage.
Create a SAS token to access your blob.
In your build, create a task with a PowerShell script. In the PS script, download your exe (and unzip it), copy it to Build.StagingDirectory/"yourToolFolder", and then run it. You probably want to pass it arguments like the location of the repo on the build agent. A sketch of this follows.
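A hedged pipeline snippet for those steps, assuming the SAS URL is kept in a hypothetical secret variable ToolSasUrl:

```yaml
# Sketch: fetch the exe from blob storage via a SAS URL, then run it
steps:
  - powershell: |
      $dest = "$(Build.StagingDirectory)\yourToolFolder"
      New-Item -ItemType Directory -Force -Path $dest | Out-Null
      Invoke-WebRequest -Uri "$(ToolSasUrl)" -OutFile "$dest\mytool.zip"
      Expand-Archive -Path "$dest\mytool.zip" -DestinationPath $dest
      & "$dest\mytool.exe" "$(Build.SourcesDirectory)"   # pass the repo location as an argument
    displayName: 'Download and run custom tool'
```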
A way to achieve this involves creating a deployment group and adding a server to the group where you have access and privileges to upload your console app. It could be on-prem or cloud, depending on your requirements.
