Use terraform for google cloud run

I'm looking into using Terraform to automate setting up an environment for demos.
It works for a VM instance and can be fully automated, but management prefers to use Cloud Run with Docker containers.
When I read this article, it starts with manually building and registering a Docker container. I don't get that step: why can't it be automated with Terraform as well?

Terraform is a deployment tool. More or less, it invokes APIs to create, update, or delete things. So what do you want to do here? Take a container image and deploy it on Cloud Run. Building sources, uploading files, and performing a git clone aren't actions Terraform is designed for.
It's not surprising to have a CI pipeline that builds things and, at the end, a CD tool called for the deployment.
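To make that split concrete, here is a minimal sketch of a Cloud Build config that does the CI part (build and push the image) and then calls Terraform for the deployment. The image name, Terraform image tag, and the image variable are assumptions for illustration, not something from the question:

```yaml
# cloudbuild.yaml (sketch): CI builds and pushes the container,
# then the last step hands the image over to Terraform, which
# declares the Cloud Run service (e.g. a google_cloud_run_service resource).
steps:
- id: 'build'
  name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/demo-app:$SHORT_SHA', '.']
- id: 'push'
  name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/$PROJECT_ID/demo-app:$SHORT_SHA']
- id: 'deploy'
  name: 'hashicorp/terraform:1.5.7'   # assumed Terraform image/tag
  entrypoint: 'sh'
  args:
  - '-c'
  - |
    terraform init
    terraform apply -auto-approve -var="image=gcr.io/$PROJECT_ID/demo-app:$SHORT_SHA"
```

Terraform then stays in its lane: it only needs a variable for the image and a Cloud Run resource that references it, while the building and pushing happen in the pipeline.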

Related

Is there any equivalent for GitLab CI/CD's environment file in GCP cloud build?

I'm using GitLab's CI/CD for building and deploying my application. As my deployment environment is GCP, I'm trying to use GCP Cloud Build to build and deploy my application to the cloud. In GitLab, I was using its environment file feature to store certain critical files. Is there any alternative for that in GCP's Cloud Build?
I'm storing some configuration files and some other files as environment files. How can I completely replace GitLab CI/CD with GCP Cloud Build?
You can in fact use Cloud Build for CI/CD and GitHub for source code management, as you usually do with GitLab; that way you can build and deploy your apps on GCP and still manage your source code.
In this guide on Similarities and differences between GitLab CI and Cloud Build, there is an explanation of how to build and deploy your code with Cloud Build. In a few words, the steps would be:
Connect Cloud Build to your source repository
Select the repository where your source code is with authentication
Add a trigger to automate the builds using a Dockerfile or Cloud Build config file in YAML format
Also, a brief explanation of some important fields:
The steps field in the config file specifies a set of steps that you want Cloud Build to perform.
The id field sets a unique identifier for a build step.
The name field of a build step specifies a cloud builder, which is a container image running common tools. You use a builder in a build step to execute your tasks.
The args field of a build step takes a list of arguments and passes them to the builder referenced by the name field.
The entrypoint in a build step specifies an entrypoint if you don't want to use the default entrypoint of the builder.
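Putting those fields together, a minimal cloudbuild.yaml fragment could look something like this (the step ids, builder images, and commands are just examples, not from the guide):

```yaml
steps:
- id: 'build-image'
  name: 'gcr.io/cloud-builders/docker'   # builder: a container image with the docker tool
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app', '.']
- id: 'run-tests'
  name: 'gcr.io/cloud-builders/npm'
  entrypoint: 'bash'                     # override the builder's default entrypoint
  args: ['-c', 'npm ci && npm test']
```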
To store config files, you can refer to Storing images in Container Registry:
You can configure Cloud Build to store your built image in one of the following ways:
using the images field, which stores the image in the Container Registry after your build completes.
using the docker push command, which stores the image in the Container Registry as part of your build flow.
If your build produces artifacts like binaries or tarballs, as mentioned in Storing build artifacts, you can store them in Cloud Storage.
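In the same config file, the images and artifacts fields then handle the storage part (the bucket name and paths below are placeholders):

```yaml
images:
- 'gcr.io/$PROJECT_ID/my-app'        # pushed to Container Registry after the build completes
artifacts:
  objects:
    location: 'gs://my-artifacts-bucket/builds/'
    paths: ['dist/*.tar.gz']         # non-image outputs copied to Cloud Storage
```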
In case you want to use variables, I would refer to the site mentioned in a previous comment.

How to maintain many Azure resources and deployments in one git repo?

I have a project that consists of an Azure webapp, a PostgreSQL on Azure, and multiple Azure functions for background ETL workflows. I also have a local Python package that I need to access from both the webapp and the Azure functions.
How can I structure configuration and script deployment for those resources from a single git repo?
Any suggestions or pointers to good examples or tutorials would be very appreciated.
All the Azure tutorials that I've seen are only for small and simple projects.
For now, I've hand-written an admin.py script that does e.g. the webapp and function deployments by creating a Python package, creating ZIP files for each resource, and doing ZIP deployments. This is getting messy, and now I want to have QA and PROD versions, plus I need to pass secrets so that the DB is reachable, so it's getting more complex. Is there either a nice way to structure this packaging/deployment, or a tool to help with it? For me, putting everything in Kubernetes is not the solution, not least because the DB already exists. Also, Azure DevOps is not an option; we are using GitLab CI, so eventually I want a solution that can run on CI/CD there.
Not sure if this will help completely, but here we go.
Instead of using a hand-written admin.py script, try using a YAML pipeline flow. For GitLab, there is https://docs.gitlab.com/ee/ci/yaml/ to get you started. From what you've indicated, I would recommend having several jobs in your YAML pipeline that build and package your web and function apps. For deployment, you can make use of environments. Have a look at https://docs.gitlab.com/ee/ci/multi_project_pipelines.html as well, which illustrates how you can create downstream pipelines.
From a deployment standpoint, the current integration I've found between Azure and GitLab leaves me with two recommendations:
Leverage the script keyword in the YAML to keep zipping your artifacts, then use the Azure CLI (I would assume you can install the tools during the pipeline) to do a zip deploy; see the sketch below.
Keep your code inside the GitLab repo and utilize Azure Pipelines to handle the CI/CD for you.
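A rough .gitlab-ci.yml sketch of the first option; the job names, images, environment, and the AZURE_* CI variables are assumptions you would adapt to your setup:

```yaml
stages:
  - build
  - deploy

package_webapp:
  stage: build
  image: python:3.11
  script:
    - apt-get update -qq && apt-get install -y -qq zip
    - pip install build && python -m build          # build the shared Python package
    - zip -r webapp.zip .                           # zip the app for a zip deploy
  artifacts:
    paths:
      - webapp.zip

deploy_qa:
  stage: deploy
  image: mcr.microsoft.com/azure-cli
  environment: qa
  script:
    - az login --service-principal -u "$AZURE_CLIENT_ID" -p "$AZURE_CLIENT_SECRET" --tenant "$AZURE_TENANT_ID"
    - az webapp deployment source config-zip --resource-group "$AZURE_RESOURCE_GROUP" --name "$AZURE_APP_NAME" --src webapp.zip
```

A similar job per function app, plus a second environment for PROD, keeps the QA/PROD split visible in GitLab, with the secrets stored as masked CI/CD variables.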
I hope you find this helpful.

Configure sonarqube scanner to run as a step in cloud build on cloud source repository

I've installed a SonarQube Bitnami instance and manually installed sonar-scanner on it. I'm looking for a way to trigger the scan from Cloud Build. Is there a way I can reference this SonarQube VM instance in my cloudbuild.yaml? (I don't want to use Docker.)
For example, when using a container it is pulled like gcr.io/project-id/sonar-scanner:latest; instead, I want the scanner to come from that Bitnami VM instance.
It wouldn't be possible to use a VM instance to perform a build step. All steps carried out on Cloud Build are performed using container images that are pulled and run on a single VM.
What you might be able to do is create an image that replicates as closely as possible the environment you have on that VM and include it as a custom build step.
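If you go that route, the custom step in cloudbuild.yaml could look roughly like this, assuming you build and push a sonar-scanner image yourself and point it at the SonarQube server running on the Bitnami VM (the image name, host URL, and _SONAR_TOKEN substitution are placeholders):

```yaml
steps:
- id: 'sonar-scan'
  # Custom builder image replicating the scanner setup from the VM
  name: 'gcr.io/$PROJECT_ID/sonar-scanner:latest'
  args:
  - '-Dsonar.host.url=http://SONARQUBE_VM_IP:9000'
  - '-Dsonar.projectKey=my-project'
  - '-Dsonar.login=${_SONAR_TOKEN}'
```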

Upload custom exe to Azure Devops pipeline

I'd like to execute a command line app that I created as part of my CI builds.
The command line app is not related to the repository on which the build is running. (Basically, its a tool that is supposed to send some project related metadata to a company web application to gather some metrics).
It seems I can execute it using a Command Line task, as explained here:
How to execute exe in VSTS Release pipeline empty process
The question is, however - how do I upload my custom tool into Azure Devops?
I know it's possible to create custom tasks (https://learn.microsoft.com/en-us/azure/devops/extend/develop/add-build-task?view=azure-devops), but it seems like quite a lot of effort, especially given that I already have a console tool that does what I need.
I do this by including a separate deployment folder in the git repo with my source code. This folder contains all the extra stuff (exes) that I need to call to deploy.
You could keep these artifacts in a different repo and/or publish them as a separate artifact if you wish, since you can pull down any number of different artifacts.
Then I include the deployment folder when publishing artifacts. In your build step you pull down the artifacts and it includes your EXE.
You're meant to be able to use things like NPM to install helper libraries on the fly, but none of my required libraries were ever supported.
You can also use a self-hosted agent, which is your own host (often an Azure VM). You install everything you need on it, then you install a DevOps self-hosted agent, which lets build pipelines use it.
Create a build of your exe
Upload your exe to blob storage.
Create a SAS token to access your blob.
In your build, create a task with a PowerShell script. In the PS script, download your exe, unzip it, and copy it to Build.StagingDirectory/"yourToolFolder". Then run it from the PS script. You probably want to pass it arguments such as the location of the repo on the build agent.
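A sketch of that last step as an Azure Pipelines YAML task; the $(ToolSasUrl) variable, folder name, and exe name are placeholders for whatever you set up:

```yaml
steps:
- task: PowerShell@2
  displayName: 'Download and run metrics tool'
  inputs:
    targetType: 'inline'
    script: |
      # ToolSasUrl: secret pipeline variable holding the blob URL including the SAS token
      $toolDir = "$(Build.StagingDirectory)/yourToolFolder"
      New-Item -ItemType Directory -Force -Path $toolDir | Out-Null
      Invoke-WebRequest -Uri "$(ToolSasUrl)" -OutFile "$toolDir/tool.zip"
      Expand-Archive -Path "$toolDir/tool.zip" -DestinationPath $toolDir -Force
      & "$toolDir/MyMetricsTool.exe" --repo "$(Build.Repository.LocalPath)"
```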
A way to achieve this involves creating a deployment group and adding a server to the group where you have access and privileges to upload your console app. It could be on-prem or cloud, depending on your requirements.

Azure DevOps Release Pipeline: How to copy to my VM

I created a simple .NET Core console app. This app's repository is an Azure DevOps one. I have also created an Ubuntu VM which I can successfully connect to, to receive the deployment.
I have managed to deploy my app from my local computer by cloning, building, and pushing it (via the scp command).
Now I would like to do this using an Azure DevOps pipeline.
I managed to build the app, but now I can't seem to find help on how to execute the scp command (or an alternative)...
Edit1:
Ok, this is turning out to be an order of magnitude harder than I expected. I'm giving up for now. I've been trying to figure this out for almost 2 work-days. I can't believe that a task that requires 4-6 commands in a script on my local machine should require this much effort in a DevOps environment...
You can configure a deployment agent on your VM and use release management to copy and configure your applications:
Deploy an agent on Linux
Define your multi-stage continuous deployment (CD) pipeline
Have a look at the Copy Files over SSH pipeline task, which supports SCP.
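For example, in YAML the task could look roughly like this; the service connection name and target folder are placeholders:

```yaml
steps:
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'my-ubuntu-vm'                       # SSH service connection defined in Project Settings
    sourceFolder: '$(Build.ArtifactStagingDirectory)'
    contents: '**'
    targetFolder: '/home/azureuser/app'
```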
