Configure SonarQube scanner to run as a step in Cloud Build on a Cloud Source Repository

I've installed a Bitnami SonarQube instance and manually installed sonar-scanner, and I'm looking for a way to trigger the scan from Cloud Build. Is there a way I can reference this SonarQube VM instance in my cloudbuild.yaml? (I don't want to use Docker.)
For example, with a container it's pulled like gcr.io/project-id/sonar-scanner:latest; I want it to be pulled from that Bitnami VM instance instead.

It isn't possible to use a VM instance to perform a build step. All steps carried out by Cloud Build are performed using container images that are pulled and run on a single build VM.
What you might be able to do is create an image that replicates as closely as possible the environment you have on that VM and include it as a custom build step.
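A minimal sketch of what such a custom build step could look like in cloudbuild.yaml, assuming you have already built and pushed a sonar-scanner image to your project's registry; the image name, scanner properties, and the SonarQube host address below are all placeholders:

```yaml
steps:
# Hypothetical custom builder image that bundles sonar-scanner;
# you would build and push this image yourself beforehand.
- name: 'gcr.io/$PROJECT_ID/sonar-scanner:latest'
  args:
  - '-Dsonar.projectKey=my-project'
  - '-Dsonar.sources=.'
  # Point the scanner at the existing SonarQube VM over the network.
  - '-Dsonar.host.url=http://SONARQUBE_VM_IP:9000'
  - '-Dsonar.login=${_SONAR_TOKEN}'
substitutions:
  _SONAR_TOKEN: ''  # supply via the trigger, not in source control
```

Note that the scanner itself still runs in a container on the Cloud Build worker; the Bitnami VM is only reached over HTTP as the SonarQube server.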

Related

How to get Docker Images from Azure container registry and run onpremise servers using docker-compose?

I am using Azure DevOps for CI/CD pipelines, and one thing I couldn't figure out is how to get images from Azure Container Registry and run them on my on-premise Debian server. (I have already built and pushed the images to ACR.)
On the release pipeline section:
I set Azure Container Registry as the source type and selected the latest artifact from there
then created a dev stage
Added a Command Line task to execute some scripts on the server
-->This is the step where I got lost<-- I need somehow to get these images from the registry and fire a compose up command on the server.
Extra info: you cannot use your docker-compose.yml (mine is inside the Azure repo in the same project) because you are using ACR as the source type, not the build itself or the source code.
*In my humble opinion this is a good case to enrich the content, because there is a lack of docs and videos on the internet.
thanks,
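A hedged sketch of what that missing command-line step could run on the on-premise server, assuming the server already has Docker installed and a copy of the compose file locally; the registry name, variable names, and file path are all placeholders:

```yaml
steps:
- script: |
    # Log in to ACR so the server can pull the private images
    docker login myregistry.azurecr.io -u $(ACR_USER) -p $(ACR_PASSWORD)
    # Pull the images referenced by the compose file and (re)start them
    docker compose -f /opt/myapp/docker-compose.yml pull
    docker compose -f /opt/myapp/docker-compose.yml up -d
  displayName: 'Pull images from ACR and start containers'
```

The credentials would come from pipeline variables (or a service connection), not from the compose file itself.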

Use Running Azure Container Instance in Azure Pipeline

I'm currently using resources: containers in an Azure pipeline to pull my custom tools image from ACR, and I use this image to create a container that can run several CLI commands in my pipeline.
Pulling this custom tools image takes a lot of time, roughly around 5 minutes, and I want to avoid that, as it is wasted time and a blockage since I do debugging most of the time.
Question: Is it possible to create an Azure Container Instance that constantly runs my custom tools image, and then call this container inside my pipeline to run some CLI commands?
I'm having a hard time finding documentation, so I'm not really sure if it is possible.
You can try to set up a self-hosted agent in your custom Docker image; when you run the container on ACI, it will install the agent and connect it to your Azure DevOps organization.
Then you can use this self-hosted agent to run the pipeline job that needs to run the CLI commands. Since the self-hosted agent is hosted in the container, any job that runs on this agent will run in the container.
For how to set up a self-hosted agent in a Docker image, you can reference the document "Run a self-hosted agent in Docker".
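Once the agent container is registered, pointing a job at it is just a matter of naming its agent pool. A minimal sketch, assuming the agent joined a pool called aci-tools (a placeholder):

```yaml
jobs:
- job: run_cli_commands
  # Target the self-hosted pool that the ACI-hosted agent joined,
  # instead of a Microsoft-hosted pool that would re-pull the image.
  pool:
    name: aci-tools
  steps:
  # Runs inside the long-lived container, so the tools are already there
  - script: my-custom-tool --help
    displayName: 'Run CLI commands in the tools container'
```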

Is there any equivalent for GitLab CI/CD's environment file in GCP cloud build?

I'm using GitLab's CI/CD for building and deploying my application. As my deployment environment is GCP, I'm trying to use GCP Cloud Build to build and deploy my application to the cloud. In GitLab, I was using its environment file feature to store certain critical files. Is there any alternative for that in GCP's Cloud Build?
I'm storing some configuration files and some other files as environment files. How can I completely replace GitLab CI/CD with GCP Cloud Build?
You can in fact use Cloud Build for CI/CD and GitHub for source code management, as you usually do with GitLab; that way you can build and deploy your apps on GCP and still be able to manage your source code.
The guide Similarities and differences between GitLab CI and Cloud Build explains how to build and deploy your code with Cloud Build. In a few words, the steps are:
Connect Cloud Build to your source repository
Select the repository where your source code lives and authenticate
Add a trigger to automate the builds, using a Dockerfile or a Cloud Build config file in YAML format
Also, a brief explanation of some important fields:
The steps field in the config file specifies a set of steps that you want Cloud Build to perform.
The id field sets a unique identifier for a build step.
The name field of a build step specifies a cloud builder, which is a container image running common tools. You use a builder in a build step to execute your tasks.
The args field of a build step takes a list of arguments and passes them to the builder referenced by the name field.
The entrypoint in a build step specifies an entrypoint if you don't want to use the default entrypoint of the builder.
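Put together, those fields look like this in a cloudbuild.yaml (the builder image and the command are only illustrative):

```yaml
steps:
- id: 'print-greeting'   # unique identifier for this build step
  name: 'ubuntu'         # the cloud builder image the step runs in
  entrypoint: 'bash'     # override the image's default entrypoint
  args: ['-c', 'echo "Hello from Cloud Build"']  # passed to the entrypoint
```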
To store your build outputs, you can refer to Storing images in Container Registry:
You can configure Cloud Build to store your built image in one of the following ways:
using the images field, which stores the image in Container Registry after your build completes;
using the docker push command, which stores the image in Container Registry as part of your build flow.
If your build produces artifacts like binaries or tarballs, as mentioned in Storing build artifacts, you can store them in Cloud Storage.
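Both the images field and the Cloud Storage artifacts can be sketched in a single config; the image name, bucket, and file paths below are placeholders:

```yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app', '.']
# Push the built image to Container Registry after the build completes
images: ['gcr.io/$PROJECT_ID/my-app']
# Upload non-image build outputs (binaries, tarballs) to Cloud Storage
artifacts:
  objects:
    location: 'gs://my-artifacts-bucket/'
    paths: ['output/my-app.tar.gz']
```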
In case you want to use variables, I would refer to the site mentioned in a previous comment.

Use terraform for google cloud run

I'm looking into using Terraform to automate setting up an environment for demos.
It works for a VM instance and can be fully automated, but management prefers to use Cloud Run with Docker containers.
When I read this article, it starts with manually having to build and register a Docker container. I don't get that step; why can't that be automated as well with Terraform?
Terraform is a deployment tool. More or less, it invokes APIs to build, update or delete things. So, what do you want to do? Take a container and deploy it on Cloud Run. Building sources, uploading files, and performing a git clone aren't actions designed for Terraform.
It's not surprising to have a CI pipeline that builds things, with a CD tool called at the end for the deployment.
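In other words, Terraform can deploy a Cloud Run service only once the image already exists in a registry. A minimal sketch with the google provider; the service name, region, and image are placeholders:

```hcl
resource "google_cloud_run_service" "demo" {
  name     = "demo-service"
  location = "us-central1"

  template {
    spec {
      containers {
        # This image must already have been built and pushed by your
        # CI pipeline; Terraform only references it, it does not build it.
        image = "gcr.io/my-project/demo-app:latest"
      }
    }
  }
}
```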

Run docker container on azure

I have a simple docker container which runs just fine on my local machine. I was hoping to find an easy checklist how I could publish and run my docker container on Azure, but couldn't find one. I only found https://docs.docker.com/docker-for-azure/, but this document kind of leaves me alone when it comes to actually copy my local docker container to Azure. Isn't that supposed to be very easy? Can anybody point me in the right direction how to do this?
But it is really easy, once you know where to find the docs :-). I would take the Azure docs as a starting point, as there are multiple options when it comes to hosting containers in Azure:
If you're looking for this...
Simplify the deployment, management, and operations of Kubernetes -> Azure Container Service (AKS)
Easily run containers on Azure with a single command -> Container Instances
Store and manage container images across all types of Azure deployments -> Container Registry
Develop microservices and orchestrate containers on Windows or Linux -> Service Fabric
Deploy web applications on Linux using containers -> App Service
Based on your info I would suggest storing the image using the Azure Container Registry and hosting the container using Azure Container Instances. No need to manage a VM this way.
There is an excellent tutorial you could follow (I skipped the first step since it involves creating a Docker image, and you already have one).
Another complete guide to pushing your image to Azure and creating a running container can be found here.
The good thing about Azure Container Instances is that you only pay for what you actually use. The Azure Container Registry is a private image repository hosted in Azure; of course you could also use Docker Hub, but using ACR makes it all really simple.
In order to run an image, you simply need to configure a new VM with the Docker daemon. I personally found Azure's documentation to be pretty complex. Assuming you are not trying to scale your service across instances, I would recommend using docker-machine rather than following the Azure guide.
docker-machine is a CLI tool published by the Docker team which automatically installs the Docker daemon (and all its dependencies) on a host. All you would need to do is supply your Azure subscription, and it will automatically create a VM configured appropriately.
In terms of publishing the image, Azure is probably not the right solution. I would recommend one of two things:
Use Docker Hub, which serves as a free hosted Docker image repository. You can simply push images to Docker Hub (or even have them built directly from your Git repository).
Configure a CD tool, such as TravisCI or CircleCI, and use these to build your image and push directly to your deployment.
To run your Docker image inside ACI, you can make use of Azure Container Registry.
Step 0: Create an Azure Container Registry.
Step 1: Include a Dockerfile in your application code.
Step 2: Build the code along with the Dockerfile, with a tag, to create a Docker image (docker build -t imagename:tag .).
Step 3: Push the Docker image to the Azure Container Registry with an image name and tag.
Step 4: Now create an ACI. While creating it, choose the image type as private and provide the image name, tag, image registry login server, image registry username, and image registry password (these details can be found under the Access keys tab inside the Azure Container Registry).
Step 5: Choose Linux as the running OS; in the networking step you can give your ACI a DNS name, then click on Review & create.
Step 6: Once the ACI gets created, go to the overview and you will see the FQDN; using the FQDN you can access your application running inside the Azure Container Instance.
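The portal steps above can also be captured as an ACI deployment YAML file (deployed with az container create --resource-group <rg> --file aci.yaml). Every name, the registry server, and the credentials below are placeholders:

```yaml
apiVersion: '2019-12-01'
location: westeurope
name: myapp-aci
type: Microsoft.ContainerInstance/containerGroups
properties:
  osType: Linux
  containers:
  - name: myapp
    properties:
      image: myregistry.azurecr.io/myapp:v1   # image pushed in Step 3
      resources:
        requests:
          cpu: 1.0
          memoryInGB: 1.5
      ports:
      - port: 80
  # Credentials from the registry's Access keys tab (Step 4)
  imageRegistryCredentials:
  - server: myregistry.azurecr.io
    username: myregistry
    password: '<registry-password>'
  ipAddress:
    type: Public
    dnsNameLabel: myapp-demo   # yields the FQDN mentioned in Step 6
    ports:
    - protocol: TCP
      port: 80
```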
