How to get Docker images from Azure Container Registry and run them on on-premise servers using docker-compose? - azure

I am using Azure DevOps for CI/CD pipelines, and one thing I couldn't find out is how to get images from Azure Container Registry and run them on my on-premise Debian server. (I have already built and pushed the images to ACR.)
On the release pipeline:
I set Azure Container Repository as the source type and selected the latest artifact from there,
then created a dev stage,
and added a command-line task to execute some scripts on the server.
-->This is the step where I am lost<-- I somehow need to get these images from the registry and fire a compose up command on the server.
Extra info: you cannot use your docker-compose.yml (mine is inside the Azure repo in the same project) because you are using ACR as the source type, not the build itself or the source code.
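For reference, this is roughly the kind of script I expect the command-line step would need to run on the server (the registry name, credentials, and compose file path below are just placeholders):

```bash
# Log in to ACR from the on-prem server (service principal or admin credentials)
docker login myregistry.azurecr.io -u "$ACR_USERNAME" -p "$ACR_PASSWORD"

# Pull the images referenced in the compose file and (re)start the stack
docker-compose -f /opt/myapp/docker-compose.yml pull
docker-compose -f /opt/myapp/docker-compose.yml up -d
```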
*In my humble opinion this is a good case to enrich the content, because there is a lack of docs and videos on the internet.
Thanks.

Related

How to create an Azure pipeline CI/CD for a Docker container to a Linux on-premises server?

I have a containerized app's image that builds on every push to master and ends up in Azure Container Registry. I need to set up the next step, in which this image ends up on my Linux on-prem server.
I have established the connection with the server using a deployment group agent, which reports as healthy.
I have created a pipeline which takes the built image artifact as input, but I am completely failing to grasp and create the step in which the artifact is pulled onto / pushed to the server (and ideally run, too).
I am looking at the tasks in the Pipelines > Tasks section in Azure DevOps, but I cannot find a place to add the specific steps, nor am I sure what steps to add.
I would very much appreciate any tip on how to deploy a container from ACR to an on-premises Linux server using Azure DevOps pipelines. Thank you in advance.
You can check this link, which shows how to create an Azure Pipelines CI/CD for a Docker container; make sure all these steps are followed:
https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/docker?view=azure-devops
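As a rough sketch (not taken from the linked doc), a Bash step running on the deployment group agent could pull the image from ACR and restart the container roughly like this; the registry, image name, and tag variable are placeholders:

```bash
# Runs on the on-prem deployment group agent
docker login myregistry.azurecr.io -u "$ACR_USERNAME" -p "$ACR_PASSWORD"
docker pull myregistry.azurecr.io/myapp:"$IMAGE_TAG"

# Replace the running container with the newly pulled image
docker rm -f myapp 2>/dev/null || true
docker run -d --name myapp -p 80:8080 myregistry.azurecr.io/myapp:"$IMAGE_TAG"
```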

Is there any equivalent to GitLab CI/CD's environment file in GCP Cloud Build?

I'm using GitLab CI/CD for building and deploying my application. As my deployment environment is GCP, I'm trying to use GCP Cloud Build to build and deploy my application to the cloud. In GitLab, I was using its environment file feature to store certain critical files. Is there any alternative to that in GCP's Cloud Build?
I'm storing some configuration files and some other files as environment files. How can I completely replace GitLab CI/CD with GCP Cloud Build?
You can in fact use Cloud Build for CI/CD and GitHub for source code management, as you usually do with GitLab; that way you can build and deploy your apps on GCP and still manage your source code.
In the guide Similarities and differences between GitLab CI and Cloud Build, there is an explanation of how to build and deploy your code with Cloud Build. The steps, in a few words, are:
Connect Cloud Build to your source repository
Select the repository where your source code lives and authenticate
Add a trigger to automate the builds using a Dockerfile or Cloud Build config file in YAML format
Also, a brief explanation of some important fields:
The steps field in the config file specifies a set of steps that you want Cloud Build to perform.
The id field sets a unique identifier for a build step.
The name field of a build step specifies a cloud builder, which is a container image running common tools. You use a builder in a build step to execute your tasks.
The args field of a build step takes a list of arguments and passes them to the builder referenced by the name field.
The entrypoint in a build step specifies an entrypoint if you don't want to use the default entrypoint of the builder.
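As a small illustration of those fields (the image name and the test command are placeholders, not taken from the guide), a cloudbuild.yaml might look like:

```yaml
steps:
  # Build the application image with the Docker cloud builder
  - id: 'build-image'
    name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA', '.']
  # Override the builder's default entrypoint to run an arbitrary command
  - id: 'smoke-test'
    name: 'ubuntu'
    entrypoint: 'bash'
    args: ['-c', 'echo "run your tests here"']
```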
For storing your built images, you can refer to Storing images in Container Registry:
You can configure Cloud Build to store your built image in one of the following ways:
using the images field, which stores the image in the Container Registry after your build completes.
using the docker push command, which stores the image in the Container Registry as part of your build flow.
If your build produces artifacts like binaries or tarballs, as mentioned in Storing build artifacts, you can store them in Cloud Storage.
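For example, a rough sketch that uses the images field for the built image and the artifacts field for other outputs (image name, bucket, and paths are placeholders):

```yaml
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app', '.']

# Store the built image in Container Registry after the build completes
images: ['gcr.io/$PROJECT_ID/my-app']

# Store non-image outputs (binaries, tarballs, config bundles) in Cloud Storage
artifacts:
  objects:
    location: 'gs://my-artifacts-bucket/'
    paths: ['output.tar.gz']
```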
In case you want to use variables, I would refer to the site mentioned in a previous comment.

Deploy ShinyProxy and an application with Azure DevOps

I'm trying to deploy ShinyProxy in a Docker container to Azure. I also have the ShinyApp repositories in Azure DevOps. Every time a developer updates the code in a repository, I want the CI/CD to deploy the new code, creating a Docker container on Azure.
Also, I think I have to create an internal Docker network between ShinyProxy and the app.
How can I create this process? Is there any tutorial on how to set up a pipeline in Azure DevOps and run ShinyProxy on Azure?
After a lot of research and study, I found out how to create a full end-to-end deployment. I have written a complete step-by-step guide on my blog.
I hope it can help others.
Do you want to deploy your app to Azure Container Registry?
If so, below are the main steps you need to do:
Create a container registry on the Azure Portal.
Create a Docker registry service connection in the project settings to enable your pipeline to push images into the container registry.
Create the pipeline that gets source from your repository.
Add the Docker task in the pipeline,
select the Docker registry service connection created in above step as the 'Container registry'.
select 'buildAndPush' as the 'command'.
To view more details, you can reference to "Build and push to Azure Container Registry".
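If you use a YAML pipeline instead of the classic editor, a minimal sketch of that Docker task could look like this (the service connection and repository names are placeholders):

```yaml
# azure-pipelines.yml
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: Docker@2
    displayName: Build and push image to ACR
    inputs:
      containerRegistry: 'my-acr-service-connection'  # Docker registry service connection
      repository: 'shinyproxy-app'
      command: 'buildAndPush'
      Dockerfile: '**/Dockerfile'
      tags: |
        $(Build.BuildId)
```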

GitLab community edition - docker support

I would like to know if the Community Edition of GitLab (self-hosted) has support for a CI process that would include:
Creating Docker containers (docker-compose or Dockerfile)
Saving custom Docker images to a Docker registry in GitLab
Integrating Ansible to manage configuration items
I checked the pricing/feature comparison section of their website, but I wasn't able to find this info. Wondering if any fellow Stack Overflowers who are already using this framework can help out.
Thanks
You can sign up for a free GitLab CE account and try it out. Here's what I found:
Docker Images and Registry
GitLab CE comes with a Package Registry (Maven, NuGet, PyPI, etc.) and a Container Registry (for Docker images). However, I don't think GitLab will provide free storage for your packages; the benefit is full integration with your CI/CD pipeline. You'll need to host your own server, as described in How to Build Docker Images and Host a Docker Image Repository with GitLab. The official docs are at GitLab Container Registry.
Once you've set up your Docker images/registry, you can configure GitLab CE to run various jobs in Docker containers. It's as easy as specifying the name of the image in the image field in .gitlab-ci.yml. The jobs run on an application called a Runner, on a VM or bare metal. You can set up your own Runners in AWS/GCP/Azure or even on your own laptop, but GitLab.com also provides 2,000 free pipeline minutes per month on shared runners hosted on GCP. Instructions for setting up GitLab Runners can be found at https://docs.gitlab.com/runner/.
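For illustration, a minimal .gitlab-ci.yml job that builds an image on a Runner and pushes it to the project's Container Registry could look like this (the CI_* variables are predefined by GitLab; the stage name is up to you):

```yaml
build-image:
  stage: build
  image: docker:latest        # the job itself runs inside this image
  services:
    - docker:dind             # Docker-in-Docker service so docker commands work
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
```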
Ansible Integration
Once you have your Docker images/registry and Runners set up, you can store the Ansible binaries and dependencies in the Docker images. You can then execute the playbooks from the script section of a job defined in .gitlab-ci.yml, using Ansible and GitLab as Infrastructure-as-Code. You can read some great tutorials here and here.
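A rough sketch of such a job (the Ansible image name, inventory, and playbook are placeholders):

```yaml
deploy:
  stage: deploy
  # custom image with Ansible and its dependencies baked in (placeholder name)
  image: registry.example.com/my-group/ansible-runner:latest
  script:
    - ansible-playbook -i inventories/production site.yml
```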

Run docker container on azure

I have a simple Docker container which runs just fine on my local machine. I was hoping to find an easy checklist for how to publish and run my Docker container on Azure, but couldn't find one. I only found https://docs.docker.com/docker-for-azure/, but this document kind of leaves me alone when it comes to actually copying my local Docker container to Azure. Isn't that supposed to be very easy? Can anybody point me in the right direction on how to do this?
But it is really easy... once you know where to find the docs :-). I would take the Azure docs as a starting point, as there are multiple options when it comes to hosting containers in Azure:
If you're looking for this...
Simplify the deployment, management, and operations of Kubernetes -> Azure Container Service (AKS)
Easily run containers on Azure with a single command -> Container Instances
Store and manage container images across all types of Azure deployments -> Container Registry
Develop microservices and orchestrate containers on Windows or Linux -> Service Fabric
Deploy web applications on Linux using containers -> App Service
Based on your info, I would suggest storing the image using Azure Container Registry and hosting the container using Azure Container Instances. No need to manage a VM this way.
There is an excellent tutorial you could follow (I skipped the first step since it involves creating a Docker image, which you already have).
Another complete guide on pushing your image to Azure and creating a running container can be found here.
The good thing about Azure Container Instances is that you only pay for what you actually use. Azure Container Registry is a private image repository hosted in Azure; of course, you could also use Docker Hub, but using ACR keeps it all really simple.
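As a rough outline of that flow (not taken from the tutorials above; the resource group, registry, and image names are placeholders):

```bash
# Create a registry and log the local Docker CLI in to it
az acr create --resource-group myResourceGroup --name myregistry --sku Basic
az acr login --name myregistry

# Tag the local image for the registry and push it
docker tag myapp:latest myregistry.azurecr.io/myapp:v1
docker push myregistry.azurecr.io/myapp:v1

# Run the pushed image in Azure Container Instances
az container create \
  --resource-group myResourceGroup \
  --name myapp \
  --image myregistry.azurecr.io/myapp:v1 \
  --registry-username <acr-username> \
  --registry-password <acr-password> \
  --dns-name-label myapp-demo \
  --ports 80
```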
In order to run an image, you simply need to configure a new VM with the Docker daemon. I personally found Azure's documentation to be pretty complex. Assuming you are not trying to scale your service across instances, I would recommend using docker-machine rather than following the Azure guide.
docker-machine is a CLI tool published by the Docker team which automatically installs the Docker daemon (and all its dependencies) on a host. All you would need to do is provide your Azure subscription, and it will automatically create a VM configured appropriately.
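A minimal sketch of that flow (the subscription ID, machine name, and image are placeholders):

```bash
# Provision an Azure VM with the Docker daemon preinstalled
docker-machine create --driver azure \
  --azure-subscription-id <your-subscription-id> \
  mydockerhost

# Point the local docker CLI at the new host and run a container there
eval $(docker-machine env mydockerhost)
docker run -d -p 80:80 myimage:latest
```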
In terms of publishing the image, Azure is probably not the right solution. I would recommend one of two things:
Use Docker Hub, which serves as a free hosted Docker image repository. You can simply push images to Docker Hub (or even have them built directly from your Git repository).
Configure a CI/CD tool, such as Travis CI or CircleCI, and use it to build your image and push it directly to your deployment.
To run your Docker image inside ACI (Azure Container Instances), you can make use of Azure Container Registry.
Step 0: Create an Azure Container Registry.
Step 1: Include a Dockerfile in your application code.
Step 2: Build the code along with the Dockerfile, with a tag, to create a Docker image (docker build -t imagename:tag .).
Step 3: Push the Docker image to Azure Container Registry with an image name and tag.
Step 4: Now create an ACI. While creating it, choose the image type as private and provide the image name, tag, image registry login server, image registry username, and image registry password (these details can be found under the Access keys tab inside the Azure Container Registry).
Step 5: Choose the running OS as Linux. In the networking step you can give a DNS name for your ACI, then click on Review & create.
Step 6: Once the ACI gets created, you can go to Overview and see the FQDN; using the FQDN you can access your application running inside the Azure Container Instance.
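If you prefer the CLI over the portal, a rough equivalent of the credential lookup in step 4 and the FQDN lookup in step 6 is (registry, resource group, and container names are placeholders):

```bash
# Step 4: read the admin credentials shown under the Access keys tab
az acr credential show --name myregistry

# Step 6: once the ACI is created, read the FQDN used to reach the app
az container show --resource-group myResourceGroup --name myapp-aci \
  --query ipAddress.fqdn -o tsv
```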
