I would like to know if the community edition of GitLab (self-hosted) has support for a CI process that would include:
Creating Docker containers (via docker-compose or a Dockerfile)
Saving custom Docker images to a Docker registry in GitLab
Integrating Ansible to manage configuration items
I checked the pricing/feature-comparison section of their website but wasn't able to find this info. Wondering if any fellow Stack Overflow users who are already using this platform can help out.
Thanks
You can sign up for a free GitLab CE account and try it out. Here's what I found:
Docker Images and Registry
GitLab CE comes with a Package Registry (Maven, NuGet, PyPI, etc.) and a Container Registry (for Docker images). However, I don't think GitLab will provide free storage for your packages. The benefit is full integration with your CI/CD pipeline. You'll need to host your own server, as described in How to Build Docker Images and Host a Docker Image Repository with GitLab. The official docs are at GitLab Container Registry.
Once you've set up your Docker images/registry, you can configure GitLab CE to run various jobs in Docker containers. It's as easy as specifying the name of the image in an image field in .gitlab-ci.yml. The jobs will run on an application called a runner, on a VM or on bare metal. You can set up your own runners in AWS/GCP/Azure or even on your own laptop, and GitLab.com also provides 2,000 free pipeline minutes per month on shared runners hosted on GCP. Instructions for setting up GitLab runners can be found at https://docs.gitlab.com/runner/.
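As a rough sketch of what that looks like (the image tags, job names, and stages below are my own illustrative choices, not anything from the question):

```yaml
# Hypothetical .gitlab-ci.yml: build an image, push it to the project's
# GitLab Container Registry, then run a later job inside that image.
stages:
  - build
  - test

build-image:
  stage: build
  image: docker:24              # pin to whatever Docker version you run
  services:
    - docker:24-dind            # Docker-in-Docker service for building images
  script:
    # CI_REGISTRY* variables are predefined by GitLab CI
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:latest" .
    - docker push "$CI_REGISTRY_IMAGE:latest"

test-in-image:
  stage: test
  image: $CI_REGISTRY_IMAGE:latest   # run this job inside the image built above
  script:
    - echo "running tests inside the custom image"
```

The `image:` field is what ties a job to a container; anything your runner can pull can be used there.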
Ansible Integration
Once you have your Docker images/registry and runners set up, you can store the Ansible binaries and dependencies in the Docker images. You can then execute the playbooks from the script section of a job defined in .gitlab-ci.yml, using Ansible and GitLab as Infrastructure-as-Code. You can read some great tutorials here and here.
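A minimal sketch of such a job, assuming you have an image with ansible-playbook installed (the image name, inventory path, and playbook name here are placeholders):

```yaml
# Hypothetical deploy job: run an Ansible playbook from GitLab CI.
deploy:
  stage: deploy
  image: my-registry.example.com/ansible-runner:latest  # image with Ansible baked in
  script:
    - ansible-playbook -i inventory/production site.yml
  only:
    - main    # run only on the main branch
```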
Related
I am using Azure DevOps for CI/CD pipelines, and one thing I couldn't figure out is how to get images from Azure Container Registry and run those images inside my on-premise Debian server. (I have already built and pushed the images to ACR.)
On release pipeline section:
I set Azure Container Registry as the source type and selected the latest artifact from there
then created a dev stage
Added a command-line tool to execute some scripts on the server
--> This is the step where I got lost <-- I need to somehow get these images from the registry and fire a compose up command on the server.
Extra info: you cannot use your docker-compose.yml (mine is inside the Azure repo in the same project) because you are using ACR as the source type, not the build itself or the source code.
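For what it's worth, the manual equivalent of that missing step on the Debian host would be something like the commands below (the registry name, image name, credentials, and compose-file path are all placeholders):

```shell
# Log in to ACR from the on-premise server (service-principal
# credentials are placeholders).
docker login myregistry.azurecr.io -u "$SP_APP_ID" -p "$SP_PASSWORD"

# Pull the image(s) referenced by the compose file.
docker pull myregistry.azurecr.io/myapp:latest

# Bring the stack up using a compose file already present on the server.
docker compose -f /opt/myapp/docker-compose.yml up -d
```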
*In my humble opinion, this is a good case for enriching the content, because there is a lack of docs and videos on the internet.
thanks,
Initial State:
A GitLab instance with many projects;
GitLab projects have pipelines that depend on many binaries, including Docker images
All these binaries and images exist in a local OSS instance of Nexus (NXRM3) in hosted or group repos of various formats (docker, maven, raw, and so on) or remote repos proxied by the repos in this Nexus.
Target State:
The content of the GitLab projects and the Nexus repositories is transferred into a single GitLab instance in a way that allows the pipelines to run after trivial modifications, like changing URLs from Nexus to the GitLab binary registries.
Challenge:
What approach to automate that could be more effective (less coding, testing, etc.) than a custom app/script that uses Nexus REST API to "walk" through Nexus, create a corresponding registry in GitLab (using the GitLab API), pull all artifacts from the Nexus repo and then push them into the created GitLab registry?
NOTE:
Properly mapping Nexus repo type to GitLab is a separate challenge.
Properly reproducing Nexus repo taxonomy in GitLab is a separate challenge.
Properly transferring Nexus repo permissions to GitLab is a separate challenge.
I would recommend creating scripts to do the following:
List the Docker images stored in a Docker repo of the aforementioned Nexus instance.
Pull a Docker image from that Docker-format Nexus repo and push it into a Docker registry on a local host.
Pull that Docker image from the local Docker registry and push it into the GitLab project's Docker registry.
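A minimal Python sketch of those steps. The listing uses the Nexus Repository v1 search REST API; the GitLab registry path layout in `nexus_to_gitlab_path` is my own assumption, and the mirroring helper just shells out to the docker CLI (which must already be logged in to both registries):

```python
import json
import subprocess
import urllib.request


def nexus_to_gitlab_path(nexus_repo: str, image: str,
                         gitlab_registry: str, gitlab_project: str) -> str:
    """Map a Nexus-hosted image name to a GitLab registry path.
    The registry/project/repo/image layout here is an assumption."""
    return f"{gitlab_registry}/{gitlab_project}/{nexus_repo}/{image}"


def list_docker_images(nexus_url: str, repo: str) -> list[str]:
    """List Docker image names in a Nexus repo via the v1 search API."""
    url = f"{nexus_url}/service/rest/v1/search?repository={repo}&format=docker"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return [item["name"] for item in data.get("items", [])]


def mirror_image(src: str, dst: str) -> None:
    """Pull an image from Nexus, retag it, and push it to GitLab."""
    subprocess.run(["docker", "pull", src], check=True)
    subprocess.run(["docker", "tag", src, dst], check=True)
    subprocess.run(["docker", "push", dst], check=True)
```

Note that the v1 search API paginates with a `continuationToken`, so a real script would loop until the token is empty.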
When using the GitHub connection with Azure DevOps pipelines, I see that you can specify an image in your azure-pipelines.yml file (the vmImage: option).
But when using a generic Git connection you only have a dropdown with just some options:
Is there any way to add custom Docker images to be used? I would need different Node.js versions for different pipelines, and here the ubuntu-18.04 image, for example, only has v10.
The option to choose a Docker container is only available for YAML pipelines, which are not supported with the other Git connection.
You can use a self-hosted agent and run jobs in Docker containers; you need to configure a few images in your environment and connect them to Azure DevOps.
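With a YAML pipeline on a self-hosted agent, pinning a custom image per pipeline looks roughly like this (the pool and image names are illustrative placeholders):

```yaml
# Hypothetical azure-pipelines.yml: run the job's steps inside a custom
# container so each pipeline can pin its own Node.js version.
pool:
  name: MySelfHostedPool     # placeholder self-hosted agent pool

container: node:14           # any image your agent host can pull

steps:
  - script: node --version   # runs inside the node:14 container
    displayName: Check Node version
```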
These pics show my deployment process to Azure with Docker using the Eclipse IDE.
The problem is I have no idea how to modify the WildFly Docker configuration, for example the datasource JNDI and Undertow settings of the WildFly AS in the Docker container.
You can configure the Docker container as follows:
Predefined Docker image: Specifies a pre-existing Docker image from Azure.
[!NOTE] The list of Docker images in this box consists of several images that the Azure Toolkit has been configured to patch so that your artifact is deployed automatically.
Custom Dockerfile: Specifies a previously saved Dockerfile from your local computer.
[!NOTE] This is a more advanced feature for developers who want to deploy their own Dockerfile. However, it is up to developers who use this option to ensure that their Dockerfile is built correctly. The Azure Toolkit does not validate the content in a custom Dockerfile, so the deployment might fail if the Dockerfile has issues. In addition, the Azure Toolkit expects the custom Dockerfile to contain a web app artifact, and it will attempt to open an HTTP connection. If developers publish a different type of artifact, they may receive innocuous errors after deployment.
For more details, refer to https://github.com/Azure/azure-docs-sdk-java/blob/master/docs-ref-conceptual/eclipse/azure-toolkit-for-eclipse-publish-as-docker-container.md.
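As an illustration of the custom-Dockerfile route for the WildFly settings asked about above (the base image tag, driver, CLI script name, and artifact name are all placeholders):

```dockerfile
# Hypothetical Dockerfile extending a WildFly base image.
FROM jboss/wildfly:latest

# Copy a JDBC driver and a jboss-cli batch script that registers the
# datasource (JNDI name) and adjusts the undertow subsystem.
COPY postgresql.jar /opt/jboss/
COPY configure.cli /opt/jboss/

# Apply the configuration offline so it is baked into the image.
RUN /opt/jboss/wildfly/bin/jboss-cli.sh --file=/opt/jboss/configure.cli

# Deploy the application artifact.
COPY app.war /opt/jboss/wildfly/standalone/deployments/
```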
I have a GitLab repo which uses the Docker container registry.
I want to push some of the built containers to production with only one button click.
I heard that Kubernetes and Docker Swarm can help me, but I don't understand what they are.
Can you explain in plain language what Docker Swarm is, and how I can use it to solve my task?
I want to push some of built containers to production using only one button click
You don't need docker orchestration for that, just docker.
More specifically, docker image push (previously docker push)
see "GitLab Using Docker Build"
Container orchestration is what is needed to transition from deploying containers individually on a single host, to deploying complex multi-container apps on many machines.
That is not what you are trying to do here: you just need to push images to a registry.
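In its simplest form, that flow is just the following commands (the registry, group, project, and tag names are placeholders):

```shell
# In CI (or locally): push the built image to the GitLab registry.
docker login registry.gitlab.example.com
docker image push registry.gitlab.example.com/group/project/app:latest

# On the production host: pull the image and run it.
docker pull registry.gitlab.example.com/group/project/app:latest
docker run -d --name app registry.gitlab.example.com/group/project/app:latest
```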