Initial State:
A GitLab instance with many projects.
GitLab projects have pipelines that depend on many binaries, including Docker images.
All these binaries and images exist in a local OSS instance of Nexus Repository (NXRM3), in hosted or group repos of various formats (docker, maven, raw, and so on), or in remote repos proxied by repos in this Nexus.
Target State:
The content of GitLab and Nexus is transferred into GitLab in a way that allows the pipelines to run after trivial modifications, such as changing URLs from Nexus to the GitLab binary registries.
Challenge:
What approach to automating this could be more effective (less coding, testing, etc.) than a custom app/script that uses the Nexus REST API to "walk" through Nexus, creates a corresponding registry in GitLab (using the GitLab API), pulls all artifacts from each Nexus repo, and then pushes them into the created GitLab registry?
NOTE:
Properly mapping Nexus repo types to GitLab is a separate challenge.
Properly reproducing the Nexus repo taxonomy in GitLab is a separate challenge.
Properly transferring Nexus repo permissions to GitLab is a separate challenge.
I would recommend creating scripts to do the following:
A script that lists the Docker images stored in a Docker repo of the aforementioned Nexus instance.
A script that pulls each Docker image from that Nexus repo and pushes it into a Docker registry on a local host.
A script that pulls each image from the local Docker registry and pushes it into the aforementioned GitLab project's Container Registry.
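A minimal sketch of those steps in shell, assuming curl, jq, and the Docker CLI are available. Every host name, port, repo name, and project path below is a placeholder, and the intermediate local registry is collapsed into a local `docker tag`; the NXRM3 components endpoint is real but paginated, so large repos need the `continuationToken` followed:

```shell
#!/bin/sh
# Sketch only: every host, repo and project name here is a placeholder.
set -eu

NEXUS_HOST="nexus.example.com"            # assumed Nexus base host
NEXUS_DOCKER="nexus.example.com:8082"     # assumed docker connector port
GITLAB_REGISTRY="registry.gitlab.example.com"
GITLAB_PROJECT="mygroup/myproject"        # assumed GitLab project path

# 1. List "name:tag" pairs in a Nexus docker repo via the NXRM3 REST API.
#    The endpoint is paginated; follow .continuationToken for large repos.
list_images() {  # $1 = Nexus repo name
  curl -sfu "$NEXUS_USER:$NEXUS_PASS" \
    "https://$NEXUS_HOST/service/rest/v1/components?repository=$1" |
    jq -r '.items[] | "\(.name):\(.version)"'
}

# Maps a "name:tag" image reference to its path in the GitLab registry.
gitlab_ref() {  # $1 = image "name:tag"
  echo "$GITLAB_REGISTRY/$GITLAB_PROJECT/$1"
}

# 2+3. Pull from Nexus, retag locally, push to the GitLab registry.
migrate_image() {  # $1 = image "name:tag"
  docker pull "$NEXUS_DOCKER/$1"
  docker tag  "$NEXUS_DOCKER/$1" "$(gitlab_ref "$1")"
  docker push "$(gitlab_ref "$1")"
}

# Example driver (needs reachable registries and prior docker logins):
# list_images docker-hosted | while read -r img; do migrate_image "$img"; done
```

The same pull/tag/push loop generalizes to the other Nexus formats only via their own clients (mvn deploy, curl for raw, etc.), which is part of why the per-format mapping is called out above as a separate challenge.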
Related
I am using Azure DevOps for CI/CD pipelines, and one thing I couldn't figure out is how to get images from Azure Container Registry and run them inside my on-premise Debian server (I have already built and pushed the images to ACR).
In the release pipeline section:
I set Azure Container Registry as the source type and selected the latest artifact from there,
then created a dev stage,
and added a Command Line task to execute some scripts on the server.
-->This is the step where I am lost<-- I need to somehow get these images from the registry and fire the compose up command on the server.
Extra info: you cannot use your docker-compose.yml (mine is inside the Azure repo in the same project) because you are using ACR as the source type, not the build itself or the source code.
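One common pattern for the missing step above is to have the command-line task run a script on the Debian server (e.g. over SSH) that logs in to ACR, pulls the image, and runs compose up against a compose file kept on the server itself. This is a sketch under assumptions: the registry name, image tag, compose file path, and credential variable names below are all placeholders, and Docker Compose must already be installed on the server:

```shell
#!/bin/sh
# Placeholder values: replace ACR_NAME, IMAGE and the compose file path.
set -eu

ACR_NAME="myregistry"                 # assumed ACR registry name
IMAGE="myapp:latest"                  # assumed image repository:tag

# Builds the fully qualified ACR image reference.
acr_ref() { echo "$ACR_NAME.azurecr.io/$1"; }

# Run on the on-prem server from the release stage's command-line task.
deploy() {
  # Service-principal login; AZ_SP_ID / AZ_SP_SECRET would come from
  # pipeline secret variables (names assumed here).
  docker login "$ACR_NAME.azurecr.io" -u "$AZ_SP_ID" -p "$AZ_SP_SECRET"
  docker pull "$(acr_ref "$IMAGE")"
  # Compose file kept on the server, since the ACR artifact carries none.
  docker compose -f /opt/myapp/docker-compose.yml up -d
}

# deploy   # uncomment when running on the target server
```

Keeping the compose file on the server (or copying it there in an earlier task) works around the limitation noted above that an ACR source type does not carry the repo's docker-compose.yml.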
*In my humble opinion this is a good case to enrich the content, because there is a lack of docs and videos on the internet.
thanks,
Boss wants me to set up a pipeline in Azure DevOps to our GitLab repos. I have a few questions:
Do I set it up under "Git other"? Should I mirror the repositories into Azure DevOps?
I am supposed to set it up with a docker image, do I need to use docker hub?
I've never set up a pipeline and I am just a lost intern, thanks for any advice.
Do I set it up under "Git other"?
Yes, you could use "Git other" to create a service connection for GitLab. There is also an extension, GitLab Integration for Azure Pipelines, which can download the sources from a GitLab repository (using the clone command) and use the downloaded sources in Azure Pipelines.
Should I mirror the repositories into Azure Devops?
If you have no plans to migrate the GitLab repo to an Azure DevOps repo, you do not need to mirror the repositories into Azure DevOps. Besides, just as LJ said, since the YAML structure does not support GitLab at this moment, we cannot use the YAML structure with a GitLab repo.
I am supposed to set it up with a docker image, do I need to use docker hub?
This is a matter of taste. In addition to Docker Hub, you can also use Azure Container Registry.
Do I set it up under "Git other"? Should I mirror the repositories into Azure Devops?
If you want to set up the pipeline using the YAML structure and have all the features that Azure DevOps provides, you have to mirror the repository, since it is not yet possible to run YAML pipelines directly from GitLab, and the "Git other" connection has some limitations.
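The mirroring mentioned above can be done with a bare clone and a mirror push; both remote URLs below are placeholders, and pushing to Azure Repos will prompt for (or need embedded) credentials such as a PAT:

```shell
#!/bin/sh
# Placeholder URLs: substitute your own GitLab and Azure DevOps remotes.
set -eu

SRC="https://gitlab.example.com/mygroup/myrepo.git"       # assumed source
DST="https://dev.azure.com/myorg/myproject/_git/myrepo"   # assumed target

# One-shot mirror: clone all refs from GitLab, push them all to Azure Repos.
mirror_repo() {
  git clone --mirror "$SRC" repo.git
  cd repo.git
  git push --mirror "$DST"
}

# Re-run periodically (or schedule `git remote update` followed by
# `git push --mirror "$DST"` inside repo.git) to keep the copy in sync.
# mirror_repo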
I am supposed to set it up with a docker image, do I need to use docker hub?
For the pipeline environment, you can use VM Images provided by Azure.
I would like to know if the Community Edition of GitLab (self-hosted) has support for a CI process that would include:
Creating Docker containers (docker-compose or Dockerfile)
Saving custom Docker images to a Docker registry in GitLab
Integrating Ansible to manage configuration items
I checked the pricing/feature comparison section of their website but I wasn't able to find this info. Wondering if any fellow Stack Overflowers who are already using this framework can help out.
Thanks
You can sign up for a free GitLab CE account and try it out. Here's what I found:
Docker Images and Registry
GitLab CE comes with a Package Registry (Maven, NuGet, PyPI, etc.) and a Container Registry (for Docker images). However, I don't think GitLab will provide free storage for your packages; the benefit is full integration into your CI/CD pipeline. You'll need to host your own server, as described in How to Build Docker Images and Host a Docker Image Repository with GitLab. The official docs are at GitLab Container Registry.
Once you've set up your Docker images/registry, you can configure GitLab CE to run various jobs in Docker containers. It's as easy as specifying the name of the image in the image field in .gitlab-ci.yml. The images run on an application called a Runner, on a VM or bare metal. You can set up your own Runners in AWS/GCP/Azure or even on your own laptop, but GitLab also provides 2000 free pipeline minutes per month hosted on GCP. Instructions for setting up GitLab Runners can be found at https://docs.gitlab.com/runner/.
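As an illustration of the image field mentioned above, a minimal .gitlab-ci.yml that builds a project's Dockerfile and pushes the result to the built-in Container Registry might look like this; the CI_REGISTRY* variables are predefined by GitLab, while the job name, stage, and image tags are arbitrary choices:

```yaml
# Minimal sketch: builds the project's Dockerfile and pushes the result
# to the built-in Container Registry using GitLab's predefined variables.
stages:
  - build

build-image:
  stage: build
  image: docker:24                  # the job itself runs in this image
  services:
    - docker:24-dind                # Docker-in-Docker for `docker build`
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
```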
Ansible Integration
Once you get your Docker images/registry and Runners set up, you can store the Ansible binaries and dependencies in the Docker images. You can then execute the playbooks from the script section of a job defined in .gitlab-ci.yml, using Ansible and GitLab for Infrastructure-as-Code. You can read some great tutorials here and here.
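A sketch of the Ansible pattern described above, assuming a custom image that already contains Ansible; the image name, inventory, and playbook paths are placeholders:

```yaml
# Sketch: runs a playbook from a CI job; image and file paths are assumed.
deploy:
  stage: deploy
  image: registry.example.com/tools/ansible:latest   # assumed custom image
  script:
    - ansible-playbook -i inventories/prod.ini site.yml
```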
When using the GitHub connection with Azure DevOps pipelines, I see that you can specify an image in your azure-pipelines.yml file (the vmImage: option).
But when using generic git connection you only have a dropdown with just some options:
Is there any way to add custom Docker images to be used? I would need different NodeJS versions for different pipelines, and here the ubuntu 18.04 image, for example, only has v10.
The option to choose a Docker container is only available in YAML pipelines, which are not supported with the other Git connection.
You can use a self-hosted agent and run it in Docker containers, so you need to configure a few images in your environment and connect them to Azure DevOps.
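Starting such containerized agents roughly follows Microsoft's "Run a self-hosted agent in Docker" guide: you build an agent image locally (azp-agent:local below is that local build, not an official image) and start one container per agent, with a different toolchain image per NodeJS version if needed. The organization URL, pool name, and PAT variable are placeholders:

```shell
#!/bin/sh
# Placeholder org URL and PAT; azp-agent:local is an image you build
# yourself from Microsoft's Dockerfile for self-hosted agents.
set -eu

start_agent() {  # $1 = agent name
  docker run -d \
    -e AZP_URL="https://dev.azure.com/myorg" \
    -e AZP_TOKEN="$AZP_PAT" \
    -e AZP_AGENT_NAME="$1" \
    -e AZP_POOL="Docker-Agents" \
    azp-agent:local
}

# One container per agent; point classic pipelines at the Docker-Agents pool.
# start_agent node14-agent
# start_agent node16-agent
```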
For my use case I need to use CodeCommit repositories. But I would also like to use GitLab GUI and features.
If I install GitLab on my server, is there a way to either connect it to CodeCommit repos directly (I just need to browse commits there) or set it up as a mirror for CodeCommit, so it would contain copies of all CodeCommit repos?
It should be possible to mirror your GitLab repository to an AWS CodeCommit repository; this GitLab doc explains it. Basically, it helps you set up a CodeCommit repository to use as a replica; then you can set up a recurring job/CodePipeline to act upon code changes.
The prerequisite is an IAM user in the AWS account that holds the CodeCommit repo, whose Git credentials are used to mirror the GitLab repo into CodeCommit.
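The setup above can also be scripted against GitLab's remote mirrors API; the project ID, region, repo name, and the IAM Git credential variables below are placeholders, and the credentials are embedded in the mirror URL, which is how GitLab authenticates the push mirror to CodeCommit:

```shell
#!/bin/sh
# Placeholders throughout: project ID, region, repo name, IAM credentials.
set -eu

# Builds a CodeCommit HTTPS URL with IAM Git credentials embedded.
codecommit_url() {  # $1=user $2=password $3=region $4=repo
  echo "https://$1:$2@git-codecommit.$3.amazonaws.com/v1/repos/$4"
}

# Creates a push mirror on GitLab project 42 (assumed ID) via the API;
# GITLAB_TOKEN, IAM_USER and IAM_PASS are assumed environment variables.
create_mirror() {
  curl -sf --request POST \
    --header "PRIVATE-TOKEN: $GITLAB_TOKEN" \
    --data-urlencode "url=$(codecommit_url "$IAM_USER" "$IAM_PASS" eu-west-1 myrepo)" \
    --data "enabled=true" \
    "https://gitlab.example.com/api/v4/projects/42/remote_mirrors"
}

# create_mirror
```

Using --data-urlencode matters here because IAM Git credentials often contain characters that would otherwise break the embedded URL.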
Yes. Create a new repository, go to Settings -> Repository, and you will find the mirroring function there. It will copy the remote repository a few times per day.