GitLab CI - Move pipeline logic from a project repo to a centralized "devops-repo"

I have had good experience automating pipeline creation when dealing with a large number of repos.
For example, a project has 20 similar repos containing Java apps (like microservices), and the pipeline for each of them differs only by the repo URL (and a few other minor attributes). The CI/CD process for each of them is the same.
So we can create a separate devops-repo with a declarative configuration for our services. We can also create a single pipeline which pulls the devops-repo and creates all the needed pipelines for each repo in that configuration (this operation only runs once at the beginning, and again whenever we want to change the devops configuration).
I have implemented this using Jenkins. Now I want to do the same using GitLab CI, but I can't figure out how.
Is it possible to create a pipeline from another one (dynamically)?
Any suggestions?

You can use include and put the generic pipeline in your devops repo.
In your Java repos you include the devops pipeline and set the variables that are specific to the respective Java repo.
So the pipeline for your Java repos can be as short as this:
include:
  - project: 'your-group/devops-repository'
    file: '.generic-ci.yml'

variables:
  FOO: bar
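For illustration, the shared .generic-ci.yml in the devops repo could then look roughly like this; the stages, the Maven image and the way FOO is consumed are assumptions for the sketch, not part of the answer above:

# .generic-ci.yml in your-group/devops-repository (sketch)
stages:
  - build
  - test

build:
  stage: build
  image: maven:3-openjdk-17   # assumed Java build image
  script:
    - echo "Building with FOO=$FOO"   # FOO comes from the including project's variables
    - mvn -B package

test:
  stage: test
  image: maven:3-openjdk-17
  script:
    - mvn -B test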

Related

Looking at modifying a pipeline build

I am working for a company that develops a lot of apps, and we use the Azure DevOps portal for all of our pipeline releases etc.
The general flow is that a developer creates a branch to do the development on and changes the code on that branch. They would like to deploy that branch before it is merged back into the development line, from a pipeline rather than from Visual Studio (which is what they currently do). So it's about giving the developer the option to choose which branch to deploy from.
Has anyone done something similar, or can anyone point me in the right direction as to how I could go about this?
When creating the release, there is currently no built-in feature to choose which branch to deploy from; the release pipeline works with the corresponding source artifacts configuration.
A good approach for your scenario is to refer to the build id of your build pipeline (instead of the release pipeline).
When running the build pipeline, you can choose the target branch and record the build id for that run.
In your release pipeline, you can then check whether the source artifacts come from your target branch by checking the build id.
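As a rough sketch of the build side (not from the answer above): a manually queued build pipeline lets the developer pick the branch at queue time, and the predefined Build.SourceBranchName and Build.BuildId variables record which branch and run produced the artifact; the build step itself is only a placeholder:

# azure-pipelines.yml (sketch) - no CI trigger, so the developer picks the branch when queuing
trigger: none

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: |
      echo "Building branch $(Build.SourceBranchName), build id $(Build.BuildId)"
      echo "$(Build.BuildId) $(Build.SourceBranchName)" > "$(Build.ArtifactStagingDirectory)/build-info.txt"
    displayName: Placeholder build step
  - publish: $(Build.ArtifactStagingDirectory)
    artifact: drop   # the release pipeline consumes this and can inspect the build id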

Can I run Azure DevOps pipeline without committing it?

I am planning to experiment with building a pipeline using Azure DevOps. One thing I noticed early on is that, after azure-pipelines.yml is created, I have to commit it first before being able to run it. But I want to experiment with it, which revolves around trial and error. Doing multiple commits just to test things out is not feasible.
In Jenkins I can just define my steps and try to run them without committing the file.
Is this also possible to do in Azure DevOps?
But I want to experiment with it, which revolves around trial and error. Doing multiple commits just to test things out is not feasible.
Yes it is - you just use a different code branch. That gives you the freedom to make as many changes as you need while putting the pipeline together and trying it out, without committing to the master branch.
Then, when you're happy with the way the pipeline is running, you can merge your branch into the master branch which the pipeline normally uses.
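For example, while experimenting you could keep a minimal azure-pipelines.yml on the branch that only triggers there; the branch name pipeline-experiments below is just a placeholder:

# azure-pipelines.yml on the experiment branch (sketch)
trigger:
  branches:
    include:
      - pipeline-experiments   # hypothetical throwaway branch for trial and error

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: echo "Trying out pipeline changes without touching master"
    displayName: Experimental step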
You cannot run YAML pipelines without committing them, but you can create classic pipelines and run them without committing anything pipeline-related to the repository (except for the source code you want to build). Classic pipelines can later be turned (or copy-pasted, to be exact) into YAML pipelines with the View YAML option.
https://learn.microsoft.com/en-us/azure/devops/pipelines/get-started/pipelines-get-started?view=azure-devops#define-pipelines-using-the-classic-interface
If you're on your own branch, or in a repository without any other developers making changes, then you can:
Make a change.
Use git commit --amend to overwrite your previous commit with the new file.
Use git push --force-with-lease to push that up to Azure DevOps.
That way your trial-and-error commits don't pile up in the history while experimenting.

How to create a common pipeline in GitLab for many similar projects

We have hundreds of similar projects in GitLab which have the same structure inside.
To build these projects we use one common TeamCity build. We trigger it and pass the project's GitLab URL, along with other parameters, to the build via the API, so the TeamCity build knows exactly which project to fetch/clone. The TeamCity VCS root accepts the target URL as a parameter.
The question is how to replace the existing TeamCity build with a GitLab pipeline.
I see the general approach is to have the CI/CD configuration file (.gitlab-ci.yml) directly in the project. Since the structure of the projects is the same, duplicating the same CI/CD config file across all projects is not an option.
I'm wondering: is it possible to create a common pipeline for several projects which can accept the target project URL as a parameter?
You can store the full CI/CD config in one repository and put a simple .gitlab-ci.yml in all your projects which includes the shared file.
With this approach there is no redundant definition of the jobs.
Still, you can add other, project-specific jobs (in the respective .gitlab-ci.yml files, or define variables in a project and use some jobs conditionally). You can also include multiple other definition files, e.g. if you have multiple similar projects.
cf. https://docs.gitlab.com/ee/ci/yaml/#include
With recent GitLab versions (13.9+) there are even more referencing methods possible: https://docs.gitlab.com/ee/ci/yaml/README.html#reference-tags
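To illustrate those reference tags: a project can reuse just part of the shared file instead of whole jobs. The hidden .setup job and its script are assumptions for this sketch, not taken from the answer above:

include:
  - project: 'your-group/devops-repository'
    file: '.generic-ci.yml'   # assumed to define a hidden job called .setup

project-specific-test:
  script:
    - !reference [.setup, script]   # reuse only the shared setup commands
    - echo "run this project's own checks"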
As @MrTux already pointed out, you can use includes.
You can either use them to include a whole CI file, or to include just certain steps. In Having Gitlab Projects calling the same gitlab-ci.yml stored in a central location you can find a detailed explanation with examples of both usages.

Azure functions - deploy by project in single repo?

We are building a set of serverless functions in Azure, but are having difficulty deciding how to structure our source (Azure Git) and DevOps to support them.
I am thinking of a single Git repo, with all function apps housed independently within projects. We may have a lot of these function apps, since we see great value in small code segments doing utility-type work, and I don't want dozens and dozens of independent repos just because of DevOps deployments.
Is there a way to have a unique build and release process for each project, not for the repo as a whole? We aren't clear how this can be done and searches have come up empty. I thought it was possible to have unique build YAMLs per project across many projects in a single repo, but it's unclear how to implement the DevOps build and release pipelines to support this approach, i.e. only a single function gets updated and we need to deploy it. Any guidance on whether this is possible and how to approach it would be great.
I haven't done this myself, but I'm in a similar situation where I'd like to have multiple functions (and other stuff) in a single Git repo for simplicity, but only build/deploy them as needed when they change. It looks like you can have multiple pipelines on a single repo with a different YAML file for each pipeline. The steps are documented in this link, and summarized below
In Azure DevOps, create a new Pipeline.
On the "Where is your code?" page, choose the Use the classic editor option at the bottom.
Select your source repo and branch.
On the "Select a template" screen, choose the YAML option at the top. Hit Apply.
There is a YAML file path field where you can specify the path and name of your YAML file for the pipeline.
You may want to set the pipeline to run manually if you don't want a build each time there's a commit to the repo.
EDIT: There may be an easier way to do this now. If you go through the New Pipeline wizard and select your source location, then on the Configure tab, at the bottom, you can choose the Existing Azure Pipelines YAML file option. This lets you select a custom YAML file directly.
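Building on that, each function app can get its own YAML file with a path filter so its pipeline only runs when files in that folder change; the folder and file names here are hypothetical:

# functions/orders-api/azure-pipelines.yml (sketch)
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - functions/orders-api

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: echo "Build and deploy only the orders-api function app"
    displayName: Per-function build placeholder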

Is it possible to script the flow/stages/steps in Azure Pipelines?

I'm trying to set up Azure Pipelines for a CI setup, and I'm using the YAML syntax to get started. However, I was wondering if it is possible to script the flow at "runtime", like you can do in Jenkins scripted pipelines: spawn builds, etc.
Depending on the commit, I want to have a vastly different flow.
This is because I currently have a mono-repo setup with Conan libraries, and I want to rebuild only the libraries that are necessary depending on the commit; thus the build flow is not the same for each commit. I want to spawn jobs so I can take advantage of parallel building on several agents.
For your issue, do you mean triggering builds based on specified commits? If so, you can trigger builds by adding a tag trigger in YAML. You can create tags on the commits; if a created tag meets the trigger condition of the tag trigger in the YAML, the build will be triggered.
trigger:
  tags:
    include:
      - v2.*
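If individual jobs then need to know (or run only for) the tag that triggered the build, one option, shown here as a sketch rather than part of the answer above, is a job-level condition on the predefined Build.SourceBranch variable:

jobs:
  - job: release_build
    condition: startsWith(variables['Build.SourceBranch'], 'refs/tags/v2.')   # only runs for v2.* tag builds
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - script: echo "Triggered by tag $(Build.SourceBranchName)"
        displayName: Tag-triggered job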
