Creating a single bitbucket-pipelines.yml file for multiple projects - bitbucket-pipelines

I have a large number of microservices (40+) with identical pipeline requirements (currently very simple: build, test, deploy). Each of them lives in its own repository. Obviously, I would like to avoid having to change my bitbucket-pipelines.yml file in 40 places as we improve our pipeline. I know Travis and GitLab both offer an import/include feature that allows you to include a 'master' yml file. Is there anything similar for Bitbucket Pipelines? If not, what alternatives are viable?
Thanks!

You should be able to do that by using a custom Bitbucket Pipe. For example, you can build a custom pipe to deploy your code and add it to your bitbucket-pipelines.yml. Here is an example:
pipelines:
  default:
    - step:
        name: "Deployment step"
        script:
          - pipe: your-bitbucket-account/deploy-pipe:1.0.0
            variables:
              KEY: 'value'
Here is a link to an article that explains how to write a custom pipe: https://bitbucket.org/blog/practice-devops-with-custom-pipes-reusing-ci-cd-logic
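If you go that route, the pipe itself is just a small repository containing the deploy script, a Dockerfile, and a pipe.yml metadata file. A heavily simplified sketch of the metadata, with every name either taken from or invented to match the example above:

```yaml
# pipe.yml in the hypothetical deploy-pipe repository (names are illustrative)
name: deploy-pipe
image: your-bitbucket-account/deploy-pipe:1.0.0
description: Shared deploy logic for all 40+ microservice repositories.
variables:
  - name: KEY
    default: 'value'
```

Bumping the tag (1.0.0) in each repository's bitbucket-pipelines.yml is then the only per-repository change needed when the shared logic evolves.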

This works for me; I had to set up a Bitbucket pipeline for a solution with multiple projects.
You will have to add custom: after the pipelines: attribute, before the name of each pipeline:
pipelines:
  custom:
    deployAdminAPI:
      - step:
          name: Build and Push AdminAPI
          script: enter your script
    deployParentsAPI:
      - step:
          name: Build and Push ParentsAPI
          script: enter your script
Here is the Link to the Solution on Bitbucket

Related

exclude project from GitLab CI/CD pipeline

I have two sub-projects in my repo. The first one is .NET 5; the second is a SharePoint solution. Both projects are located in one branch.
How can I configure the pipeline for only one project?
I need to pass the SAST test only for the .NET 5 project. Currently both projects are being tested.
In my .gitlab-ci.yml:
include:
  - template: Security/SAST.gitlab-ci.yml

stages:
  - test

sast:
  stage: test
  tags:
    - docker
First off - putting multiple projects into a single repo is a bad idea.
Second, the Gitlab SAST template is meant to be "simple to use", but it's very complex under the hood and utilizes a number of different SAST tools.
You do have some configuration options available via variables, but those are mostly specific to the language/files you are scanning. For some languages there are variables available that can limit which directories/paths are being scanned.
Since you haven't provided enough information to make a specific recommendation, you can look up the variables you need for your project(s) in the official docs here: https://docs.gitlab.com/ee/user/application_security/sast/#available-cicd-variables
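For instance, if the SharePoint solution lives in a directory named SharePointProject (the directory name here is an assumption), the documented SAST_EXCLUDED_PATHS variable could keep it out of the scan:

```yaml
include:
  - template: Security/SAST.gitlab-ci.yml

variables:
  # Comma-separated paths excluded from SAST scanning (directory name is hypothetical)
  SAST_EXCLUDED_PATHS: "SharePointProject"
```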
If each project can have specific branch names, you can use the only/except or rules keywords to determine when jobs should run.
sast:
  stage: test
  tags:
    - docker
  only:
    - /^sast-.*$/
This works for specific jobs; I am not sure about the whole pipeline.
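With the newer rules syntax, the same idea can be tied to changed paths instead of branch names. A sketch assuming the .NET code lives in a DotNetProject directory (the path is a placeholder):

```yaml
sast:
  stage: test
  tags:
    - docker
  rules:
    # run only when files under the .NET project change (path is hypothetical)
    - changes:
        - "DotNetProject/**/*"
```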

Is there a way to translate a yaml build pipeline to a release pipeline using Azure Devops?

In my build pipeline I am running Pester tests and reporting to the output folder. During release I want to run these and only go to the next stage (deploy) if all tests pass. I am not concerned about the results, I just want to make sure that they all pass. I have tried to 'copy' my existing yaml file using the GUI however not sure of the best way to do this and variable scope. Is it possible to translate a build pipeline directly to an Azure release pipeline in Azure Devops and if so, what is the best approach for doing this?
I tried to translate it to JSON; however, I had an undefined error when I tried to import the JSON pipeline. I should also add that I can't see a solution by looking at the preview features for this.
I am afraid it is not possible to translate a build pipeline directly to an Azure release pipeline.
You need to manually add the same tasks which are in the build pipeline to the release stage. I believe it is not complex work to create a stage with the same tasks in the release pipeline. You just need to add each task with the same configuration (copy and paste mostly), and create the same variables in the Variables tab. Check the document about the classic CD release pipeline.
There is another workaround: use a multiple-stage yaml pipeline instead of a classic release pipeline.
With a multiple-stage yaml pipeline you can easily move your build yaml content into a stage. Check here for more information. See below:
trigger: none

stages:
- stage: Test
  jobs:
  - job: PesterTests
    displayName: Pester tests
    pool:
      vmImage: windows-latest
    steps:
      ..
- stage: Deploy
  jobs:
  - job: DeployJob
    pool:
      vmImage: windows-latest
    steps:
      ..
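To make the pass/fail gate concrete, the Test stage's steps could run Invoke-Pester, which exits with a non-zero code on failed tests and therefore fails the stage; the test path and the Pester install step below are assumptions. Later stages only run if earlier ones succeed, which can also be made explicit:

```yaml
stages:
- stage: Test
  jobs:
  - job: PesterTests
    pool:
      vmImage: windows-latest
    steps:
    - pwsh: |
        # Install and run Pester; -CI makes it exit non-zero when tests fail
        Install-Module Pester -Force -Scope CurrentUser
        Invoke-Pester -Path ./tests -CI
- stage: Deploy
  dependsOn: Test          # explicit, though stages depend on the previous one by default
  condition: succeeded()   # deploy only when every test passed
  jobs:
  - job: DeployJob
    steps:
    - script: echo deploying
```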

Can the build policies on a pull request in Azure Devops use the yaml-files that is updated in said PR?

We have all our pipelines checked into code, but if we deliver a PR with changes in those pipelines, the PR Build Policies will run with the YAML-files in MASTER, not the ones that we want to check into master. It's basically a deadlock.
Say you want to remove a validation that makes all your PRs fail, so you make a PR but you can't merge it cause the build policies fail :P
PS: I know I can remove the policies, complete the merge, and add the policies back in as a manual job, but that is not really a good solution.
Create a separate yaml pipeline with the pre-merge build steps that you then set in the PR policies. It will always run the code from the current branch that the PR is created from.
We do it like this (all in the same repo):
build_steps.yml - yaml template with the build steps
azure-pipelines.yml - main pipeline that references build_steps.yml to build the project
pre_merge.yml - secondary pipeline that is run only by PRs and also references build_steps.yml, so there are no differences between the builds and no two places to update if something changes.
Whole yaml definition:
Whole yaml definition:
#pre_merge.yml
trigger: none #Pipeline should never trigger on any branches. Only invoked by the policy.
variables:
- name: system.buildConfiguration
value: 'Release'
- name: system.buildPlatform
value: 'win10-x86'
- name: system.debug
value: 'false'
pool:
vmImage: 'windows-latest'
name: $(SourceBranchName)_$(date:yyyyMMdd)$(rev:.r)
steps:
- template: build_steps.yml
Then, in the branch policies, you set this pre_merge.yml pipeline as the required build validation.
All of this also applies to classic pipelines. You need to create a separate pre-merge build pipeline that could reference a Task Group with the steps used in the main build pipeline. In both cases you don't have to use a template or Task Group and can create the steps manually, but if the build changes in the future you have two places to update.
Just throwing this out there as an alternative: if desired, you can maintain one yaml pipeline that does both the pre-merge validation and the deployment.
You would want to create a variable that detects whether the pipeline is running against the master/main branch, similar to:
IsMainBranch: $[eq(variables['Build.SourceBranch'], 'refs/heads/master')]
From there, the variable needs to be a condition on subsequent jobs/stages. Personally I do not like this approach, as it limits the portability of the pipelines and interjects another variable to account for; however, I felt it fair to provide an alternative option.

How to use a single agent for multiple jobs/stages in azure devops YAML pipeline

I am working on an Azure DevOps YAML pipeline, and I am not sure whether we can use a single agent throughout the pipeline.
I have multiple jobs/stages - Build, Deploy, Post-Deploy - and want to assign them to a single agent, because they consume the same artifacts.
Is there a way to assign a single agent throughout the pipeline?
Thanks in advance.
I don't want the agent to do the checkout operation every time for a new job.
Use the checkout keyword to configure or suppress this behavior:
steps:
- checkout: none
You can refer to this official document for details.
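For example, a later job could skip cloning entirely and reuse the sources already on disk; job names here are illustrative, and note this only helps if both jobs actually land on the same agent, which is what the other answers address:

```yaml
jobs:
- job: Build
  steps:
  - checkout: self     # clones the repository once
  - script: echo build
- job: Deploy
  dependsOn: Build
  steps:
  - checkout: none     # skip cloning; reuse what is already in the agent's work directory
  - script: echo deploy
```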
Yes, you can define a specific agent on YAML.
E.g:
pool:
  name: AgentPoolName
  demands:
  - agent.name -equals AgentName
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/demands?view=azure-devops&tabs=yaml
If you cannot use a specific agent as @Savio Moreira suggested, the only option that I found is to duplicate some steps.
In my case I have a similar pipeline with 2 stages:
Build (to validate the PR):
  Visual Studio build
  Visual Studio Test
  VS test
Publish (to publish the artifact after merge into master):
  Visual Studio build
  Copy file
  Publish build artifacts
The Build stage is triggered only when a PR is created, using a condition in the YAML stage and enabled branch policies.
The Publish stage is triggered only when there is a merge into master.
It's a bit annoying that some steps need to be duplicated, but each execution is unique, and with the same pipeline I can run validation before the merge and then create the artifact once the code is in master.
The checkout option doesn't help since in my case each stage is executed on a completely different container.
Yes, there is a way: store the agent name of your first job in a variable, and then make every other job demand the same agent. It's done like this:
jobs:
- job: A
  steps:
  - bash: |
      echo "##vso[task.setvariable variable=AgentName;isoutput=true]$(Agent.Name)"
    name: passOutput
- job: B
  dependsOn: A
  variables:
    AgentName: $[ dependencies.A.outputs['passOutput.AgentName'] ]
  pool:
    name: pool1
    demands: Agent.Name -equals $(AgentName)
  steps:
  - bash: |
      echo $(AgentName)

In Gitlab CI, can you "pull" artifacts up from triggered jobs?

I have a job in my gitlab-ci.yml that triggers an external Pipeline that generates artifacts (in this case, badges).
I want to be able to get those artifacts and add them as artifacts to the bridge job (or some other job) on my project so that I can reference them.
My triggered job looks like this:
myjob:
  stage: test
  trigger:
    project: other-group/other-repo
    strategy: wait
I'd like something like this:
myjob:
  stage: test
  trigger:
    project: other-group/other-repo
    strategy: wait
  artifacts:
    # how do I get artifacts from the job(s) on other-repo?
    - badge.svg
Gitlab has an endpoint that can be used for the badge url for downloading the artifact from the latest Pipeline/Job for a project
https://gitlabserver/namespace/project/-/jobs/artifacts/master/raw/badge.svg?job=myjob
Is there a way to get the artifacts from the triggered job and add them to my project?
The artifacts block is for handling archiving artifacts from the current job. In order to get artifacts from a different job, you would handle that in the script section. Once you have that artifact, you can archive it normally within the artifacts block as usual.
You can use wget to download artifacts from a different project as described here
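As a sketch of that approach, a follow-up job could fetch the badge from the downstream project via GitLab's job-artifacts API and then archive it normally. The project path, branch, and job name below are placeholders, and the job token must have access to other-repo:

```yaml
fetch-badge:
  stage: test
  needs: [myjob]
  script:
    # download the artifact produced by the 'myjob' job on other-repo's master branch
    - >
      curl --location --output badge.svg
      --header "JOB-TOKEN: $CI_JOB_TOKEN"
      "$CI_API_V4_URL/projects/other-group%2Fother-repo/jobs/artifacts/master/raw/badge.svg?job=myjob"
  artifacts:
    paths:
      - badge.svg
```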
I know it is a bit late, but maybe this could help.
Add this to your job. It tells the job it needs the artifacts from a specific project.
(You need to be owner of the project)
needs:
  - project: <FULL_PATH_TO_PROJECT> # without the hosting website
    job: <JOB_NAME>
    ref: <BRANCH>
    artifacts: true