GitLab issue close automatically - gitlab

I wonder if there is a way to close an issue automatically at a certain time, like every Friday at 18:00, if that issue has a certain label or something like that.

GitLab does not include such a feature.
They use their own bot to triage issues and merge requests.

This isn't a feature of GitLab itself. However, you could run a scheduled pipeline that uses the issues API to do this.
To make sure the scheduled pipeline has properly scoped API access, you can generate a project access token and place it in the CI/CD variables.
The scheduled pipeline does not even have to be configured in the same project whose issues you want to expire, if you're concerned about it triggering existing pipeline jobs. For example, you can create a new project called "issue cleanup" and set up the pipeline there to clean up issues of one or more other projects on the schedule.
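As a rough sketch of that approach (not part of the original answer): the job below assumes a project access token with api scope stored in a CI/CD variable named PROJECT_TOKEN and a label named auto-close on the issues to be closed; the "every Friday at 18:00" part is simply the cron expression of the pipeline schedule itself.

# .gitlab-ci.yml job, runs only when started by a pipeline schedule
close_labeled_issues:
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule"
  image: alpine:latest
  script:
    - apk add --no-cache curl jq
    # List open issues carrying the label, then close each one via the issues API.
    - |
      curl --silent --header "PRIVATE-TOKEN: ${PROJECT_TOKEN}" \
           "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/issues?labels=auto-close&state=opened&per_page=100" \
        | jq -r '.[].iid' \
        | while read iid; do
            curl --silent --request PUT --header "PRIVATE-TOKEN: ${PROJECT_TOKEN}" \
                 "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/issues/${iid}?state_event=close"
          done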

Related

Azure DevOps artefact retention

I’ve got a mono repo, which has 10 separate CI/CD pipelines written in YAML.
I’ve noticed lately that we’ve lost a vast number of runs, and some of them had successful production releases.
Am I right in thinking that the project retention settings apply to all pipelines, rather than to individual ones?
I’ve been reading on the MS website and I think that, in order to retain them going forward, I have to use the API via a PowerShell script.
I assume the said script needs to run after a successful deployment to production.
I’m quite surprised that there isn’t a global option to say ‘keep all production releases’
The project retention policy settings apply to all pipeline runs, not to individual ones, so you cannot use them to retain specific successful production releases directly.
To achieve this, you can use a PowerShell script with a condition to retain those specific runs. Add the PowerShell script as the last task of your deployment to check whether the run needs to be retained. Refer to this official doc: https://learn.microsoft.com/en-us/azure/devops/pipelines/build/run-retention?view=azure-devops
Here is an example that retains a run effectively forever, based on a condition:
- powershell: |
    $contentType = "application/json";
    $headers = @{ Authorization = 'Bearer $(System.AccessToken)' };
    # daysValid = 365000 (roughly 1000 years) keeps the run effectively forever.
    $rawRequest = @{ daysValid = 365000; definitionId = $(System.DefinitionId); ownerId = 'User:$(Build.RequestedForId)'; protectPipeline = $false; runId = $(Build.BuildId) };
    $request = ConvertTo-Json @($rawRequest);
    $uri = "$(System.CollectionUri)$(System.TeamProject)/_apis/build/retention/leases?api-version=6.0-preview.1";
    Invoke-RestMethod -Uri $uri -Method POST -Headers $headers -ContentType $contentType -Body $request;
  displayName: 'PowerShell Script'
  condition: succeeded() # replace with your custom condition, e.g. only after a production deployment

Azure Pipelines best practices for multiple services and environments

I was hoping to get some feedback on using Azure Pipelines and what the best practices are for my situation.
We have recently migrated from TFS 2017 and we are in the process of re-writing all our pipelines. Prior to the upgrade we were using classic builds and releases with the legacy build tasks. We would like to set up more useful YAML pipelines.
Let me set the stage with what we currently have:
10+ microservices
10 individual builds that trigger from a folder in the repo for each one
10 releases that get created on successful build
3 environments per release (Dev, QA, UAT)
So in summary... a build of a single microservice triggers off of a commit to a folder in the branch. The successful build then triggers a release to Dev. Once Dev completes, a user can go start a QA deployment by clicking the release.
In the new Azure Pipelines world, what would be the best approach to this model?
We would like to have all the builds happen in a single pipeline (each stage would be a microservice?)
How do we trigger only on a commit to that folder?
What would the CD look like? Should it be in the same pipeline and be a new stage?
How can we easily add environments without having to keep copy/pasting all the code for each environment? Ideally I would like to just be able to add a variable and have a new environment deployed to (see the sketch below)
I am open to any suggestions here. I am ok if I am way off here, I am looking for the best practices and best approach to this.
TIA
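Purely as a hedged sketch (not from the thread above): one common shape is to give each microservice its own YAML pipeline with a path filter, so it only triggers on commits to its folder, and to expand the deployment stages from a parameter list, so adding an environment is a one-line change. The folder name, environment names, and echo steps below are placeholders.

# azure-pipelines-serviceA.yml (placeholder names throughout)
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - services/serviceA

parameters:
  - name: environments
    type: object
    default: [Dev, QA, UAT]

stages:
  - stage: Build
    jobs:
      - job: Build
        steps:
          - script: echo "build serviceA"

  # Stages run in the order they are generated, so Dev deploys before QA, and QA before UAT.
  - ${{ each env in parameters.environments }}:
      - stage: Deploy_${{ env }}
        jobs:
          - deployment: Deploy
            environment: ${{ env }}
            strategy:
              runOnce:
                deploy:
                  steps:
                    - script: echo "deploy serviceA to ${{ env }}"

Whether you keep one pipeline per service or fold everything into a single multi-stage pipeline is a judgment call; the path filters and the template-expression loop work the same way in either layout.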

Azure Pipelines DevOps Not Being Triggered by PR

I've been using Azure Pipelines for a while now and haven't changed my azure-pipelines.yml file here in 2 months. Previously, when there was a new PR, the pipeline would trigger and cause the environment to be built and the tests would be run.
Today, there was a new PR but I noticed that the pipeline was not being triggered. Then, to further test this, I forked, cloned, and branched the repository myself and created another new PR and, again, the pipeline was not triggered.
It's not clear to me where things are getting stuck and it's not clear how one would debug this. I've gone through this Azure DevOps documentation but it wasn't useful. I can manually trigger the pipeline to execute and test the master branch but I don't know how to manually trigger the same thing for a PR. Here's my Azure DevOps page for reference.
Normally, you do not need to configure pr in the YAML script if there is no special requirement; the pull request trigger runs for all branches by default. However, this broke starting 03-13 21:02 (UTC). The breakage was caused on our side; you did not do anything wrong.
A fix is being prepared as quickly as possible.
As Alex said, this implicit trigger is only supported by YAML pipelines, and only if you do not configure pr in the YAML explicitly.
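For reference, an explicit pr trigger in azure-pipelines.yml looks roughly like this (the branch names are only examples):

# Explicit pull request trigger; branch names are examples.
pr:
  branches:
    include:
      - master
      - releases/*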
To avoid getting stuck like this again, besides the method Alex mentioned (adding pr to the YAML as above), you can also use the UI configuration, which has been very stable so far.
Just go to the pipeline definition page => click the three dots in the top-right corner => select Triggers:
You will then see the Triggers tab, with Continuous integration and Pull request validation displayed below. Open Pull request validation and enable "Override the YAML pull request trigger from here":
Additionally, our team has noticed this issue and will post an update here once a fix is released.
Update 3/18/2020:
The fix has been released to all regions. GitHub PR triggers now work as the documentation describes.

In GitLab is it possible to configure a Scheduled Pipeline that runs on all branches periodically?

I am using GitLab for Git version control and GitLab CI / CD for my automated builds. Usually, the builds are triggered by Git repository activity but I also have a weekly build to ensure that projects not under active development continue to work. When there is only a "master" branch on a project, it is easy to ensure a weekly build is run on the latest code. When there are multiple branches in a project, I would like to repeat the pipeline work for each of them in turn.
What I would like to be able to do is schedule a build (weekly, fortnightly or monthly) that runs on all current branches visible in Git. Is that possible within GitLab's Continuous Delivery system?
The motivation behind doing this is to ensure that external activity, such as tool and library updates, does not introduce an issue without it being promptly visible. Assuming there is reasonable automated testing, coverage, and comprehensive builds for the target platforms, a monthly build with the latest tools should highlight problems promptly. This is better than an invisible mountain of problems accumulating while a project is shelved for a few years (or months). Sometimes all that is required is occasional maintenance.
There are only a handful of feature branches and release lines on the projects currently. I would not expect that number to grow significantly. There is time enough over a weekend to run the required pipelines dozens if not hundreds of times at present.
Ideally, I would like something straightforward to set up. I cannot see anything in the admin GUI that would allow this at present. I did look at the API and I can see there is some scope there to script the addition and removal. Perhaps some script that is run once a month to create new Scheduled pipelines based on git branches is the only way. A pre-made solution on those lines would be perfectly acceptable. If nothing exists I might start work on something like that in time.
I am currently running GitLab Community Edition 11.2.3 06cbee3 (GitLab CE 11.2.3). If there is an Enterprise Edition-only answer, that is fine and will add to the justification for purchasing the EE version. I would prefer a CE answer over an EE one, though.
You cannot set a schedule for all branches at once; you have to configure one schedule per branch yourself.
"Perhaps some script that is run once a month to create new Scheduled pipelines based on git branches is the only way."
I would go that way.
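As an alternative sketch of that script (my own illustration, not from the answer): rather than creating one schedule per branch, a single scheduled job can list every branch and start a pipeline on each via the API. It assumes a token with api scope stored in a CI/CD variable named API_TOKEN.

# .gitlab-ci.yml job, runs only when started by a pipeline schedule
run_all_branches:
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule"
  image: alpine:latest
  script:
    - apk add --no-cache curl jq
    # List every branch, then start a fresh pipeline on each one.
    - |
      curl --silent --header "PRIVATE-TOKEN: ${API_TOKEN}" \
           "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/repository/branches?per_page=100" \
        | jq -r '.[].name' \
        | while read branch; do
            curl --silent --request POST --header "PRIVATE-TOKEN: ${API_TOKEN}" \
                 "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/pipeline?ref=${branch}"
          done

Pipelines started this way have source "api" rather than "schedule", so the job does not re-trigger itself.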

Change webjob from on demand to schedule

We have an azure webjob that was deployed as on demand. Is there a way to change this to run on a schedule without redeploying?
Not a lot of info out there on this.
I tried creating a new schedule collection like this and adding a job to run the existing webjob, but that didn't seem to work either.
I prefer to do this in the GUI portal, but if it's not possible there, I'll do it in PowerShell (if that is possible).
(Also, if it can only be changed by redeploying, I need to know that, and it effectively answers the question.)
To easily add a schedule to your triggered (on-demand) WebJob, add a file called settings.job at the root of your WebJob with this content:
{"schedule": "the schedule as a cron expression"}
Find out more about this here.
Note: it will only work properly for Standard or Premium sites and requires you to set the site to Always On.
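For example (my own illustration), a settings.job that runs the WebJob every Friday at 18:00 would contain a six-field CRON expression, seconds first:

{ "schedule": "0 0 18 * * 5" }

Here 5 is Friday (0 = Sunday).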
In the end, the link I referenced in the original post worked. The thing that was missing for me was the understanding that creating a trigger job in the scheduler will not affect the run status (on-demand or scheduled) of the WebJob itself. In my case it stayed "on-demand", but the schedule was in fact running it.
This should point you in the right direction on how to do this via PowerShell. It looks possible to add already existing WebJobs to a scheduler.
Create a Scheduled Azure WebJob with PowerShell
