Azure Pipelines conditions

I have set up a flow where an Azure release definition makes an agentless API call to an Azure build pipeline, which performs a list of tasks that trigger a release.
I would like to add a condition to the Azure build pipeline to differentiate between a user running the build pipeline manually (through the portal) and the pipeline being triggered via an API call.
What is the neatest way to do this? Ideally I expect a condition something like
eq(triggered-by, "Joe") -> not ideal, I don't want to attach a condition based on a user's name
eq(build-reason, "api") -> ideal, but is there some in-built condition for something like this?
One other option that crossed my mind is passing a custom runtime variable through the API call, but I was wondering whether there is a more built-in approach.
Thanks in advance.

Currently, there is no built-in feature in Azure DevOps to achieve this.
There is no way to tell whether a run was triggered manually or via the API; if you use an API call, the token also represents a single user.
You could also raise a feature request for this: Suggest a feature - Visual Studio (Windows) | Microsoft Docs
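For reference, the custom runtime value workaround mentioned in the question could look something like the sketch below, using a runtime parameter (a close cousin of the custom variable idea). The parameter name triggeredByApi is made up; manual runs from the portal keep the default, and the API caller flips it.

# Hypothetical runtime parameter; manual runs from the portal keep the default of false.
parameters:
  - name: triggeredByApi
    type: boolean
    default: false

steps:
  - script: echo "This run was triggered through the REST API"
    # Pipeline expression string comparison is case-insensitive, so the expanded
    # 'True'/'False' value of the boolean parameter compares cleanly with 'true'.
    condition: eq('${{ parameters.triggeredByApi }}', 'true')

If the agentless call goes through the Runs REST API, it could then pass something like "templateParameters": { "triggeredByApi": true } in the request body.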

Related

Using a Logic App to post in Teams with a link to open the pipeline

I know it's possible to create a Logic App in Azure to send a message in Teams after a pipeline completes. What I don't know is how to include the completed pipeline as a clickable link that opens it.
I couldn't find any information on how to do this.
I will be grateful for your help.
Azure DevOps Pipelines already has official support for integrating with Microsoft Teams, which you could use directly, even without a Logic App.
But if you still need to use a Logic App, what you could do is use the Invoke a REST API task to call your Logic App, and then send an Adaptive Card that you can design with a button linking to the pipeline.
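To make that concrete, here is a rough YAML sketch of the Invoke a REST API task, assuming a generic service connection (called LogicAppConnection here, which is made up) that points at the Logic App's HTTP trigger. The payload field name is also hypothetical and would need to match whatever your Logic App and Adaptive Card expect.

jobs:
  - job: notify_teams
    pool: server   # Invoke REST API is an agentless (server) task
    steps:
      - task: InvokeRESTAPI@1
        inputs:
          connectionType: 'connectedServiceName'
          serviceConnection: 'LogicAppConnection'   # hypothetical generic service connection
          method: 'POST'
          headers: '{"Content-Type": "application/json"}'
          # Pass the run's URL so the Logic App can put it behind the card's button.
          body: '{"pipelineUrl": "$(System.TeamFoundationCollectionUri)$(System.TeamProject)/_build/results?buildId=$(Build.BuildId)"}'
          waitForCompletion: 'false'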

Azure DevOps REST API: how to allow an Azure Function to create work items in a DevOps project?

I'm creating a schedule-triggered Azure Function which will run tests once a day. If any test fails during the run, I want it to create a bug in an Azure DevOps project which includes a log of the failed tests.
I know I could create a PAT so that it can authenticate with the DevOps REST API, but I don't like its downsides:
it can be valid for one year at most, so I will need to remember to extend its expiration period
every bug created like this will have me as its creator
Edit:
I found out I could use MS Flow - there's a DevOps connector that can create work items. It still has the downside of listing me as the work item's creator, but that's not such a pain...
I would still much appreciate learning about other options...
Is there any better way I can let my Azure Function to create bugs on my DevOps project?
An alternative option would be to use an Azure Logic App along with the Azure Function.
Here the Azure Function would call an Azure Logic App directly, and the Logic App would create the bug work item.
Refer to the following article by Stefan Stranger on how to create a Logic App that creates the bug; it uses Azure webhooks for creating the bugs.
You can either send the data to a storage account, where the Logic App picks it up with a trigger and uses it to create the bug, or you can connect to the Logic App directly.
Refer to the following article by Laura Kokkarinen for this.
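To give a feel for the "connect directly" option: the Azure Function would simply POST to the Logic App's HTTP request trigger, roughly like the sketch below. The trigger URL is the one the Logic App designer generates for you, and the payload fields (title, log) are hypothetical and must match the schema you define on the trigger.

curl -X POST 'https://<your-logic-app-trigger-url>' \
  -H 'Content-Type: application/json' \
  -d '{"title": "Nightly test run failed", "log": "3 of 120 tests failed - see attached output"}'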

How to Automate Deployment of Azure Database for PostgreSQL using Azure DevOps

Can someone please help me automate build and release pipelines in Azure DevOps for an Azure Database for PostgreSQL (Single Server) database, so that I can create a database and run different scripts in that database for creating/altering tables, functions, indexes, etc.?
I googled and found nothing in the Microsoft documentation for this purpose, but I did find that it can be done using Zapier.
Per organizational policy, I cannot use Zapier or any other third-party tools/sites.
Is it doable using only Microsoft build and release tasks in Azure DevOps? Can someone please guide me through the steps for this?
Database DevOps is difficult because you have to manipulate existing objects, not simply replace them like you do for application deployments. To do this, you have to add a tool that manages your Data Definition Language queries. Or you can build one. We did that a long time ago. I don't recommend it. Tons of work, lots of issues.
For PostgreSQL, I'd suggest you start testing Flyway. It works really well with Azure DevOps. I have a short video you can use to see it in action. Flyway is open source, so getting started with it is license free. You can install the software, but it also runs through containers, so it makes it really simple to implement through the Azure DevOps agents. The concept is pretty simple. It acts as a marshalling tool to run your DDL in the correct order, like a manifest. Then, it marks the database so it knows which scripts it has already run. You go from there.
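To give a feel for it, here is a rough sketch of a pipeline step that runs Flyway through its official container against an Azure Database for PostgreSQL single server. The server, database, and migrations folder names are placeholders, and flywayPassword is assumed to be a secret pipeline variable.

steps:
  - script: |
      # Mount the repo's SQL migration scripts into the Flyway container and run them in order.
      docker run --rm \
        -v "$(Build.SourcesDirectory)/migrations:/flyway/sql" \
        flyway/flyway \
        -url="jdbc:postgresql://myserver.postgres.database.azure.com:5432/mydb?sslmode=require" \
        -user="myadmin@myserver" \
        -password="$(flywayPassword)" \
        migrate
    displayName: Run Flyway migrations (sketch)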

Is it possible to define parameters through portal for Logic App (Standard)?

I have a single-tenant logic app and a workflow under it that needs a configurable input. In a multi-tenant logic app, one can define parameters through the Azure portal and reference them in the workflow definition (actions/triggers). Is this not possible with a single-tenant logic app?
I am not able to find the answer in the documentation.
I know a deployment template would consult a parameters file for this; however, my question above is specifically about doing this through the portal.
Edit 7/12
I am referring to the parameters concept explained here, and not the Parameters tab of triggers or actions. See below for the parameters that we can define through the portal when working with a Consumption logic app.
The answer is: not yet. Support for parameters in the designer (and therefore in the Azure Portal) is on its way, but not available yet.
In VS Code, you can create a parameters.json file.
But in the portal, there's no option (yet) to create/edit parameters.
Bec Lyons (Microsoft) demoed a version of the designer with this in it, although I can't remember if this was in the June Logic Apps Live session, or in the July Integration Down Under session.
In any case, the only currently supported way to do this is to create a parameters.json file and upload it.
You can do this either from VS Code or the Azure CLI (using the preview logicapps CLI extension), or you can FTP to your Logic App and upload it via an FTP client (e.g. FileZilla) - you can get the FTP login details by clicking the "Get Publish Profile" button in the overview of your Logic Apps Standard resource.
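For what it's worth, the parameters.json file itself is small. A minimal example (the parameter name is made up) looks like the following, and you then reference it from the workflow definition with @parameters('myConfigurableInput'):

{
  "myConfigurableInput": {
    "type": "String",
    "value": "some-default-value"
  }
}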
Once they release support for this in the Portal/Designer, I'll update this answer.
Also, worth noting that as of this date (July 2021), there are issues using parameters in Managed API Triggers - not sure yet if this is by design, or if it's a bug. Specifically the FileSystem, FTP and FTPWithSSH (SFTP) triggers.
Hope this helps. Probably not the answer you were looking for, though!

Retrieve Build Cause/Reason from Azure Pipelines?

I am using the Azure DevOps REST API, for example:
curl -X GET 'https://dev.azure.com/MyOrg/MyProject/_apis/pipelines/ID/runs/ID?api-version=6.0-preview.1'
In the result, there does not seem to be any mention of the parameters passed into the pipeline, or the build cause/reason. I've tried setting variables to the value of the parameters and that doesn't seem to show up either, and fiddling with the API version string hasn't yielded anything.
Is there a way to programmatically retrieve information about the trigger reason and parameters using the API? Without this, it seems impossible to build any backend data (for example, to show the % of users running your pipeline via the Azure portal versus via PR triggers).
You could use the Builds - Get API instead:
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}?api-version=6.0
You would get triggerInfo and reason in the response.
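For example, reusing the curl style from the question (and assuming a PAT in the AZURE_DEVOPS_PAT environment variable, plus jq for filtering), you could pull out just those two fields:

curl -s -u :$AZURE_DEVOPS_PAT \
  'https://dev.azure.com/MyOrg/MyProject/_apis/build/builds/ID?api-version=6.0' \
  | jq '{reason, triggerInfo}'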
