I'm trying to configure an Azure Logic App with the Azure DevOps connector to create a new release.
The pipeline needs two variables, system and scope.
I have been looking at the Azure DevOps connector doc:
https://learn.microsoft.com/en-us/connectors/visualstudioteamservices/#create-a-new-release
How should I configure the name and value in the Logic App? I tried this, but it didn't work:
How do I pass those variables to the release pipeline in Azure DevOps?
Apparently, adding the connector from Visual Studio produces an incomplete connector; deleting it and recreating it from the portal exposed the key and value options.
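For context, the connector action in the linked doc ultimately maps the name/value pairs onto the variables of a Create Release call against the Azure DevOps Release REST API. A minimal sketch of that payload shape, assuming placeholder values (the definition id 12 and the "SAP"/"full" values are illustrative only):

```python
import json

# Sketch of the request body the "Create a new release" step ultimately sends
# to the Azure DevOps Release REST API. Each variable must be wrapped in a
# {"value": ...} object; bare strings are not accepted.
def build_release_body(definition_id, variables):
    """Build a Create Release payload with release-scoped variables."""
    return {
        "definitionId": definition_id,
        "variables": {name: {"value": value} for name, value in variables.items()},
    }

body = build_release_body(12, {"system": "SAP", "scope": "full"})
print(json.dumps(body))
```

The key point is the extra `{"value": ...}` wrapper around each variable, which is what the connector's key/value fields produce for you.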
I have 2 .NET 5 Azure Function projects in Visual Studio. These 2 Functions are to be deployed to a single Azure Function App.
I have 2 separate build pipelines, one for each project, and separate release pipelines as well in Azure DevOps. I am not using YAML for the release pipelines but rely on the DevOps UI to create a new release pipeline.
My issue is that whenever a release pipeline runs, the latest release overwrites the existing Function in the Azure Function App, so I am not getting 2 Functions under a single Function App as expected; only the last release deployment exists in Azure. Why does one release deployment overwrite existing Functions, even though they come from different release pipelines?
The MS documentation states that if you perform a "Zip Deployment" to an Azure Function, it will always overwrite what is running on the Azure Function service. I presume that the classic pipeline uses the "Zip Deployment" option.
Link to MS Documentation: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-function-app?view=azure-devops#deployment-methods
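For reference, the same behavior shows up in the YAML equivalent of the classic task; a hedged sketch based on the linked doc (the service connection and app name below are placeholders):

```yaml
# Sketch only: zipDeploy (and runFromPackage) replace the entire site content,
# which is why the second release wipes out the first function.
- task: AzureFunctionApp@1
  inputs:
    azureSubscription: 'my-subscription'   # placeholder service connection
    appType: functionApp
    appName: 'my-function-app'             # placeholder Function App name
    package: '$(Pipeline.Workspace)/drop/*.zip'
    deploymentMethod: zipDeploy            # auto | zipDeploy | runFromPackage
```

In other words, two pipelines that each deploy a zip containing only their own project will always replace each other's content.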
For the moment, I have a ticket open with Microsoft on how to manage different apps on the same Azure Function service. I only want changes applied to the specific app where there are code changes, without impacting any other existing apps on the Function App. So far I have only had one troubleshooting session, with no solution in sight. I will share more information once I have received a solution or workaround.
Can we migrate published test results while migrating builds from one project to another in Azure DevOps, or is there any other way to store those results?
Create a script using Azure DevOps Services REST API:
Get all Runs from old Azure DevOps: https://learn.microsoft.com/en-us/rest/api/azure/devops/test/runs/list?view=azure-devops-rest-6.0
Get all results for each run from old Azure DevOps: https://learn.microsoft.com/en-us/rest/api/azure/devops/test/results/list?view=azure-devops-rest-6.0
Create previously saved runs in new Azure DevOps: https://learn.microsoft.com/en-us/rest/api/azure/devops/test/runs/create?view=azure-devops-rest-6.0
Create previously saved results in new Azure DevOps: https://learn.microsoft.com/en-us/rest/api/azure/devops/test/results/add?view=azure-devops-rest-6.0
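The four steps above can be sketched as a script against the listed endpoints. This is a minimal sketch, assuming placeholder organization/project names and PATs; only the URLs and the api-version come from the linked docs, and you would likely need to map more run/result fields than shown here:

```python
import base64
import json
import urllib.request

# Sketch of the migration steps above. Organization/project names and PATs
# are placeholders; the api-version matches the linked docs.
API_VERSION = "6.0"

def auth_header(pat):
    """Azure DevOps PATs go in a Basic auth header with an empty user name."""
    token = base64.b64encode(f":{pat}".encode()).decode()
    return {"Authorization": f"Basic {token}", "Content-Type": "application/json"}

def runs_url(org, project):
    return f"https://dev.azure.com/{org}/{project}/_apis/test/runs?api-version={API_VERSION}"

def results_url(org, project, run_id):
    return (f"https://dev.azure.com/{org}/{project}/_apis/test/runs/"
            f"{run_id}/results?api-version={API_VERSION}")

def call(url, pat, body=None, method="GET"):
    """Issue one REST call and decode the JSON response."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(url, data=data, headers=auth_header(pat), method=method)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def migrate(old_org, old_project, old_pat, new_org, new_project, new_pat):
    """Copy every run and its results from the old org/project into the new one."""
    for run in call(runs_url(old_org, old_project), old_pat)["value"]:
        results = call(results_url(old_org, old_project, run["id"]), old_pat)
        new_run = call(runs_url(new_org, new_project), new_pat,
                       body={"name": run["name"], "automated": True}, method="POST")
        call(results_url(new_org, new_project, new_run["id"]), new_pat,
             body=results["value"], method="POST")
```

Calling `migrate(...)` performs the actual network calls; the helper functions only build URLs and headers.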
Published test results will go into Test Plans -> Runs after the builds. You can go to Runs to check the results.
I have two projects in Azure DevOps and need to trigger a pipeline from a pipeline in the other project. I installed the 'Trigger Azure DevOps Pipeline' extension from the Marketplace.
Please help me with configuring this extension in the YAML file.
[Please see the screenshot of my YAML file and extension]: https://i.stack.imgur.com/RGOJ4.png
The Azure DevOps service connection dropdown of the 'Trigger Azure DevOps Pipeline' extension is not displaying any values.
According to your screenshot, it seems that you haven't created an Azure DevOps API service connection.
You can navigate to Project Settings -> Service connections and create an Azure DevOps API service connection.
Note: the Personal Access Token needs the following rights:
1. Triggering a release: Release - Read, write & execute; Build - Read & execute
2. Triggering a build: Build - Read & execute
Then you can select the Azure DevOps service connection in the target task.
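Once the service connection exists, the task can be referenced from YAML. Treat the task and input names below as assumptions taken from the extension's own documentation, and verify them against the extension page before use; the project and pipeline names are placeholders:

```yaml
# Hedged sketch only: confirm the exact task and input names on the
# extension's Marketplace page before using this.
- task: TriggerPipeline@1
  inputs:
    serviceConnection: 'azure-devops-api'   # the service connection created above
    project: 'TargetProject'                # placeholder: project containing the pipeline
    buildDefinition: 'TargetPipeline'       # placeholder: pipeline to trigger
```

If the service connection dropdown is empty, the connection either does not exist yet or was created with a different type than the extension expects.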
For more information, you could refer to the introduction of the extension.
Hope this helps.
What are the industry standards for developing CI/CD pipelines for an Azure SQL database?
I have an existing Azure SQL database (DEV instance, including schemas, tables, functions, stored procedures, etc.). The code for these objects is hand-written (meaning it was not generated with SSDT schema compare, not scripted from existing tables/SPs/functions, and not a DACPAC/BACPAC file; it is simply the code the developers wrote) and is maintained in a Git repo.
Now my users want to create another database using the scripts the developers uploaded to Git (Bitbucket): identify all the dependencies between DB objects and execute the scripts in the right order to create the new database. Is this the correct approach? Consider this approach 1.
After spending a lot of time on deployments, I am now convinced it is advisable to follow the approach below; call it approach 2:
Create a solution and clone your existing Git repo in Visual Studio.
Import the DB objects from Solution Explorer and push the solution to Git.
Create a build pipeline with steps to build the solution and copy/publish the artifact.
Create a new release pipeline, use the "Azure SQL Data Warehouse deployment" task, and link the DACPAC file (generated dynamically by the step above).
Now, for incremental changes, my assumption is: change the code -> push to Git -> build the solution -> run the release. The DACPAC file generated by the build pipeline will be compared with the current QA database and only the new changes will be applied (behind the scenes, SqlPackage is used for the comparison by the release task).
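For a concrete picture of the release step in approach 2, here is a hedged sketch of the DACPAC deployment task in YAML form (the service connection, server, and database names are placeholders; input names should be checked against the task's documentation):

```yaml
# Sketch of the DACPAC deployment step; SqlPackage performs the schema compare
# against the target database and applies only the differences.
- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: 'my-subscription'         # placeholder service connection
    ServerName: 'myserver.database.windows.net'  # placeholder server
    DatabaseName: 'QA-DB'                        # placeholder target database
    SqlUsername: '$(sqlUser)'
    SqlPassword: '$(sqlPassword)'
    DeployType: 'DacpacTask'
    DeploymentAction: 'Publish'
    DacpacFile: '$(System.DefaultWorkingDirectory)/**/*.dacpac'
```

The incremental behavior you describe is exactly the Publish action: it diffs the DACPAC against the target and generates only the changes to apply.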
Links I have gone through:
Configure CD of Azure SQL database using Azure DevOps and Visual Studio
Please correct me if my understanding is wrong.
Thanks a ton,
A DevOps newbie here.
Azure DevOps provides the Azure SQL Database deployment task to deploy an Azure SQL database.
So approach 2 is the common way. With this task, we can deploy an Azure SQL database using a DACPAC or run scripts using SQLCMD.
You can also reference the following links:
Tutorial: Deploy your ASP.NET app and Azure SQL Database code by using Azure DevOps Starter
DevOps for Azure SQL
Azure SQL Database CI/CD with Azure DevOps
I have multiple pipelines in ADF. How can I monitor them programmatically to get the status of all the pipelines? The difficulty I am facing is that I am not sure where to find the application ID and authentication key for that Data Factory.
You will have to generate the application ID and application key by creating a new application in Azure Active Directory. Once it is created, you then have to assign a role to that app.
You can go through the following link to generate the application ID and application key: Create Azure Active Directory application
For monitoring, there are various ways to monitor a pipeline: a PowerShell script, the .NET SDK, or the Azure portal.
Please check the following link to monitor a pipeline using the .NET SDK: Monitor and manage ADF v1
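Besides the .NET SDK, the same monitoring can be done directly against the management REST API using the application ID and key from the AAD app registration above. A minimal sketch, assuming placeholder tenant/subscription/factory names and a fixed illustrative date window:

```python
import json
import urllib.parse
import urllib.request

# Sketch of programmatic ADF monitoring via the Azure management REST API.
# All tenant/subscription/resource-group/factory names here are placeholders.
RESOURCE = "https://management.azure.com/"

def token_request(tenant_id, app_id, app_key):
    """Build the client-credentials token request for the AAD application."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": app_id,        # the application ID
        "client_secret": app_key,   # the authentication key
        "resource": RESOURCE,
    }).encode()
    return urllib.request.Request(url, data=body)

def runs_query_url(subscription_id, resource_group, factory):
    return (f"{RESOURCE}subscriptions/{subscription_id}/resourceGroups/"
            f"{resource_group}/providers/Microsoft.DataFactory/factories/"
            f"{factory}/queryPipelineRuns?api-version=2018-06-01")

def pipeline_statuses(tenant_id, app_id, app_key,
                      subscription_id, resource_group, factory):
    """Return (pipelineName, status) pairs for recent runs; makes network calls."""
    token = json.loads(urllib.request.urlopen(
        token_request(tenant_id, app_id, app_key)).read())["access_token"]
    query = {"lastUpdatedAfter": "2024-01-01T00:00:00Z",
             "lastUpdatedBefore": "2024-12-31T00:00:00Z"}
    req = urllib.request.Request(
        runs_query_url(subscription_id, resource_group, factory),
        data=json.dumps(query).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    runs = json.loads(urllib.request.urlopen(req).read())["value"]
    return [(r["pipelineName"], r["status"]) for r in runs]
```

Only `pipeline_statuses` performs network calls; the other helpers just build the request and URL, which is where the application ID and key from the AAD app come in.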
Hope this helps,
Jai