What are the industry standards for developing CI/CD pipelines for an Azure SQL database?
I have an existing Azure SQL database (a DEV instance that includes schemas, tables, functions, stored procedures, etc.). The code for these objects is hand-written (not generated with SSDT schema compare, not scripted from existing tables/stored procedures/functions, and not a DACPAC/BACPAC file; it is just the code the developers wrote) and is maintained in a Git repo.
Now my users want to create another database from the scripts the developers uploaded to Git (Bitbucket), which means identifying all the dependencies between DB objects and executing the scripts in the right order to build the new database. Is this the correct approach? Consider this approach 1.
After investing a lot of time in deployments, I am now fairly convinced it is advisable to follow the approach below; let's call it approach 2:
Create a solution in Visual Studio and clone the existing Git repo into it.
Import the DB objects through Solution Explorer and push the solution to Git.
Create a build pipeline with build solution / copy files / publish artifact steps.
Create a new release pipeline, use the "Azure SQL Database deployment" task, and link the DACPAC file (generated dynamically by the build above).
Now, for incremental changes, my assumption is: change the code -> push to Git -> build the solution -> run the release. The DACPAC generated by the build pipeline is compared with the current QA database and only the new changes are applied; behind the scenes, SqlPackage performs the comparison in the "Azure SQL Database deployment" release task.
Links I have gone through:
Configure CD of Azure SQL database using Azure DevOps and Visual Studio
Please correct me if my understanding is wrong,
Thanks a ton,
A DevOps newbie here.
Azure DevOps provides the Azure SQL Database deployment task to deploy an Azure SQL database, so approach 2 is the common way.
With that task you can deploy an Azure SQL Database using a DACPAC or run scripts using SQLCMD.
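For reference, behind the scenes the task drives SqlPackage (for DACPACs) or SQLCMD (for scripts). A minimal PowerShell sketch of the same operations, assuming hypothetical server, database, and login names and that SqlPackage.exe and the SqlServer module are available on the agent:

```powershell
# Placeholder target details; replace with your own server/database/login.
$server   = "myserver.database.windows.net"
$database = "QA_DB"
$dacpac   = ".\drop\MyDatabase.dacpac"   # artifact produced by the build pipeline

# Incremental deployment: SqlPackage compares the DACPAC with the target database
# and generates/executes only the changes that are needed.
SqlPackage.exe /Action:Publish `
    /SourceFile:$dacpac `
    /TargetServerName:$server `
    /TargetDatabaseName:$database `
    /TargetUser:"sqladmin" `
    /TargetPassword:$env:SQL_PASSWORD `
    /p:BlockOnPossibleDataLoss=true

# Alternative: run plain SQL scripts with SQLCMD (Invoke-Sqlcmd from the SqlServer module).
Invoke-Sqlcmd -ServerInstance $server -Database $database `
    -Username "sqladmin" -Password $env:SQL_PASSWORD `
    -InputFile ".\scripts\post-deployment.sql"
```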
You can also reference the following links:
Tutorial: Deploy your ASP.NET app and Azure SQL Database code by using Azure DevOps Starter
DevOps for Azure SQL
Azure SQL Database CI/CD with Azure DevOps
Related
Is it possible to migrate projects from one organization to another in Azure DevOps?
I am facing challenges managing multiple organizations, so instead I want to migrate all of the projects into a single organization.
Unfortunately, migrating a project from one organization to another cannot be done directly in Azure DevOps. However, there are a few operations we can perform in Azure DevOps to achieve this up to a point.
There are different migration tools like:
Migration Tools for Azure DevOps: https://marketplace.visualstudio.com/items?itemName=nkdagility.vsts-sync-migration
OpsHub Azure DevOps Migrator (formerly known as OpsHub Visual Studio Migration Utility): https://www.opshub.com/products/opshub-azure-devops-migrator/
Alternatively, we can migrate the repos using the built-in Git clone functions and export work item data through Excel reports:
https://learn.microsoft.com/en-us/previous-versions/azure/devops/report/admin/create-status-and-trend-excel-reports?view=tfs-2017&viewFallbackFrom=azure-devops-2020
I am trying to determine how to back up the online ADO account that I created on Microsoft's servers so that I can restore it on my own physical server. I have a few projects already started, along with work items, repositories, pipeline jobs, and NuGet artifacts already in place. It would take quite a while to rebuild the projects manually; not impossible, just not desirable.
I have looked and have not found any resource on how to perform this or whether it is even possible. Any help from someone who knows would be greatly appreciated!
Currently there is an available extension, Azure DevOps Migration Tools, which allows you to migrate Teams, Work Items, Plans & Suites, Shared Queries, and Pipelines from one project to another in Azure DevOps/TFS, both within the same organization and between organizations. See https://nkdagility.github.io/azure-devops-migration-tools/ for the latest guidance.
In addition, there is no such extension for repositories; you could clone an existing Git repo and then push it to a new remote repo server, for example:
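A minimal sketch of that clone-and-push with plain Git commands (organization, project, and repo names below are placeholders):

```powershell
# Mirror-clone the source repo so all branches and tags come along.
git clone --mirror https://dev.azure.com/OldOrg/MyProject/_git/MyRepo
Set-Location .\MyRepo.git

# Re-point the mirror at an empty repo in the target organization and push everything.
git remote set-url origin https://dev.azure.com/NewOrg/MyProject/_git/MyRepo
git push --mirror
```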
By the way, you could use the REST APIs (Artifact Details) to retrieve artifacts and then publish them to a new feed on Azure DevOps Server.
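As a rough illustration only (the route and api-version are assumptions; check the Artifact Details REST reference for the exact endpoints), listing the packages of a feed with a personal access token might look like this:

```powershell
# Placeholder organization/feed names and a PAT taken from an environment variable.
$org  = "MyOrg"
$feed = "MyFeed"
$pat  = $env:AZDO_PAT

# Azure DevOps REST calls authenticate with a PAT via basic auth.
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Enumerate the packages in the source feed; each entry can then be downloaded
# and pushed to the new feed with the matching client (nuget, npm, etc.).
$url = "https://feeds.dev.azure.com/$org/_apis/packaging/feeds/$feed/packages?api-version=7.1-preview.1"
$packages = Invoke-RestMethod -Uri $url -Headers $headers
$packages.value | Select-Object name, protocolType
```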
I am trying to implement source control for database stored procedures. I am using Azure DevOps and I have added all the stored procedures to an Azure Repo.
I am connecting to the Azure repo through Team Explorer in SSMS. I am working on a branch, and once the stored procedure changes are merged into the master branch I want the changes to be deployed automatically to the production database.
Is this possible using an Azure pipeline?
Yes, you can do it with a DACPAC.
You can follow these links:
Use tools that help you integrate and deploy your database along with your code. Build better apps on any client OS (Windows, Linux, and macOS) using our graphical user interface and command-line tools.
DevOps using SQL Server
Continuous Deployment of SQL
The following link has step-by-step instructions on how to set up an Azure pipeline to deploy changes from one database to another:
https://microsoft.github.io/PartsUnlimited/iac/200.2x-IaC-CDAzureSQLdbwithVSTSandVS.html
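As an illustration, here is a sketch of the two halves of such a pipeline in PowerShell (project, server, and credential names are placeholders; the agent needs the SSDT build targets and SqlPackage installed):

```powershell
# Build the SSDT database project that contains the stored procedures;
# the output of the build is a DACPAC under bin\Release.
msbuild .\MyDatabase\MyDatabase.sqlproj /p:Configuration=Release

# Publish the DACPAC: SqlPackage diffs it against the production database
# and applies only the stored-procedure (and other schema) changes.
SqlPackage.exe /Action:Publish `
    /SourceFile:".\MyDatabase\bin\Release\MyDatabase.dacpac" `
    /TargetServerName:"prodserver.database.windows.net" `
    /TargetDatabaseName:"ProductionDb" `
    /TargetUser:"sqladmin" `
    /TargetPassword:$env:SQL_PASSWORD
```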
I have created an Azure Data Factory with a Copy activity using C# and the Azure SDK.
How can I deploy it using CI/CD?
Any URL or link would help.
Data Factory continuous integration and delivery is now possible directly through the web user interface, using ARM templates or even Git (GitHub or Azure DevOps).
Just click on "Set up Code Repository" and follow the steps.
Check the following link for more information, including a video demonstration: https://aka.ms/azfr/401/02
One idea I got from Microsoft was that, using the same Azure SDK, you could serialize the objects and save the JSON files, following the official directory structure, into your local GitHub/Git working directory.
In other words, you would have to mimic what the Save All/Save button in the portal UI does.
Then, using Git Bash, you can just commit and push to your working branch (e.g. develop), and from the UI you can publish (this creates an adf_publish release branch with the ARM objects). For example:
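A minimal sketch of that Git step (branch name and commit message are just examples):

```powershell
# Commit the exported ADF JSON definitions and push them to the working branch.
git checkout develop
git add .
git commit -m "Add Data Factory pipeline and dataset JSON definitions"
git push origin develop
```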
Official reference for CI using VSTS and the UI Publish feature: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment
Unfortunately, CI/CD for ADF is not very intuitive at first glance.
Check out this blog post where I'm describing what/how/why step by step:
Deployment of Azure Data Factory with Azure DevOps
Let me know if you have any questions or concerns, and finally whether this works for you.
Good luck!
My resources on how to enable CI/CD using Azure DevOps and Data Factory come from the Microsoft site below:
Continuous integration and delivery (CI/CD) in Azure Data Factory
I am still new to DevOps and CI/CD, but I do know that other departments have this set up and it appears to be working for them.
I want to run tests using real SQL databases. The SQL Server and the databases would be on Azure. I'm trying to configure a build definition in VSTS where the tasks would set up an Azure resource group with the databases, set the connection strings in my solution, and run the Entity Framework Core migrations against the databases. After the tests, a task should delete the resource group.
I created an Azure resource group with the databases I need and downloaded the PowerShell deployment files: the script, template.json, and so on.
The PowerShell script task can take either an inline script or a path to a script. Should I add the PowerShell scripts to the solution so that VSTS can access them? The inline script option seems to be meant for small scripts of a few lines, and the Azure deployment is quite big, with multiple files.
Setting up the connection strings shouldn't be too difficult. Several people suggest using the Replace Tokens task from the marketplace.
I'm not sure how to run the database migrations after that. Can I run the migrations in a PowerShell script task? How can I ensure that the needed cmdlets are available?
Since the script is quite big, you can't use an inline script; you can add it to the project or to another path on the server (add an additional mapping for the different path).
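As a sketch of what such a checked-in script could do with the downloaded deployment files (resource group name, location, and file paths are placeholders; it assumes the Az module is available, as it is in the Azure PowerShell task):

```powershell
# Create a throwaway resource group and deploy the exported ARM template into it.
New-AzResourceGroup -Name "rg-sql-tests" -Location "westeurope" -Force

New-AzResourceGroupDeployment `
    -ResourceGroupName "rg-sql-tests" `
    -TemplateFile ".\deployment\template.json" `
    -TemplateParameterFile ".\deployment\parameters.json"

# ... run the tests against the freshly created databases here ...

# Tear the whole environment down once the tests have finished.
Remove-AzResourceGroup -Name "rg-sql-tests" -Force
```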
Regarding database migration: if you have enabled EF migrations in a web app, you can create a publish profile (Web Deploy package) with database migration enabled, then build with the MSBuild arguments /p:DeployOnBuild=true /p:PublishProfile=[profile name];DesktopBuildPackageLocation="$(build.artifactstagingdirectory)\webEF.zip", and then deploy the package to the Azure web app through the Azure App Service Deploy task.
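For the EF Core migrations themselves, an alternative to the Web Deploy route is to run the dotnet-ef tool from a PowerShell script task; a sketch, assuming a placeholder project path and connection string (the --connection option requires EF Core 5.0 or later):

```powershell
# Make the EF Core CLI available on the build agent.
dotnet tool install --global dotnet-ef

# Apply any pending migrations to the test database that was just deployed.
dotnet ef database update `
    --project .\src\MyApp\MyApp.csproj `
    --connection "Server=tcp:myserver.database.windows.net,1433;Database=TestDb;User ID=sqladmin;Password=$env:SQL_PASSWORD;Encrypt=True;"
```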