I am trying to implement source control for database stored procedures. I am using Azure DevOps and I have added all the stored procedures to an Azure Repo.
I am connecting to the Azure Repo through the team explorer in SSMS. I am working on a branch and once the stored procedure changes are added to the master branch I want the changes to be automatically deployed to the production database.
Is this possible using an Azure pipeline?
Yes, you can do it with a DACPAC.
You can follow these links:
Use tools that help you integrate and deploy your database alongside your code. Build better apps on any client OS (Windows, Linux, and macOS) using our graphical user interface and command-line tools.
DevOps using SQL Server
Continuous Deployment of SQL
The following link has step-by-step instructions on how to set up an Azure pipeline to deploy the changes from one database to another:
https://microsoft.github.io/PartsUnlimited/iac/200.2x-IaC-CDAzureSQLdbwithVSTSandVS.html
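As a rough sketch of what the release step does behind the scenes: SqlPackage takes the DACPAC produced by the build and publishes it against the target database, applying only the differences. The helper below merely composes the command-line arguments; the server and database names are placeholders, and authentication flags are omitted:

```python
from typing import List

def sqlpackage_publish_args(dacpac_path: str, server: str, database: str) -> List[str]:
    """Build the SqlPackage argument list for publishing a DACPAC.
    SqlPackage diffs the DACPAC against the target database and applies
    only the incremental changes (new/altered stored procedures, etc.)."""
    return [
        "SqlPackage",
        "/Action:Publish",
        f"/SourceFile:{dacpac_path}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{database}",
    ]
```

A pipeline task would run these arguments as a process after the build publishes the DACPAC artifact.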
I am trying to determine how to back up the Azure DevOps (ADO) organization that I created on Microsoft's servers so that I can restore it on my own physical server. I have a few projects already started, along with work items, repositories, pipeline jobs, and NuGet artifacts already in place. It would take quite a while to rebuild the projects manually; not impossible, just not desirable.
I have looked and have not found any resource as to how to perform this or if it is even possible. Any help from someone who knows would be greatly appreciated!
Currently there is an available extension, Azure DevOps Migration Tools, which allows you to migrate Teams, Work Items, Plans & Suites, and Shared Queries & Pipelines from one project to another in Azure DevOps/TFS, both within the same organization and between organizations. See https://nkdagility.github.io/azure-devops-migration-tools/ for the latest guidance.
In addition, for repositories there is no such extension; you could try to clone the existing Git repo and then push it to a new remote repo server.
BTW, you could use the REST APIs (Artifact Details) to get the artifacts and then publish them to a new feed on Azure DevOps Server.
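As an illustrative sketch of that last suggestion, the Azure Artifacts Packaging REST endpoint that lists a feed's packages can be composed like this (the organization and feed names are placeholders):

```python
def artifact_list_url(organization: str, feed_id: str,
                      api_version: str = "5.1-preview.1") -> str:
    """Compose the Azure Artifacts REST endpoint that lists the packages
    in a feed; the response can then be used to download and republish
    each package to a new feed."""
    return (
        f"https://feeds.dev.azure.com/{organization}/_apis/packaging/"
        f"feeds/{feed_id}/packages?api-version={api_version}"
    )
```

You would call this URL with a personal access token and iterate over the returned package versions.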
What are the industry standards for developing CI/CD pipelines for an Azure SQL database?
I have an existing Azure SQL database (DEV instance; includes schemas, tables, functions, stored procedures, etc.). The code for these is hand-written (meaning it was not generated using SSDT schema compare, not scripted from existing tables/SPs/functions, and not a DACPAC/BACPAC file; it is just the code the developers wrote) and is maintained in a Git repo.
Now my users want to create another database using the scripts that the developers uploaded into Git (Bitbucket): identifying all the dependencies among the DB objects and executing the scripts in order to create the new database. Is this the correct approach? (Consider this approach 1.)
After investing a lot of time in deployments, I am now convinced it is advisable to follow the approach below; let's call it approach 2:
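The dependency ordering that approach 1 calls for is essentially a topological sort over the DB objects. A minimal sketch, where the object names and the dependency map are invented purely for illustration:

```python
from graphlib import TopologicalSorter  # Python 3.9+

def deployment_order(dependencies: dict) -> list:
    """Return DB objects ordered so that every object comes after the
    objects it depends on, i.e. a valid script execution order."""
    return list(TopologicalSorter(dependencies).static_order())

# Invented dependency map: each key depends on the objects in its set.
deps = {
    "dbo.Orders": set(),
    "dbo.vw_OpenOrders": {"dbo.Orders"},
    "dbo.usp_CloseOrder": {"dbo.vw_OpenOrders"},
}
```

In practice this is exactly the bookkeeping that SSDT/DACPAC does for you, which is one reason approach 2 tends to be less error-prone.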
Create a solution and clone your existing Git repo in Visual Studio.
Import the DB objects from Solution Explorer and push the solution to Git.
Create a build pipeline with steps to build the solution and copy/publish the artifact.
Create a new release pipeline, use the "Azure SQL Data Warehouse deployment" task, and link the DACPAC file generated dynamically by the step above.
Now, for incremental changes, my assumption is: change the code -> push to Git -> build the solution -> run the release. The DACPAC file generated by the build pipeline will be compared with the current QA database, and only the new changes will be applied (behind the scenes, SqlPackage performs the comparison in the "Azure SQL Data Warehouse deployment" task).
Links I have gone through:
Configure CD of Azure SQL database using Azure DevOps and Visual Studio
Please correct me if my understanding is wrong.
Thanks a ton,
A DevOps newbie here.
Azure DevOps Services provides the Azure SQL Database deployment task to deploy an Azure SQL database.
So approach 2 is the common way. With this task we can deploy an Azure SQL database using a DACPAC or run scripts using SQLCMD.
You can also reference the following links:
Tutorial: Deploy your ASP.NET app and Azure SQL Database code by using Azure DevOps Starter
DevOps for Azure SQL
Azure SQL Database CI/CD with Azure DevOps
I have created an Azure Data Factory with a Copy Activity using C# and the Azure SDK.
How can I deploy it using CI/CD?
Any URL or link would help.
Data Factory continuous integration and delivery is now possible directly through the web user interface, using ARM templates or Git (GitHub or Azure DevOps).
Just click on "Set up Code Repository" and follow the steps.
Check the following link for more information, including a video demonstration: https://aka.ms/azfr/401/02
One idea I got from Microsoft was that, using the same Azure SDK, you could deserialize the objects and save the JSON files, following the official directory structure, into your local Git working directory.
In other words, you would have to mimic what the UI's Save All/Save button does from the portal.
Then, using Git Bash, you can just commit and push to your working branch (i.e. develop), and from the UI you can just publish (this will create an adf_publish release branch with the ARM objects).
Official reference for CI using VSTS and the UI Publish feature: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment
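The save-down step described above can be sketched like this. The function is a hypothetical helper, not part of the SDK; the folder names mirror the layout ADF's Git integration writes (pipeline/, dataset/, linkedService/, trigger/):

```python
import json
from pathlib import Path

def save_adf_object(repo_root: str, folder: str, name: str, definition: dict) -> Path:
    """Write one factory object as a JSON file into the local Git working
    directory, mirroring the folder layout that ADF's Save button produces."""
    path = Path(repo_root) / folder / f"{name}.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(definition, indent=2))
    return path
```

After saving each deserialized object this way, a plain `git add`/`commit`/`push` gets the factory definition into your working branch.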
Unfortunately, CI/CD for ADF is not very intuitive at first glance.
Check out this blog post where I'm describing what/how/why step by step:
Deployment of Azure Data Factory with Azure DevOps
Let me know if you have any questions or concerns and finally - if that works for you.
Good luck!
My resources on how to enable CI/CD using Azure DevOps and Data Factory come from the Microsoft site below:
Continuous integration and delivery (CI/CD) in Azure Data Factory
I am still new to DevOps and CI/CD, but I do know that other departments had this set up and it looks to be working for them.
We have a sandbox environment for clients to play in and we want to restore the database to a clean state every night.
I have not had any success in finding an example of how to set this up. It is easy to do in a local environment using a SQL Agent job, but I have no idea how to accomplish this in Azure.
One way to accomplish this would be an ARM template and a BACPAC file. Your ARM template can be deployed on a schedule with PowerShell or the Azure CLI using cron.
Blog Post: https://blogs.msdn.microsoft.com/kaevans/2016/03/28/deploy-bacpac-to-azure-sql-database-using-arm/
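A hedged sketch of the nightly reset: since `az sql db import` expects an empty target database, a scheduled job would typically drop and recreate the sandbox database first, then import the clean BACPAC. The helper below only composes the CLI call; resource names and the storage URI are placeholders, and the secrets are left as environment-variable references:

```python
def bacpac_import_command(resource_group: str, server: str, database: str,
                          storage_uri: str, admin_user: str) -> list:
    """Compose the `az sql db import` invocation that loads a BACPAC from
    blob storage into an (empty) Azure SQL database."""
    return [
        "az", "sql", "db", "import",
        "--resource-group", resource_group,
        "--server", server,
        "--name", database,
        "--storage-key-type", "SharedAccessKey",
        "--storage-key", "$STORAGE_SAS_TOKEN",      # placeholder secret
        "--storage-uri", storage_uri,
        "--admin-user", admin_user,
        "--admin-password", "$SQL_ADMIN_PASSWORD",  # placeholder secret
    ]
```

The cron job (or a scheduled pipeline) would run this command list as a process each night.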
I am developing a CD release pipeline using an on-prem TFS 2015 Update 2 instance. I'm relying on an ARM template to set up the Azure website and Azure SQL server. I'm using the FTP method for deploying the website bits from the internal build server to the Azure website. For this website deployment, I'm reading the credentials from the PublishingProfile of the newly created website.
Is this a right way or can you suggest a better way? Any comments are appreciated.
P.S. Customer wants to use FTP method and not WebDeploy.
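Reading the FTP credentials out of the publishing profile, as described above, can be sketched like this. The attribute names follow the standard .PublishSettings schema; the lookup helper itself is hypothetical:

```python
import xml.etree.ElementTree as ET

def ftp_profile(publish_settings_xml: str) -> dict:
    """Extract the FTP host/user/password from a .PublishSettings document
    (the FTP profile is the entry with publishMethod="FTP")."""
    root = ET.fromstring(publish_settings_xml)
    for profile in root.iter("publishProfile"):
        if profile.get("publishMethod") == "FTP":
            return {
                "url": profile.get("publishUrl"),
                "user": profile.get("userName"),
                "password": profile.get("userPWD"),
            }
    raise ValueError("no FTP publish profile found")
```

The resulting dictionary can feed whatever FTP client or release task performs the actual upload.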
If you really have to use FTP, and the thing you're not happy with is the process/password secret management, you could try this:
https://marketplace.visualstudio.com/items?itemName=januskamphansen.ftpupload-task
It's a VSTS extension task for release, which works with the vNext build/release system in VSTS or on a TFS 2015 server. This task lets you enter the parameters for each environment you set up and mark the passwords as secrets so they won't appear in logs or the UI.
The step basically wraps up the FTP process for you; you may want to do other steps as part of the release.