We have a sandbox environment for clients to play in and we want to restore the database to a clean state every night.
I have not had any success finding an example of how to set this up. It is easy to do in a local environment using a SQL Agent job, but I have no idea how to accomplish it in Azure.
One way to accomplish this is with an ARM template and a BACPAC file. The ARM template can be deployed on a schedule with PowerShell or the Azure CLI, triggered by cron or a similar scheduler.
Blog Post: https://blogs.msdn.microsoft.com/kaevans/2016/03/28/deploy-bacpac-to-azure-sql-database-using-arm/
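As a rough illustration, here is a minimal PowerShell sketch of the nightly reset along those lines, assuming the Az module, a scheduler such as an Azure Automation runbook, and made-up resource names; the ARM template is assumed to follow the BACPAC-import pattern from the blog post:

```powershell
# Hedged sketch: drop the dirtied sandbox database and recreate it from a
# known-good BACPAC via an ARM template deployment. All names (resource
# group, server, database, BACPAC URI) are assumptions.
Connect-AzAccount -Identity   # e.g. the managed identity of an Automation account

# Remove the sandbox database that clients have been playing in.
Remove-AzSqlDatabase -ResourceGroupName "sandbox-rg" `
    -ServerName "sandbox-sql" -DatabaseName "SandboxDb" -Force

# Redeploy the ARM template that imports the clean BACPAC.
New-AzResourceGroupDeployment -ResourceGroupName "sandbox-rg" `
    -TemplateFile ".\deploy-bacpac.json" `
    -TemplateParameterObject @{
        databaseName = "SandboxDb"
        bacpacUri    = "https://sandboxstore.blob.core.windows.net/bacpacs/clean.bacpac"
    }
```

An Azure Automation schedule (or any nightly job runner) can then invoke this script every night.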
What are the industry standards for developing a CI/CD pipeline for an Azure SQL database?
I have an existing Azure SQL database (a DEV instance that includes schemas, tables, functions, stored procedures, etc.). The code for these objects is hand-written (meaning it was not generated with SSDT schema compare, not scripted from the existing tables/SPs/functions, and is not a DACPAC/BACPAC file; it is just the code the developers wrote) and it is maintained in a Git repo.
Now my users want to create another database using the scripts the developers uploaded to Git (Bitbucket), which means identifying all the dependencies between the DB objects and executing the scripts in the right order to create the new database. Is this the correct approach? Consider this approach 1.
After investing a lot of time in deployments, I have become convinced that the advised approach is the one below; let's call it approach 2:
Create a solution and clone your existing Git repo in Visual Studio.
Import the DB objects into the solution via Solution Explorer and push the solution to Git.
Create a build pipeline with steps to build the solution and copy/publish the artifact.
Create a new release pipeline, use the "Azure SQL Database deployment" task, and link the DACPAC file (which is generated dynamically by the build above).
Now, for incremental changes, my assumption is: change the code -> push it to Git -> build the solution -> run the release. The DACPAC generated by the build pipeline will be compared with the current QA database and only the new changes will be applied; behind the scenes, SqlPackage is used by the "Azure SQL Database deployment" task to do the comparison at release time.
Links I have gone through:
Configure CD of Azure SQL database using Azure DevOps and Visual Studio
Please correct me if my understanding is wrong.
Thanks a ton,
A DevOps newbie here.
Azure DevOps Services provides the Azure SQL Database deployment task to deploy an Azure SQL database.
So approach 2 is the common way. With this task you can deploy an Azure SQL database using a DACPAC or run scripts using SQLCMD.
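To make the "behind the scenes" part concrete, this is roughly what the task does with the build artifact; a hedged sketch calling SqlPackage directly, where the install path, artifact path, server name, and credentials are all assumptions:

```powershell
# Roughly what the deployment task runs: SqlPackage diffs the DACPAC
# against the target database and applies only the incremental changes.
# Install path, artifact path, server, and credentials are assumptions.
& "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe" `
    /Action:Publish `
    /SourceFile:".\drop\MyDatabase.dacpac" `
    /TargetServerName:"myserver.database.windows.net" `
    /TargetDatabaseName:"QA_Db" `
    /TargetUser:"sqladmin" `
    "/TargetPassword:$env:SQL_PASSWORD"
```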
You can also reference the following links:
Tutorial: Deploy your ASP.NET app and Azure SQL Database code by using Azure DevOps Starter
DevOps for Azure SQL
Azure SQL Database CI/CD with Azure DevOps
I am trying to implement source control for database stored procedures. I am using Azure DevOps and I have added all the stored procedures to an Azure Repo.
I am connecting to the Azure repo through Team Explorer in SSMS. I am working on a branch, and once the stored procedure changes are merged into the master branch I want them to be deployed automatically to the production database.
Is this possible using an Azure pipeline?
Yes, you can do it with a DACPAC.
You can follow these links:
Use tools that help you integrate and deploy your database alongside your code. Build better apps on any client OS (Windows, Linux, and macOS) using our graphical user interface and command-line tools.
DevOps using SQL Server
Continuous Deployment of SQL
The following link has step-by-step instructions on how to set up an Azure pipeline that deploys changes from one database to another:
https://microsoft.github.io/PartsUnlimited/iac/200.2x-IaC-CDAzureSQLdbwithVSTSandVS.html
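To connect this to the repo-to-production flow in the question, here is a hedged sketch of the build side: compile a database project (with the stored procedures imported from the repo) into a DACPAC that the release pipeline then publishes. The project name and paths are assumptions:

```powershell
# Hedged sketch: build a SQL Server database project (SSDT) into a DACPAC
# on the build agent. Project name and paths are assumptions.
& msbuild ".\MyDatabase\MyDatabase.sqlproj" /p:Configuration=Release

# Stage the DACPAC as a build artifact for the release pipeline to deploy.
Copy-Item ".\MyDatabase\bin\Release\MyDatabase.dacpac" `
    -Destination $env:BUILD_ARTIFACTSTAGINGDIRECTORY
```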
I have a build pipeline that is currently working pretty well in Azure DevOps. As part of the pipeline/build process, I create an artifact, which is published and reachable. After that, I'd like to do the following:
Create/Start Up a new VM (Windows)
Grab the now published artifact, unzip it and run the executable within
Run the integration tests
Shut down the VM
I've looked around the Azure documentation but cannot find much that discusses this sort of solution. Please help!
There is nothing built-in (like a ready-made "create a VM" task), so you can use any way of creating a VM in Azure: Azure PowerShell, Azure CLI, ARM templates, SDK calls; whatever works for you.
You would need to open SSH/WinRM to talk to that VM and deploy things to it; that's about it. You can find lots of examples of how to create a VM online. VSTS has tasks for Azure PowerShell/CLI/ARM templates, so you don't need to handle authentication yourself.
You can create a VM from an ARM template with the "Azure Resource Group Deployment" task.
With a separate "PowerShell on Target Machines" task you can run a PowerShell script on the target VM; if you put the downloading, unzipping, and running of the exe in that script, you should be able to perform the tasks you need.
You could also look into the Invoke-AzureRmVMRunCommand PowerShell cmdlet, which allows you to run a PowerShell script inside the VM: https://learn.microsoft.com/en-us/powershell/module/azurerm.compute/invoke-azurermvmruncommand?view=azurermps-6.11.0
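Putting those pieces together, a hedged sketch of the whole flow with the AzureRM cmdlets the link above refers to; the resource group, VM name, template file, and script are all assumptions:

```powershell
# Hedged sketch using the AzureRM module; all names below are assumptions.
# 1. Create/start the VM from an ARM template.
New-AzureRmResourceGroupDeployment -ResourceGroupName "it-tests-rg" `
    -TemplateFile ".\vm-template.json"

# 2. Run a script inside the VM that grabs the published artifact,
#    unzips it, starts the executable, and runs the integration tests.
Invoke-AzureRmVMRunCommand -ResourceGroupName "it-tests-rg" -VMName "it-tests-vm" `
    -CommandId "RunPowerShellScript" -ScriptPath ".\run-integration-tests.ps1"

# 3. Tear everything down again (here: delete the whole resource group).
Remove-AzureRmResourceGroup -Name "it-tests-rg" -Force
```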
I want to run tests using real SQL databases. The SQL server and the databases would be on Azure. I'm trying to configure a build definition on VSTS where the tasks would set up an Azure resource group with the databases, set the connection strings on my solution and run the Entity Framework Core migrations to the databases. After the tests a task should delete the resource group.
I created an Azure resource group with the databases I need and downloaded the PowerShell deployment files, the script, template.json and so on.
The PowerShell script task can take either an inline script or a path to a script file. Should I add the PowerShell scripts to the solution so that VSTS can access them? The inline option seems meant for small scripts of a few lines, and the Azure deployment is quite big, with multiple files.
Setting up the connection strings shouldn't be too difficult; several people suggest using the Replace Tokens task from the Marketplace.
I'm not sure how to run the database migrations after that. Can I run the migrations in a PowerShell script task? How can I ensure that the needed cmdlets are available?
Since the script is quite big, you can't use an inline script; you can add it to the project or to another path on the server (add an additional mapping for that path).
Regarding database migration: if you have enabled EF migrations in a web app, you can create a publish profile (web deploy package) with database migration enabled, build with the MSBuild arguments /p:DeployOnBuild=true /p:PublishProfile=[profile name];DesktopBuildPackageLocation="$(build.artifactstagingdirectory)\webEF.zip", and then deploy the package to the Azure web app through the Azure Web Deploy task.
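As a hedged sketch of that build step (the project name, profile name, and output path are assumptions):

```powershell
# Hedged sketch: build a web deploy package with "Execute Code First
# Migrations" enabled in the publish profile. Project and profile names
# are assumptions; BUILD_ARTIFACTSTAGINGDIRECTORY is the agent's staging folder.
& msbuild ".\MyWebApp\MyWebApp.csproj" `
    /p:DeployOnBuild=true `
    /p:PublishProfile=AzureWebDeploy `
    "/p:DesktopBuildPackageLocation=$env:BUILD_ARTIFACTSTAGINGDIRECTORY\webEF.zip"
# Deploy the resulting webEF.zip with the deploy task; the EF migrations
# are then applied against the target database on first use.
```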
I am developing a CD release pipeline using a TFS 2015 Update 2 on-prem instance. I rely on an ARM template to set up the Azure website and the Azure SQL server, and I use FTP to deploy the website bits from the internal build server to the Azure website. For this website deployment I read the credentials from the PublishingProfile of the newly created website.
Is this the right way, or can you suggest a better one? Any comments are appreciated.
P.S. Customer wants to use FTP method and not WebDeploy.
If you really have to use FTP, and the thing you're not happy with is the process/password secret management, you could try this:
https://marketplace.visualstudio.com/items?itemName=januskamphansen.ftpupload-task
It's a VSTS extension task for release that works with the vNext build/release system in VSTS or on a TFS 2015 server. The task lets you enter the parameters for each environment you set up and mark the passwords as secrets, so they won't show up in logs or the UI.
The step basically wraps up the FTP part of the process for you; you may want to do other steps as part of the release.
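And for the credential-reading part of the original approach, a hedged sketch of pulling the FTP endpoint and credentials from the site's publishing profile (current Az module; the resource group and site name are assumptions):

```powershell
# Hedged sketch: read the FTP publish endpoint and credentials from the
# web app's publishing profile. Resource group and app name are assumptions.
[xml]$publishXml = Get-AzWebAppPublishingProfile `
    -ResourceGroupName "web-rg" -Name "my-azure-site"

# Pick the FTP profile out of the publishing profile XML.
$ftp = $publishXml.publishData.publishProfile |
    Where-Object { $_.publishMethod -eq "FTP" }

$ftp.publishUrl   # FTP endpoint, e.g. ftp://waws-prod-...
$ftp.userName     # deployment user
$ftp.userPWD      # treat this as a secret in the release definition
```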