Azure DevOps - Multitenant database deployment

We have a web application with a single codebase and a multitenant database setup behind it, and we use Azure DevOps CI/CD for code deployment. We can release the code to a database perfectly well; however, we want to deploy it to multiple databases without multiple release pipelines (the number of tenants is dynamic and there are 100+ of them).
I am able to deploy the code to one database and then redeploy it to each of the others using SqlPackage.exe and a dacpac publish from the command line on the target server. But I want to do this entirely within Azure DevOps, and there is no SqlPackage.exe available in the command-line task on the agent machine.
Has anyone dealt with a similar issue with a multi-tenant DB approach? I am open to any solution using Azure release pipelines.

I was able to resolve this by running inline PowerShell and SqlPackage.exe on a deployment group. You just need to have one "template" database, extract a dacpac (or bacpac) file from it, and then publish it to all the other databases.
Example for reference:
ForEach ($a in (Get-Content "E:\DatabaseSync\databaselist.txt"))
{
    SqlPackage.exe /a:Publish /sf:"E:\DatabaseSync\TemplateDB.dacpac" `
        /tsn:localhost /tu:username /tp:password /tdn:$a `
        /p:TreatVerificationErrorsAsWarnings=False /p:BlockOnPossibleDataLoss=True `
        /p:AllowDropBlockingAssemblies=True /p:DropObjectsNotInSource=True `
        /p:DropPermissionsNotInSource=False /p:IgnorePermissions=True `
        /p:IgnoreRoleMembership=True /p:IgnoreUserSettingsObjects=True `
        /p:DoNotDropObjectTypes=Users /p:ExcludeObjectTypes=Users
}
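If you would rather not maintain a static databaselist.txt, a variation is to query the server for the tenant databases first. The sketch below is only an illustration: it assumes the SqlServer PowerShell module (for Invoke-Sqlcmd) is installed on the deployment group machine and that tenant databases follow a hypothetical 'Tenant%' naming convention.
# Sketch only: build the database list dynamically instead of reading a text file.
# Assumes the SqlServer module and a 'Tenant%' naming convention for tenant databases.
$databases = Invoke-Sqlcmd -ServerInstance "localhost" -Username "username" -Password "password" `
    -Query "SELECT name FROM sys.databases WHERE name LIKE 'Tenant%'"

ForEach ($db in $databases)
{
    SqlPackage.exe /a:Publish /sf:"E:\DatabaseSync\TemplateDB.dacpac" `
        /tsn:localhost /tu:username /tp:password /tdn:$($db.name) `
        /p:BlockOnPossibleDataLoss=True
}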

Related

How to manipulate remote Terraform state files on Azure Blob storage

I'm working with a subscription that has a few different deployed environments (dev, test, staging, etc.). Each environment has its own storage account, containing an associated Terraform state file. These environments get deployed via Azure DevOps Pipelines.
It's easy enough to get at the .tfstate files that have been created this way, through the portal, CLI, etc.
But is it possible to access these state files using the 'terraform state' commands, for example using Azure Cloud Shell? If so, how do you point them at the right location?
I've tried using the terraform state commands in a Cloud Shell, but it's not clear how to point them to the right location or if this is indeed possible.
For this requirement, you can use the AzurePowerShell task.
1. First, if you can achieve your requirement via PowerShell in the Azure portal, then it is possible to achieve the same thing with the AzurePowerShell task (AzurePowerShell runs on the agent using the service connection/service principal you provide):
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'testbowman_in_AAD' # This service connection maps to a service principal on the Azure side.
    ScriptType: 'InlineScript'
    Inline: |
      # Put your logic here.
    azurePowerShellVersion: 'LatestVersion'
2. Second, you can use AzCopy to download the state file and then operate on it. The Microsoft-hosted DevOps agents have this tool available.
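For example, a sketch only (the storage account, container, blob name and SAS token below are placeholders you would replace with your own values):
# Download the remote state file from blob storage with AzCopy (placeholder URL and SAS token).
azcopy copy "https://<storageaccount>.blob.core.windows.net/<container>/terraform.tfstate?<SAS-token>" "terraform.tfstate"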
You can run terraform state pull > state.tfstate in Azure Cloud Shell (you can name the output file something like dev.tfstate; keeping the .tfstate extension is what matters).
All you need to do is move into the Terraform working directory and run:
terraform state pull > dev.tfstate
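If the Cloud Shell directory does not already hold the backend configuration, one way to point the state commands at the right storage account is to initialize against the remote backend first. This is only a sketch: it assumes the configuration declares an empty azurerm backend block, and the resource group, storage account, container and key names are placeholders.
# Sketch only: placeholders below; assumes terraform { backend "azurerm" {} } in the configuration.
terraform init `
  -backend-config="resource_group_name=<rg-name>" `
  -backend-config="storage_account_name=<storage-account>" `
  -backend-config="container_name=<container>" `
  -backend-config="key=dev.terraform.tfstate"

terraform state pull > dev.tfstate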

Azure Pipeline Data Sync

We have some databases (10 databases) in dev and test environments on Azure SQL. We would like to be able to sync the data between environments using an Azure pipeline.
The schema changes happen automagically with Entity Framework migrations, which is fine.
For syncing data, we've created a data compare in Visual Studio that we use to sync the data on demand. Now we would like to automate this process (syncing data).
Is there an existing task that we can add to the pipeline to run this data compare and subsequent sync?
You can do that with SQL Data Sync for Azure.
Here is what it does:
SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple databases, both on-premises and in the cloud.
First you need to:
create a sync group - on your prod database
add sync members - here you will use your test database
then configure the group sync
What is cool about it is that you can select exactly what should be synced.
You can then trigger the sync from Azure Pipelines by calling
Start-AzSqlSyncGroupSync -ResourceGroupName $resourceGroupName -ServerName $serverName -DatabaseName $databaseName -SyncGroupName $syncGroupName
from an Azure PowerShell task. If you define a cron schedule for the pipeline, your data will be kept up to date.
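For example, inside the Azure PowerShell task you could loop over the databases. This is just a sketch; the resource group, server, database and sync group names below are placeholders.
# Sketch only: trigger the sync group on each database (placeholder names).
$resourceGroupName = 'my-rg'
$serverName = 'my-sql-server'
$syncGroupName = 'dev-to-test-sync'

ForEach ($databaseName in @('Database1', 'Database2'))
{
    Start-AzSqlSyncGroupSync -ResourceGroupName $resourceGroupName `
        -ServerName $serverName -DatabaseName $databaseName `
        -SyncGroupName $syncGroupName
}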

How to run a remote command (powershell/bash) against an existing Azure VM in Azure Data Factory V2?

I've been trying to find a way to run a simple command against one of my existing Azure VMs using Azure Data Factory V2.
Options so far:
Custom Activity/Azure Batch won't let me add existing VMs to the pool
Azure Functions - I have not played with this, and I have not found any documentation on doing this with Azure Functions.
Azure Cloud Shell - I've tried this using the browser UI and it works, however I cannot find a way of doing this via ADF V2
The use case is the following:
There are a few tasks running locally (on an Azure VM) in Task Scheduler that I'd like to orchestrate using ADF, since everything else is in ADF; these tasks are usually Python applications that restore a SQL backup and/or purge some folders.
e.g. sqldb-restore -r myDatabase
where sqldb-restore is a command that is recognized locally after installing my local Python library. Unfortunately the Python app needs to live locally on the VM.
Any suggestions? Thanks.
Thanks to @martin-esteban-zurita; his answer helped me get to what I needed, and this was a fun experiment.
It is important to understand that Azure Automation is used for many things regarding resource orchestration in Azure (VMs, services, DevOps), and this automation can be done with PowerShell and/or Python.
In this particular case I did not need to modify/maintain/orchestrate any Azure resource; I needed to actually run a Bash/PowerShell command remotely on one of my existing VMs, where I have multiple PowerShell/Bash commands running recurrently in Task Scheduler.
Task Scheduler was adding unnecessary overhead to my data pipelines because it was unable to talk to ADF.
In addition, Azure Automation natively runs PowerShell/Python commands in an Azure-hosted sandbox, which is very useful to orchestrate resources like turning Azure VMs on/off, adding/removing permissions from other Azure services, running maintenance or purge processes, etc., but I was still unable to run commands locally on an existing VM. This is where the Hybrid Runbook Worker (a hybrid worker group) came into the picture.
These are the steps to accomplish this use case.
1. Create an Azure Automation Account
2. Install the Windows Hybrid Worker on my existing VM. In my case it was tricky because my proxy was giving me some errors, so I ended up downloading the NuGet package and installing it manually.
.\New-OnPremiseHybridWorker.ps1 -AutomationAccountName <NameofAutomationAccount> -AAResourceGroupName <NameofResourceGroup> `
    -OMSResourceGroupName <NameofOResourceGroup> -HybridGroupName <NameofHRWGroup> `
    -SubscriptionId <AzureSubscriptionId> -WorkspaceName <NameOfLogAnalyticsWorkspace>
Keep in mind that in the above command you will need to supply your own parameter values; the only parameter that does not have to exist beforehand is HybridGroupName, which will be created and defines the name of the hybrid worker group.
3. Create a PowerShell Runbook
[CmdletBinding()]
Param
(
    [object]$WebhookData # This parameter name needs to be called WebhookData, otherwise the webhook does not work as expected.
)
$VerbosePreference = 'continue'

#region Verify if Runbook is started from Webhook.
# If runbook was called from Webhook, WebhookData will not be null.
if ($WebHookData)
{
    # Collect properties of WebhookData
    $WebhookName = $WebHookData.WebhookName
    # $WebhookHeaders = $WebHookData.RequestHeader
    $WebhookBody = $WebHookData.RequestBody

    # Collect individual headers. Input converted from JSON.
    $Input = (ConvertFrom-Json -InputObject $WebhookBody)
    # Write-Verbose "WebhookBody: $Input"
    # Write-Output -InputObject ('Runbook started from webhook {0} by {1}.' -f $WebhookName, $From)
}
else
{
    Write-Error -Message 'Runbook was not started from Webhook' -ErrorAction stop
}
#endregion

# This is where I run the commands that were in Task Scheduler

$callBackUri = $Input.callBackUri
# This is extremely important for ADF
Invoke-WebRequest -Uri $callBackUri -Method POST
4. Create a Runbook Webhook pointing to the Hybrid Worker's VM
5. Create a webhook activity in ADF where the above PowerShell runbook script will be called via a POST method
Important note: when I first created the webhook activity it was timing out after 10 minutes (the default). I then noticed in the Azure Automation account that I was actually getting input data (WEBHOOKDATA) containing a JSON structure with the following elements:
WebhookName
RequestBody (This one contains whatever you add in the Body plus a default element called callBackUri)
All I had to do was to invoke the callBackUri from Azure Automation. And this is why in the PowerShell runbook code I added Invoke-WebRequest -Uri $callBackUri -Method POST. With this, ADF was succeeding/failing instead of timing out.
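As a quick way to verify the webhook end to end before wiring it into ADF, you can post to it from any machine. This is only a sketch: the webhook URI and body below are placeholders, and when the runbook is invoked outside ADF there is no callBackUri in the body for it to call back.
# Placeholder webhook URI, copied from the Azure Automation portal when the webhook was created.
$webhookUri = 'https://<region>.webhook.azure-automation.net/webhooks?token=<token>'

# ADF's webhook activity adds callBackUri to this body automatically; a manual test body will not have it.
$body = @{ message = 'manual test' } | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri $webhookUri -Body $body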
There are many other details that I struggled with when installing the hybrid worker in my VM but those are more specific to your environment/company.
This looks like a use case that is supported with Azure Automation, using a hybrid worker. Try reading here: https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
You can call runbooks with webhooks in ADFv2, using the web activity.
Hope this helped!

How to connect to a SQL Azure DB from a hosted build server for running tests

We wish to implement CI using a TFS / Visual Studio Online-hosted build server. To run our unit/integration tests the build server needs to connect to a SQL Azure DB.
We've hit a stumbling block here because SQL Azure DBs use an IP address whitelist.
My understanding is that the hosted build agent is a VM which is spun-up on demand, which almost certainly means that we can't determine its IP address beforehand, or guarantee that it will be the same for each build agent.
So how can we have our hosted build agent run tests which connect to our IP-address-whitelisted SQL DB? Is it possible to programmatically add an IP to the whitelist and then remove it at the end of testing?
After a little research I found this (the sample uses PowerShell):
Log in to your Azure account
Select the relevant subscription
Then:
New-AzureRmSqlServerFirewallRule -EndIpAddress 1.0.0.1 -FirewallRuleName test1 -ResourceGroupName testrg-11 -ServerName mytestserver111 -StartIpAddress 1.0.0.0
To remove it:
Remove-AzureRmSqlServerFirewallRule -FirewallRuleName test1 -ServerName mytestserver111 -ResourceGroupName testrg-11 -Force
These cmdlets are available in PowerShell ISE for Windows. Alternatively, there should be something similar in the cross-platform CLI if you are not running on a Windows machine.
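Putting this together in a pipeline, one approach is to add the agent's current public IP before the tests and remove it afterwards. This is only a sketch: it reuses the server, resource group and cmdlets from the commands above, and the public-IP lookup via api.ipify.org is an assumption, not something the hosted agent exposes natively.
# Sketch only: look up the agent's current public IP (external service; an assumption).
$agentIp = Invoke-RestMethod -Uri 'https://api.ipify.org'

New-AzureRmSqlServerFirewallRule -ResourceGroupName testrg-11 -ServerName mytestserver111 `
    -FirewallRuleName build-agent -StartIpAddress $agentIp -EndIpAddress $agentIp

# ... run the unit/integration tests here ...

Remove-AzureRmSqlServerFirewallRule -ResourceGroupName testrg-11 -ServerName mytestserver111 `
    -FirewallRuleName build-agent -Force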
There is an Azure PowerShell task/step that you can use to call Azure PowerShell cmdlets (e.g. New-AzureRmSqlServerFirewallRule).
On the other hand, you can manage server-level firewall rules through the REST API, so you can build a custom build/release task that gets the necessary information (e.g. authentication) from the selected Azure service endpoint and then calls the REST API to add or remove firewall rules.
The SqlAzureDacpacDeployment task has source code that adds firewall rules through the REST API which you can refer to: see the SqlAzureDacpacDeployment source code and the VstsAzureRestHelpers_.psm1 source code.
There is now an "Azure SQL InlineSqlTask" build task which you can use to automatically set firewall rules on the Azure server. Just make sure "Delete Rule After Task Ends" is not checked, and add some dummy query like "select top 1 * from ..." as the "Inline SQL Script".

Cannot connect to DocumentDb directly after having deployed the DocumentDb account

I have an ARM template that I use to deploy a DocumentDB account as well as other Azure resources to a resource group. I want my ARM template to set up a Stream Analytics job that uses the DocumentDB as output. In order to do this, the DocumentDB account created by the ARM template needs to have a database and a collection set up as well. I cannot find a way to do this from an ARM template, so I have written a PowerShell cmdlet to create the database and collection for me.
The Stream Analytics job cannot be created by the first ARM template since it depends on having the database and collection created first. Instead I have to divide the deployment into two ARM templates: the first sets up the DocDB account and the second sets up the SA job.
The problem is that I cannot create a database in the DocDB account directly after having deployed the account via the ARM template. I get an exception with the following message: "The remote name could not be resolved: 'test.documents.azure.com'" when I try to execute the CreateDatabaseAsync method with the DocDbEndpoint and AuthKey I get back from the ARM template deployment.
Are there any timing issues after deploying Azure resources with an ARM template before you can access them programmatically? This does not seem to be a problem with other Azure resources created this way.
Any help on this matter is highly appreciated, as well as guidance on good practice for working with ARM templates, DocumentDB and Stream Analytics jobs.
Update 2016-03-23
Code for setting up the connection to the DocumentDB to create the database.
Uri endpointUri = new Uri(documentDbEndPoint);
DocumentClient client = new DocumentClient(endpointUri, authKey);
var db = await client.CreateDatabaseAsync(new Database { Id = databaseId });
return db;
Where the documentDbEndPoint is in the form of: https://name.documents.azure.com:443/ and name is the name of my DocDB account just created by the ARM template deployment.
I have the code in a library which I can either call from a Console application or from a Powershell script by loading the library with:
Add-Type -Path <path to library dll file>
Whether I use the PowerShell script or the console application, I get the same error if I try to create a database just after having created the DocDB account using the ARM template. If I wait an hour or so, both the PowerShell script and the console application work and can create a database in the account.
It seems there is some kind of timing issue before Azure sets up the DNS records for the newly created DocDB account so that it can be accessed using the DocDB API.
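One way to work around such a delay (just a sketch, not a confirmed fix; it assumes a Windows machine with the DnsClient module available and uses a placeholder host name) would be to poll until the endpoint resolves before calling CreateDatabaseAsync:
# Placeholder host name for the new account; poll DNS for up to roughly 10 minutes.
$docDbHost = 'name.documents.azure.com'

for ($i = 0; $i -lt 60; $i++) {
    if (Resolve-DnsName -Name $docDbHost -ErrorAction SilentlyContinue) {
        break
    }
    Start-Sleep -Seconds 10
}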
Update 2 2016-03-23
I just tried to create a DocDB account directly from the portal; created this way, instead of from an ARM template, the account immediately lets my PowerShell script and console application create a database.
This timing issue has since been fixed, and you should now be able to do this from the ARM template.
