Azure SQL database availability after creation

I'm trying to configure an Azure pipeline that creates a copy of a production database to set up a "pre-prod" environment.
After creating that database I need to run some queries against the freshly created database. The problem is that the database is not available right away. As the process is automated, I need to know how long I have to wait.
I put in a wait step of 5 minutes, but sometimes that is not enough.
Thanks in advance

How about using a simple check of DB availability, through the Az module or the CLI?
do {
    Start-Sleep -Seconds 120
    $status = "Offline"
    try {
        $status = (Get-AzSqlDatabase -ResourceGroupName "resourcegroup01" -ServerName "server01" -DatabaseName "MyDataBase").Status
    }
    catch {
        "Database not available yet"
    }
} while ($status -ne "Online")
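If the copy never comes online, the loop above will spin forever; for a pipeline it may be safer to bound it. A variant with a retry cap (the 30-attempt limit is an arbitrary choice, not from the original answer):

```powershell
$attempts = 0
do {
    Start-Sleep -Seconds 120
    $status = "Offline"
    try {
        $status = (Get-AzSqlDatabase -ResourceGroupName "resourcegroup01" `
            -ServerName "server01" -DatabaseName "MyDataBase").Status
    }
    catch {
        "Database not available yet"
    }
    $attempts++
} while ($status -ne "Online" -and $attempts -lt 30)

# Fail the pipeline step explicitly instead of hanging indefinitely.
if ($status -ne "Online") { throw "Database did not come online after $attempts checks" }
```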

You can also use the Azure portal's query editor to run a query against your Azure SQL Database to check DB availability.
The query editor is a tool in the Azure portal for running SQL queries against your database in Azure SQL Database or data warehouse in Azure Synapse Analytics.
Note: The query editor uses ports 443 and 1443 to communicate. Ensure you have enabled outbound HTTPS traffic on these ports. You also need to add your outbound IP address to the server's allowed firewall rules to access your databases and data warehouses.
For more query editor considerations, please refer to this.

Related

Finding the originator of a database query using SQL Azure Data Tools Profiler or related

We've got a rogue process running somewhere that executes queries against a test database we have hosted on Azure SQL. I'm trying to find this process so I can kill it. There are a number of app servers and development PCs where it could be hiding, and I haven't been able to track it down by looking at the processes running on these machines by hand.
I can use the Azure Data Studio Profiler extension to get some Extended Events logging from the database. From there, I can see the text of queries being run, the Application Name and the ClientProcessID.
I can't seem to use any of this data to find the host name or IP address of the server where these queries originate. Can I determine this using the data available in the profiler? Or is there some other way to work backward to find it? Since this is hosted on Azure, I can't use the SQL Server Management Studio Profiler, which I think would give me the hostname right away.
Azure SQL Auditing should provide you with the application name, login name and client IP address that executed the query. Please read this article to enable auditing, and look for the event type BATCH_COMPLETED.
Set-AzureRmSqlDatabaseAuditing `
    -State Enabled `
    -ResourceGroupName "rgYourResourceGroup" `
    -ServerName "yourservername" `
    -StorageAccountName "StorageAccountForAuditing" `
    -DatabaseName "YourDatabaseName"
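Once auditing is enabled, the records written to the storage account can be read with sys.fn_get_audit_file; a sketch, assuming blob auditing and with the container URL as a placeholder:

```sql
-- Read audit records from the audit blob container (URL is a placeholder).
SELECT event_time, application_name, server_principal_name, client_ip, statement
FROM sys.fn_get_audit_file(
    'https://storageaccountforauditing.blob.core.windows.net/sqldbauditlogs/',
    DEFAULT, DEFAULT)
WHERE action_id = 'BCM' -- BATCH COMPLETED
ORDER BY event_time DESC;
```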

How to configure the alert whenever the primary Azure SQL Database goes down?

In Azure, how can you configure an alert or email notification when a SQL Server failover happens, given that you have set up a SQL server with failover groups and the failover policy is set to automatic?
So, can anyone suggest how to configure an alert for the above scenario?
You can reference this blog: In Azure, how can you configure an alert or notification when a SQL Server failover happened?
CKelly found a way to script this in Azure using Automation Accounts > Runbook > PowerShell. A simple script like this should do it; you just need to figure out the Run As account and trigger it by schedule or alert:
function sendEmailAlert
{
    # Send email
}
function checkFailover
{
    $list = Get-AzSqlDatabaseFailoverGroup -ResourceGroupName "my-resourceGroup" -ServerName "my-sql-server"
    if ($list.ReplicationRole -ne 'Primary')
    {
        sendEmailAlert
    }
}
checkFailover
Hope this helps.
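The sendEmailAlert function is left as a stub above; a minimal sketch using Send-MailMessage, where the SMTP server and addresses are placeholders, not values from the original post:

```powershell
function sendEmailAlert
{
    # Assumption: an SMTP relay reachable from the Automation account.
    # All names below are placeholders.
    Send-MailMessage -SmtpServer "smtp.example.com" `
        -From "alerts@example.com" -To "dba-team@example.com" `
        -Subject "Azure SQL failover detected" `
        -Body "The failover group on my-sql-server is no longer Primary."
}
```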

How to correctly set firewall rules for RDBMS external data sources

When setting up an RDBMS external data source and table like the following:
CREATE EXTERNAL DATA SOURCE preview
WITH
(
TYPE=RDBMS,
LOCATION= 'xxxxxxxx.database.windows.net',
DATABASE_NAME = 'preview',
CREDENTIAL= preview
)
GO
CREATE SCHEMA preview;
GO
CREATE EXTERNAL TABLE preview.entity_types
(
entity_type_id int NOT NULL,
entity_type_name varchar(128) NOT NULL
)
WITH (DATA_SOURCE=preview);
GO
If you execute a SELECT statement like the following:
SELECT *
FROM preview.entity_types
You receive an error like this:
Error retrieving data from xxxxxxxx.database.windows.net.preview. The underlying error message received was: 'Cannot open server 'xxxxxxxx' requested by the login. Client with IP address 'xxx.xxx.xxx.xxx' is not allowed to access the server. To enable access, use the Windows Azure Management Portal or run sp_set_firewall_rule on the master database to create a firewall rule for this IP address or address range. It may take up to five minutes for this change to take effect.'.
I know how to set a firewall rule to allow access for client IP address 'xxx.xxx.xxx.xxx'. But I think that IP is dynamic; it may change, and then some day this will stop working.
So, which is the correct way to allow access to the client Azure SQL database?
If you are configuring elastic queries (cross-database queries) in Azure SQL Database, it should work if you set "Allow access to Azure services" to "On" in the Azure SQL Database firewall rules. This way elastic queries won't fail when Azure SQL databases change their IP address.
A more restricted way to configure the Azure SQL Database firewall to allow elastic queries does not currently exist. You can vote for this feature here so it can be considered by the Azure SQL Database team in the future.
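For reference, the sp_set_firewall_rule procedure mentioned in the error message is run on the master database; the special 0.0.0.0 range below corresponds to the "Allow access to Azure services" toggle (the rule name is a placeholder):

```sql
-- Run on the master database of the logical server.
-- The 0.0.0.0 - 0.0.0.0 range is the special rule that admits Azure services.
EXECUTE sp_set_firewall_rule
    @name = N'AllowAllWindowsAzureIps',
    @start_ip_address = '0.0.0.0',
    @end_ip_address = '0.0.0.0';
```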

Copy azure database from one subscription to another on azure portal [duplicate]

We are about to split our testing and production instances in Windows Azure into two separate subscriptions. Currently we have 3 Windows Azure SQL Database instances that reside within the same subscription:
Production
Reporting
Testing
In order to completely isolate production we are splitting these into:
Production Subscription
Production
Reporting
Testing Subscription
Testing
At the moment we use the CREATE DATABASE X AS COPY OF [ServerName].Y command to copy databases from production to testing before we obfuscate the live data. This operation can be performed so long as the databases are geo-located in the same data centre and we have a shared login across the instances that created the database in the first place (As indicated by this article).
However, the article does not indicate whether the source and destination instances need to belong to the same subscription. Are we able to copy the database between the production subscription and the testing subscription (and vice versa), assuming we use a consistent login?
You can just do a backup (Export) to blob storage and then Import it in the new subscription.
http://msdn.microsoft.com/en-us/library/f6899710-634e-425a-969d-8db1267e9471
Update:
If you can use SSMS, this answer is right. I only want to add some details.
You can export the source database into storage in the Azure portal.
After exporting, you can find the .bacpac file.
Open SSMS and connect to the destination server.
Right-click the Databases node and select Import Data-tier Application.
Then you can choose to import the database from local disk or Azure Storage.
After that, you have copied the database from source to destination.
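The export step can also be scripted rather than done in the portal; a sketch using New-AzSqlDatabaseExport from the Az.Sql module, where every name, the storage key and the credentials are placeholders:

```powershell
# Assumption: Az.Sql module, placeholder names; $storageKey and $adminPassword
# (a SecureString) must be supplied by the caller.
New-AzSqlDatabaseExport -ResourceGroupName "rgSource" -ServerName "sourceserver" `
    -DatabaseName "Production" `
    -StorageKeyType "StorageAccessKey" -StorageKey $storageKey `
    -StorageUri "https://mystorage.blob.core.windows.net/backups/Production.bacpac" `
    -AdministratorLogin "sqladmin" -AdministratorLoginPassword $adminPassword
```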
For anyone landing here, it does appear to be possible to use CREATE DATABASE newDB AS COPY OF [server].[olddb] ( OPTION [, OPTION ...] ) even when the servers are in different subscriptions.
See more at Create Database (Azure SQL Database) - MSDN
Example from MS Docs:
CREATE DATABASE db_copy AS COPY OF ozabzw7545.db_original ( SERVICE_OBJECTIVE = 'P2' ) ;
In my setup I have the admin account and password (login) the same on both servers - that probably helps.
The operation will fail if you don't have admin permissions on the original server.
I have found through testing that I am not able to change the Edition from Standard to Premium despite including the 'Edition' option - I'm not sure why that is.
I have created copies of databases across Azure subscriptions successfully.
Here are the steps -
On the target Azure subscription, create a database server (if you haven't already created one), and on that server create a new DB (any name; it doesn't matter) with the same password as the source database on your source Azure subscription. For me it didn't work with different passwords, so I just went ahead with using the same one, but I am sure there is a way to make it work with different passwords as well.
Run this on the newly created database in your target Azure subscription:
CREATE DATABASE NEWDBNAME
AS COPY OF [Source Azure Server Name here].[source DB]
Let Azure handle the new DB pricing tier (Basic, Standard etc), because you can immediately change it from the portal after the DB is created. In my case the target DB was created with the same pricing tier as the source DB.
Also, server names in Azure are usually NAME.database.windows.net, so in the source server name above, just put NAME.
Now on your target Azure subscription you will have 2 databases on the new DB server: the one created in step 1 and the one from step 2, which is the actual copy. You can go ahead and safely delete the one you don't need.
If you want to copy other source DBs to the same target server created in step 1, just run the same command again.
I guess you already have a solution; however, for anyone landing here, you can use the Azure PowerShell APIs to create a new server in the source subscription, create a copy, and switch the new server over to the destination subscription.
Sample code is available on TechNet.
The code is self-explanatory; however, in the interest of SO best practices, the key portions of the code are:
Create a new server:
$newserver = New-AzureSqlDatabaseServer -Location $targetServerLocation -AdministratorLogin $targetServerLoginID -AdministratorLoginPassword $targetServerLoginPassword
Create a database copy:
Start-AzureSqlDatabaseCopy -ServerName $sourceServerName -DatabaseName $sourceDatabaseName -PartnerServer $newserver.ServerName -PartnerDatabase $targetdatabaseName
Transfer the server:
$uri = "https://management.core.windows.net:8443/" + $sourceSubscriptionID + "/services" + "/sqlservers/servers/" + $newserver.ServerName + "?op=ChangeSubscription"
Invoke-RestMethod -Uri $uri -CertificateThumbPrint $certThumbprint -ContentType $contenttype -Method $method -Headers $headers -Body $body
You can do this in SSMS on the target server using
CREATE DATABASE db1 AS COPY OF sourcesrv.db1
to copy from sourcesrv.database.windows.net which is in a different subscription.
However, you must first check you can connect in SSMS to the SOURCE server too, or you will get a totally confusing error message which hides the actual problem.
The source server may be one you regularly connect to, but not from the IP address you're currently on. In that case you must add the IP to the server's firewall rules. This is easily done using the dialog that appears when you try to connect from SSMS:
Leave the default radiobutton checked ("Add my client IP") and press OK.
If you omit this check and it fails to authenticate you, instead of telling you the correct reason as above, it tells you you can't make a copy on the SOURCE server!
--In SSMS connected to targetsrv:
CREATE DATABASE db1 AS COPY OF sourcesrv.db1
--Here it should say, "Your client IP address does not have access" to sourcesrv, like when
--you try to connect in SSMS. Instead, it says you can't copy to the SOURCE, even though you
--told it to copy FROM the source to the TARGET:
--Msg 45137, Level 16, State 1, Line 7
--Insufficient permission to create a database copy on server 'sourcesrv'.
Note that at the time of writing, the two servers need to be configured with the same admin credentials, otherwise the CREATE DATABASE command will fail with this same confusing error message.
I understand that this is quite an old question, but I still wanted to add yet another option.
If you want to have this task automated, don't want to do it manually (export/import), want to copy the database to an existing server (and not create a new temporary one that gets moved across subscriptions), and don't want to have the same credentials on the source and target servers because of security considerations, you can use ARM.
There is an option to create a database as a copy ("createMode": "Copy"), and it works across subscriptions! A simple example:
{
  "$schema": "http://schema.management.azure.com/schemas/2014-04-01-preview/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {},
  "resources": [
    {
      "apiVersion": "2014-04-01-preview",
      "location": "australiaeast",
      "name": "{DESTINATION-SERVER-NAME}/{DESTINATION-DATABASE-NAME}",
      "properties": {
        "createMode": "Copy",
        "sourceDatabaseId": "/subscriptions/{SOURCE-SUBSCRIPTION-ID}/resourceGroups/{SOURCE-RESOURCE-GROUP-NAME}/providers/Microsoft.Sql/servers/{SOURCE-SERVER-NAME}/databases/{SOURCE-DATABASE-NAME}",
        "requestedServiceObjectiveName": "S2"
      },
      "type": "Microsoft.Sql/servers/databases"
    }
  ]
}
Two things to note: the service principal that executes this deployment will need to have Contributor access on the source, and the sourceDatabaseId is a full resource ID.
If you do it from Azure DevOps using the "Azure resource group deployment" task, it will create an SP for the subscription. You will need to give it Contributor access. The SP can be found in Project Settings -> Service Connections.
There is a simpler solution, which maybe wasn't available when this question was answered. No SSMS or PowerShell needed; it can all be done in the portal. Go to the source SQL database and click Export. This will create a .bacpac file in Azure Storage. Go to the target SQL server and click Import. Done.
Note 1: if the target SQL Server is in a different account/subscription that cannot access the source account's Azure Storage, just manually download the file from the source Azure Storage and upload it to an Azure Storage instance that the target can access.
Note 2: the imported database will have a name that includes the export date. You can change the name by running ALTER DATABASE [dbname] MODIFY NAME = [newdbname] on the target database. You can even do this in the portal using the new Query Editor.

How to connect to a SQL Azure DB from a hosted build server for running tests

We wish to implement CI using a TFS / Visual Studio Online-hosted build server. To run our unit/integration tests the build server needs to connect to a SQL Azure DB.
We've hit a stumbling block here because SQL Azure DBs use an IP address whitelist.
My understanding is that the hosted build agent is a VM which is spun up on demand, which almost certainly means that we can't determine its IP address beforehand, or guarantee that it will be the same for each build agent.
So how can we have our hosted build agent run tests which connect to our IP-address-whitelisted SQL DB? Is it possible to programmatically add an IP to the whitelist and then remove it at the end of testing?
After a little research I found this (the sample uses PowerShell):
Log in to your Azure account.
Select the relevant subscription.
Then:
New-AzureRmSqlServerFirewallRule -EndIpAddress 1.0.0.1 -FirewallRuleName test1 -ResourceGroupName testrg-11 -ServerName mytestserver111 -StartIpAddress 1.0.0.0
To remove it:
Remove-AzureRmSqlServerFirewallRule -FirewallRuleName test1 -ServerName mytestserver111 -ResourceGroupName testrg-11 -Force
This was found in PowerShell ISE for Windows. Alternatively, there should be something similar using the cross-platform CLI if you are not running on a Windows machine.
There is an Azure PowerShell task/step that you can use to call Azure PowerShell cmdlets (e.g. New-AzureRmSqlServerFirewallRule).
On the other hand, you can manage server-level firewall rules through the REST API, so you can build a custom build/release task that gets the necessary information (e.g. authentication) from the selected Azure service endpoint and then calls the REST API to add or remove firewall rules.
The SqlAzureDacpacDeployment task has source code that adds firewall rules through the REST API, which you can refer to: part of the SqlAzureDacpacDeployment source code, VstsAzureRestHelpers_.psm1 source code.
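As a sketch of the REST approach (all resource names are placeholders; the linked task source handles obtaining the bearer token from the service endpoint):

```powershell
# Assumption: $token holds a valid ARM access token; all resource names are placeholders.
$uri = "https://management.azure.com/subscriptions/$subscriptionId" +
       "/resourceGroups/testrg-11/providers/Microsoft.Sql/servers/mytestserver111" +
       "/firewallRules/test1?api-version=2014-04-01"
$body = @{ properties = @{ startIpAddress = "1.0.0.0"; endIpAddress = "1.0.0.1" } } |
    ConvertTo-Json
# Create (or update) the rule before the tests run:
Invoke-RestMethod -Method Put -Uri $uri -ContentType "application/json" `
    -Headers @{ Authorization = "Bearer $token" } -Body $body
# Delete it again when the build finishes:
Invoke-RestMethod -Method Delete -Uri $uri -Headers @{ Authorization = "Bearer $token" }
```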
There is now an "Azure SQL InlineSqlTask" build task which you can use to automatically set firewall rules on the Azure server. Just make sure "Delete Rule After Task Ends" is not checked, and add some dummy query like "select top 1 * from ..." as the "Inline SQL Script".
