I am creating a simple Azure logic app that uses a function to:
Delete a slave database
Restore a copy of a master database (with the same name as the removed slave)
Remove Database
# Remove slave database
Remove-AzSqlDatabase `
-DatabaseName $RestoreDatabaseName `
-ServerName $ServerName `
-ResourceGroupName $ResourceGroupName
Write-Host "Removed slave database"
Restore PIT Backup of Master
# Restore database
Restore-AzSqlDatabase `
-FromPointInTimeBackup `
-PointInTime (Get-Date).AddMinutes(-2) `
-ResourceGroupName $ResourceGroupName `
-ServerName $ServerName `
-TargetDatabaseName $RestoreDatabaseName `
-ResourceId $Database.ResourceID `
-ElasticPoolName $ElasticPoolName
The issue I am having is that after removing the database, Azure still sees the database on the server, so I get the following error when restoring:
The destination database name 'Slave' already exists on the server
'server address'.
I can't find any way to check whether the database has been fully removed before starting the next function. Any help on how to achieve this would be greatly appreciated.
You can use Get-AzSqlDatabase to check if the DB is still in play.
Get-AzSqlDatabase -ResourceGroupName "ResourceGroup01" -ServerName "Server01" -DatabaseName "Database02"
Placing this in a loop with a sleep will give you a poll to check when the DB is finally gone for good and you can then resume your processing.
Start-Sleep -s 15
Make sure you have a circuit breaker in your logic to prevent an endless loop in the case of a failed deletion.
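Putting those pieces together, the poll plus circuit breaker might look like this (a sketch; the variable names follow the question's script, and the 20-attempt cap is an arbitrary choice):

```powershell
# Poll until the slave database is no longer listed, giving up after a cap
$maxAttempts = 20
for ($attempt = 0; $attempt -lt $maxAttempts; $attempt++) {
    # Get-AzSqlDatabase errors when the database is gone, so suppress and test for $null
    $db = Get-AzSqlDatabase -ResourceGroupName $ResourceGroupName `
            -ServerName $ServerName `
            -DatabaseName $RestoreDatabaseName `
            -ErrorAction SilentlyContinue
    if (-not $db) { break }   # database fully removed; safe to restore
    Start-Sleep -s 15
}
if ($db) {
    throw "Circuit breaker: '$RestoreDatabaseName' still exists after $maxAttempts checks."
}
```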
It may be easier to restore your DB under a new name to avoid the delay, e.g. MyDb<yyyymmdd>
Or alternatively, use the Azure REST API to delete the SQL database:
DELETE https://management.azure.com/subscriptions/00000000-1111-2222-3333-444444444444/resourceGroups/Default-SQL-SouthEastAsia/providers/Microsoft.Sql/servers/testsvr/databases/testdb?api-version=2017-10-01-preview
and monitor the Location header of the 202 Accepted response to determine when the database has been completely removed. Azure Durable Functions give you a great monitor pattern you can use.
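A rough sketch of that flow in PowerShell (the subscription, resource group, server and database names are placeholders, and the DELETE may also return 200 if it completes synchronously):

```powershell
# Acquire an ARM token from the signed-in Az context and issue the DELETE
$token   = (Get-AzAccessToken -ResourceUrl "https://management.azure.com/").Token
$headers = @{ Authorization = "Bearer $token" }
$uri = "https://management.azure.com/subscriptions/$subscriptionId" +
       "/resourceGroups/$resourceGroup/providers/Microsoft.Sql" +
       "/servers/$serverName/databases/$databaseName?api-version=2017-10-01-preview"

$resp = Invoke-WebRequest -Method Delete -Uri $uri -Headers $headers
if ($resp.StatusCode -eq 202) {
    # Async delete: poll the operation URL from the Location header until it returns 200
    $pollUrl = $resp.Headers['Location']
    do {
        Start-Sleep -Seconds 10
        $poll = Invoke-WebRequest -Uri $pollUrl -Headers $headers
    } until ($poll.StatusCode -eq 200)
}
```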
I'm trying to convert from New-AzureRmSqlDatabaseCopy to New-AzSqlDatabaseCopy, but I'm getting an error when trying to copy a database to the same server:
Microsoft.Rest.Azure.CloudException: Long running operation failed
with status 'Failed'. Additional Info:'The sku 'ElasticPool' specified
is invalid.'
New-AzSqlDatabaseCopy `
-ResourceGroupName $resourceGroupName `
-ServerName $serverName `
-DatabaseName $templateDbName `
-CopyDatabaseName $newDbName `
-CopyServerName $serverName `
-CopyResourceGroupName $resourceGroupName
The database copies successfully when using New-AzureRmSqlDatabaseCopy, and I can't figure out why New-AzSqlDatabaseCopy would behave differently. I've tried specifying the -ElasticPoolName and -ServiceObjectiveName parameters, but no luck.
I don't know if this is relevant, but I'm running the PowerShell from an Azure runbook.
Make sure that the new elastic pool has the same name as the old elastic pool, and that the SKUs of both elastic pools match. If the resource groups are in different regions, the SKU (Standard?) of the elastic pools may differ, causing issues while copying the database.
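If the copy is supposed to land in an elastic pool, one thing worth trying is naming the target pool explicitly (a sketch; $targetPoolName is an assumed variable for a pool that already exists on the copy server with a matching SKU):

```powershell
New-AzSqlDatabaseCopy `
    -ResourceGroupName $resourceGroupName `
    -ServerName $serverName `
    -DatabaseName $templateDbName `
    -CopyResourceGroupName $resourceGroupName `
    -CopyServerName $serverName `
    -CopyDatabaseName $newDbName `
    -ElasticPoolName $targetPoolName   # pool must already exist on the copy server
```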
I have 2 SQL servers under the same subscription in Azure, and both have an Azure Synapse DB on them. I want to move/copy one of the Synapse DBs to the other server. I cannot find any documentation on how to do this online; everything I have found refers to a normal SQL DB, and the copy T-SQL or the method doesn't seem to work. Can anyone refer me to an article explaining how I do this, or explain how I do this in Azure?
Kind Regards
Glyn
The simplest way to do this is going to be through restore points, similar to how you might do it with a backup file in SQL Server. You can create a restore point either in the portal or via PowerShell.
$SubscriptionName="<YourSubscriptionName>"
$ResourceGroupName="<YourResourceGroupName>"
$ServerName="<YourServerNameWithoutURLSuffixSeeNote>" # Without database.windows.net
$DatabaseName="<YourDatabaseName>"
$Label = "<YourRestorePointLabel>"
Connect-AzAccount
Get-AzSubscription
Select-AzSubscription -SubscriptionName $SubscriptionName
# Create a restore point of the original database
New-AzSqlDatabaseRestorePoint -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName -RestorePointLabel $Label
Once the restore point is created, it is possible to restore it to a different server, again using the portal or a PowerShell script. The servers don't need to be in the same resource group for this to work.
$SubscriptionName="<YourSubscriptionName>"
$ResourceGroupName="<YourResourceGroupName>"
$ServerName="<YourServerNameWithoutURLSuffixSeeNote>" # Without database.windows.net
#$TargetResourceGroupName="<YourTargetResourceGroupName>" # uncomment to restore to a different server.
#$TargetServerName="<YourtargetServerNameWithoutURLSuffixSeeNote>"
$DatabaseName="<YourDatabaseName>"
$NewDatabaseName="<YourDatabaseName>"
Connect-AzAccount
Get-AzSubscription
Select-AzSubscription -SubscriptionName $SubscriptionName
# Or list all restore points
Get-AzSqlDatabaseRestorePoint -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName
# Get the specific database to restore
$Database = Get-AzSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName
# Pick desired restore point using RestorePointCreationDate "xx/xx/xxxx xx:xx:xx xx"
$PointInTime="<RestorePointCreationDate>"
# Restore database from a restore point
$RestoredDatabase = Restore-AzSqlDatabase -FromPointInTimeBackup -PointInTime $PointInTime -ResourceGroupName $Database.ResourceGroupName -ServerName $Database.ServerName -TargetDatabaseName $NewDatabaseName -ResourceId $Database.ResourceID
# Use the following command to restore to a different server
#$TargetResourceGroupName = $Database.ResourceGroupName # for restoring to different server in same resourcegroup
#$RestoredDatabase = Restore-AzSqlDatabase -FromPointInTimeBackup -PointInTime $PointInTime -ResourceGroupName $TargetResourceGroupName -ServerName $TargetServerName -TargetDatabaseName $NewDatabaseName -ResourceId $Database.ResourceID
# Verify the status of restored database
$RestoredDatabase.status
This doesn't cover things like server firewall settings, so if you want those to match you'll have to move that separately. The server is Azure SQL, so guides for migrating Azure SQL firewalls should also apply to Synapse dedicated pools.
If the number of tables is small, then the copy activity in Pipelines might be a good backup option, but since it requires one activity per table, setting it up for a large database will be significantly more complex than using restore points.
I'm new to Azure and I have been asked to generate a daily report of added/changed databases, tables, or columns in Azure SQL Server. Specifically, we want to know when new databases are created, when new tables are added or existing ones altered, and the same for columns. In non-Azure SQL Server you can use the Schema Changes History standard report, but this is not available in Azure.
I have seen some suggestions that you can do this with Extended Events, but I've not found any resources showing how it can be done. Any help would be appreciated.
DDL Triggers can really help you keep track of schema changes in your Azure SQL Database. Below is an example (CREATE_TABLE is included so newly added tables are caught as well):
CREATE TRIGGER safety
ON DATABASE
FOR CREATE_TABLE, ALTER_TABLE, DROP_TABLE
AS
PRINT 'Save change on a log'
SELECT EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)');
Database Auditing can also help with schema changes and dropped objects. Below is an example:
# -ServerName is the instance name without the .database.windows.net suffix
Set-AzureRmSqlDatabaseAuditing `
-State Enabled `
-ResourceGroupName "resourcegroupname" `
-ServerName "sqlinstancename" `
-StorageAccountName "storageaccountname" `
-DatabaseName "dbname" `
-AuditActionGroup 'SCHEMA_OBJECT_CHANGE_GROUP' `
-RetentionInDays 8 `
-AuditAction "DELETE ON schema::dbo BY [public]"
If you want to track DML operations as well, you can use Temporal Tables, but Database Auditing can help here too:
Set-AzureRmSqlDatabaseAuditing -ResourceGroupName "resourceGroup" `
-ServerName "SQL Server Name" -DatabaseName "AdventureWorksLT" `
-StorageAccountName "storageAccount" `
-AuditActionGroup "SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP", "FAILED_DATABASE_AUTHENTICATION_GROUP", "BATCH_COMPLETED_GROUP" `
-AuditAction "UPDATE ON database::[AdventureWorksLT] BY [public]" `
-RetentionInDays 60
I'm trying to clone an existing Azure SQL DB that's in an elastic pool to a standard SQL server in a different resource group. Whenever I run (with the Az PowerShell module)
Restore-AzSqlDatabase -FromPointInTimeBackup -PointInTime (Get-Date) -ResourceGroupName $TargetRGName -ServerName $TargetServerName -TargetDatabaseName $TargetDBName -ResourceId $Database.ResourceID
I get the error Long running operation failed with status 'Failed'. Additional Info:'An unexpected error occured while processing the request.
Your script uses point-in-time restore to restore the database, but point-in-time restore cannot restore a database to a different server. For more details, please refer to https://learn.microsoft.com/en-us/azure/sql-database/sql-database-recovery-using-backups#point-in-time-restore.
So if you want to restore the database to a different server, I suggest you use geo-restore. With geo-restore, you can restore a SQL database to any server in any Azure region from the most recent geo-replicated backups. For further information, read the official documentation. Regarding how to implement it with PowerShell, please refer to the following script:
Connect-AzAccount
# get geo backup
$GeoBackup = Get-AzSqlDatabaseGeoBackup -ResourceGroupName "ResourceGroup01" -ServerName "Server01" -DatabaseName "Database01"
#restore database
Restore-AzSqlDatabase -FromGeoBackup -ResourceGroupName "TargetResourceGroup" -ServerName "TargetServer" -TargetDatabaseName "RestoredDatabase" -ResourceId $GeoBackup.ResourceID -Edition "Standard" -RequestedServiceObjectiveName "S2"
I have an Azure PowerShell script that I have used to create a .bacpac file from an Azure SQL database. It works fine on a basic test database (AdventureWorks), but now that I am trying to use it on a database which contains encrypted stored procedures I am getting the following error:
Error SQL71564: Error validating element [dbo].[encryptedSPROCName]: The element [dbo].[encryptedSPROCName] cannot be
deployed as the script body is encrypted.
The stored procedures were created externally, so I do not have access to try to decrypt them.
The code block that I am using is:
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
-DatabaseName $CopyDatabaseName -StorageKeytype $StorageKeytype -StorageKey $StorageKey -StorageUri $BacpacUri `
-AdministratorLogin $Administratorlogin -AdministratorLoginPassword $AdministratorLoginPassword
Is there a way using PowerShell to omit the encrypted stored procedures when executing the backup?
Excluding specific objects during export/import is not supported.
However, you can exclude certain object types using sqlpackage.exe with the ExcludeObjectTypes property (e.g. /p:ExcludeObjectTypes=StoredProcedures).
More options and info here - Export Parameters and Properties
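As a rough sketch (the server, database, credential, and path values are placeholders, and you should verify that ExcludeObjectTypes is supported for the action in your SqlPackage version before relying on it):

```powershell
# Hypothetical invocation; check your SqlPackage version's docs for property support
SqlPackage /Action:Export `
  /SourceServerName:"yourserver.database.windows.net" `
  /SourceDatabaseName:"yourdb" `
  /SourceUser:"youradmin" /SourcePassword:"yourpassword" `
  /TargetFile:"C:\backups\yourdb.bacpac" `
  /p:ExcludeObjectTypes=StoredProcedures
```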
Hope this helps!