I have an Azure SQL database. Is it possible to find out who created a constraint on a table? Or at least when it was added? If yes, how can I do that? Are there any scripts/tools for that purpose?
Thanks in advance.
Azure SQL has a feature named Auditing. If it is enabled on the server and/or the database, you can define a storage account to send the "Server Audit" and "Database Audit" logs to. In Azure Storage, auditing logs are saved as a collection of blob files within a container named sqldbauditlogs. Using Power BI, for example, you can view the audit log data.
If this feature is not enabled, I think you will struggle to find your user unless the database is accessed using Azure AD identities.
Please note that Advanced Threat Detection will alert you on unusual access patterns. A least-privilege approach to access is recommended.
Ref:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-auditing
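If auditing is already writing to a storage account, here is a minimal sketch for listing the audit blobs in that sqldbauditlogs container (it assumes the Az.Storage module; the resource group and storage account names are placeholders):
# Minimal sketch, assuming the Az.Storage module and an existing audit storage account;
# "resourcegroupname" and "auditstorageaccount" are placeholders for your own names.
$storage = Get-AzStorageAccount -ResourceGroupName "resourcegroupname" -Name "auditstorageaccount"
# Audit logs are written as .xel blobs under the sqldbauditlogs container
Get-AzStorageBlob -Container "sqldbauditlogs" -Context $storage.Context |
    Select-Object Name, LastModified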
Maybe you can use below query to find out when the constraint created from all the SQL execution records.
-- Replace constraint_name with the actual name of your constraint.
-- Note: this only finds the statement while its plan is still in the plan cache.
SELECT TOP 1000
    QS.creation_time,
    SUBSTRING(ST.text, (QS.statement_start_offset / 2) + 1,
        ((CASE QS.statement_end_offset
              WHEN -1 THEN DATALENGTH(ST.text)
              ELSE QS.statement_end_offset
          END - QS.statement_start_offset) / 2) + 1
    ) AS statement_text,
    ST.text,
    QS.total_worker_time,
    QS.last_worker_time,
    QS.max_worker_time,
    QS.min_worker_time
FROM
    sys.dm_exec_query_stats QS
CROSS APPLY
    sys.dm_exec_sql_text(QS.sql_handle) ST
WHERE
    ST.text LIKE '%constraint_name%'
ORDER BY
    QS.creation_time DESC
This query may take some time to run.
Hope this helps.
If you enable Azure SQL Auditing you can try the following using PowerShell.
# ServerName is the logical server name only (e.g. ssqlinstancename for ssqlinstancename.database.windows.net)
Set-AzureRmSqlDatabaseAuditing `
    -State Enabled `
    -ResourceGroupName "resourcegroupname" `
    -ServerName "ssqlinstancename" `
    -StorageAccountName "storageaccountname" `
    -DatabaseName "dbname" `
    -AuditActionGroup 'SCHEMA_OBJECT_CHANGE_GROUP' `
    -RetentionInDays 8 `
    -AuditAction "CREATE ON schema::dbo BY [public]"
I created an Azure SQL Database and configured geo-replication to a second server in a different region. In the Azure Portal, I can click on either of the databases and see details about the regions being replicated to.
I want to use PowerShell to find this same information, but cannot find a cmdlet or property that exposes this information:
# Get database object
$database = Get-AzSqlDatabase -ResourceGroupName test-rg -ServerName testsql-eastus -DatabaseName TestDB
# Find if geo-replication is enabled?
The goal is to be able to pull all SQL databases in a subscription and take different action on them depending on whether they have geo-replication enabled.
Please refer to the documentation for Get-AzSqlDatabaseFailoverGroup:
Gets a specific Azure SQL Database Failover Group or lists the
Failover Groups on a server. Either server in the Failover Group may
be used to execute the command. The returned values will reflect the
state of the specified server with respect to the Failover Group.
Example:
You can run Get-AzSqlDatabaseFailoverGroup -ResourceGroupName 'rg' -ServerName 'servername' to see whether the databases on the Azure SQL server are configured for geo-replication. If no failover group name is returned, then the database does not have geo-replication enabled.
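To cover a whole subscription, a minimal sketch built on that cmdlet could look like the following (it assumes the Az.Sql module and an active Connect-AzAccount session; the DatabaseNames property used to list a group's member databases is an assumption):
# Sketch: flag every SQL database in the subscription depending on whether it
# belongs to a failover group. Assumes Az.Sql is installed and you are logged in;
# the DatabaseNames property name is an assumption.
foreach ($server in Get-AzSqlServer) {
    $fgDatabases = Get-AzSqlDatabaseFailoverGroup `
        -ResourceGroupName $server.ResourceGroupName `
        -ServerName $server.ServerName |
        ForEach-Object { $_.DatabaseNames }

    Get-AzSqlDatabase -ResourceGroupName $server.ResourceGroupName -ServerName $server.ServerName |
        Where-Object { $_.DatabaseName -ne 'master' } |
        ForEach-Object {
            $replicated = $fgDatabases -contains $_.DatabaseName
            "{0}/{1}: in a failover group = {2}" -f $server.ServerName, $_.DatabaseName, $replicated
        }
}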
Is there any way to get the size of a database in Azure SQL Managed Instance using the Azure management API (Microsoft.Azure.Management.ResourceManager.Fluent.dll)?
Or maybe with another API?
Running SQL queries is not an option for us, because we cannot connect to the SQL Server directly.
Thank You.
You can use the Managed Instances - Get API to retrieve information about an instance; the response includes the size in the properties.storageSizeInGB property.
Check this sample request for more reference.
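For reference, here is a minimal sketch of calling that REST API from PowerShell with Invoke-AzRestMethod; the subscription id, resource group, instance name and api-version below are placeholders/assumptions:
# Minimal sketch: call the Managed Instances - Get REST API via the Az PowerShell module.
$path = "/subscriptions/<subscription-id>/resourceGroups/<resource-group>" +
        "/providers/Microsoft.Sql/managedInstances/<instance-name>?api-version=2021-11-01"
$response = Invoke-AzRestMethod -Method GET -Path $path
($response.Content | ConvertFrom-Json).properties.storageSizeInGB   # provisioned storage, in GB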
We can certainly get details from Azure Metrics. I haven't tried it for Managed Instance though.
Please check this: https://learn.microsoft.com/en-us/powershell/module/az.monitor/get-azmetric?view=azps-5.7.0
Something like this (taken from https://fonsecasergio.wordpress.com/2019/03/27/how-to-get-azure-sql-database-size/):
# Get-TotalDatabaseSizeKb is a helper function defined in the blog post linked above;
# it is not a built-in cmdlet.
$Databases = Get-AzureRmResource -ResourceGroupName "GROUPNAME" -ResourceType Microsoft.Sql/servers/databases
foreach ($DB in $Databases)
{
    $DBSize = Get-TotalDatabaseSizeKb $DB
    "DB ($($DB.Name)) $($DBSize)Kb or $($DBSize / 1024)Mb"
}
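A rough equivalent with the current Az modules calls Get-AzMetric directly; in the sketch below the resource names are placeholders and the "storage" metric name (data space used, in bytes, for an Azure SQL database) is an assumption:
# Rough sketch: query the size metric for a single Azure SQL database.
# Subscription id, resource group, server and database names are placeholders,
# and the "storage" metric name is an assumption.
$resourceId = "/subscriptions/<subscription-id>/resourceGroups/GROUPNAME" +
              "/providers/Microsoft.Sql/servers/<servername>/databases/<dbname>"
$metric = Get-AzMetric -ResourceId $resourceId -MetricName "storage" -AggregationType Maximum
$sizeKb = ($metric.Data | Select-Object -Last 1).Maximum / 1KB
"DB (<dbname>) $($sizeKb)Kb or $($sizeKb / 1024)Mb"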
Try this PowerShell:
Get-AzSqlInstance -ResourceGroupName "ResourceGroupOfYourSQLMI"
https://learn.microsoft.com/en-us/powershell/module/az.sql/get-azsqlinstance?view=azps-5.7.0
and this one:
Get-AzSqlInstanceDatabase -InstanceName "managedInstance1" -ResourceGroupName "resourcegroup01"
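If it helps, the instance object returned by Get-AzSqlInstance exposes the same storage size mentioned above (StorageSizeInGB), so a short sketch with placeholder names could be:
# Read the provisioned storage size straight from the managed instance object.
$instance = Get-AzSqlInstance -ResourceGroupName "resourcegroup01" -Name "managedInstance1"
$instance.StorageSizeInGB   # provisioned storage for the managed instance, in GB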
I'm new to Azure and I have been asked to generate a daily report of added/changed databases, tables or columns in Azure SQL Server. Specifically, we want to know when new databases are created, new tables are added or existing ones altered, and the same for columns. In non-Azure SQL Server you can use the Schema Changes History standard report, but this is not available in Azure.
I have seen some suggestions you can do this with Extended Events but I've not found any resources to show how this can be done. Any help would be appreciated.
DDL Triggers can really help you keep track of schema changes on your Azure SQL Database. Below is an example:
CREATE TRIGGER safety
ON DATABASE
FOR DROP_TABLE, ALTER_TABLE
AS
    PRINT 'Save change on a log'
    -- EVENTDATA() returns XML describing the DDL event; extract the executed T-SQL text
    SELECT EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)');
Database Auditing can also help with schema changes and dropped objects. Below is an example:
# ServerName is the logical server name only (e.g. ssqlinstancename for ssqlinstancename.database.windows.net)
Set-AzureRmSqlDatabaseAuditing `
    -State Enabled `
    -ResourceGroupName "resourcegroupname" `
    -ServerName "ssqlinstancename" `
    -StorageAccountName "storageaccountname" `
    -DatabaseName "dbname" `
    -AuditActionGroup 'SCHEMA_OBJECT_CHANGE_GROUP' `
    -RetentionInDays 8 `
    -AuditAction "DELETE ON schema::dbo BY [public]"
If you also want to track DML operations, you can use Temporal Tables, but Database Auditing can help here as well:
Set-AzureRmSqlDatabaseAuditing -ResourceGroupName "resourceGroup" `
    -ServerName "SQL Server Name" -DatabaseName "AdventureWorksLT" `
    -StorageAccountName "storageAccount" `
    -AuditActionGroup "SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP", "FAILED_DATABASE_AUTHENTICATION_GROUP", "BATCH_COMPLETED_GROUP" `
    -AuditAction "UPDATE ON database::[AdventureWorksLT] BY [public]" `
    -RetentionInDays 60
I am working in Microsoft Azure, where I have a resource group for a test environment and one for a production environment; in both I have an Azure SQL Database server with its respective database.
I am creating an Automation Account runbook in PowerShell in another Microsoft Azure account (important note) that is responsible for "copying" the production database to test. I know there is the New-AzSqlDatabaseCopy command; however, this command does not work with Hyperscale databases.
Is there an alternative to this command for Hyperscale? Or is it possible, from this second account, to create a .bacpac remotely with Azure PowerShell commands? Everything I have seen works within the same account, but the database account is different from the automation account for billing reasons.
Thank you in advance for your help and comments.
I already tried to use the New-AzureRmSqlDatabaseExport command, but it seems to work only in the same Azure Account, and I can't specify "Azure Account for backup" and "Azure account for storage". Am I right?
As Alberto Morillo says in his comment, New-AzSqlDatabaseCopy is currently not available for Azure SQL Hyperscale, at least at the moment of this answer.
So I tried to use New-AzureRmSqlDatabaseExport with two Azure accounts and it is totally possible: you need to log in with the Azure account of the origin database (Connect-AzureRmAccount) and then call the New-AzureRmSqlDatabaseExport command with the following parameters.
# Run this from the Azure account that hosts the source database; the storage
# parameters point at the destination storage account in the other Azure account.
$exportParams = @{
    ResourceGroupName          = $RGName             # Resource group of the source database
    ServerName                 = $Server             # Server name of the source database
    DatabaseName               = $Database           # Name of the source database
    AdministratorLogin         = $User               # Administrator user of the source database
    AdministratorLoginPassword = $Pwd                # Password of the source database (as a SecureString)
    StorageKeyType             = "StorageAccessKey"  # Key type of the destination storage account (the one of the other Azure account)
    StorageKey                 = $StorageKey         # Key of the destination storage account (the one of the other Azure account)
    StorageUri                 = $StorageFileFullURI # Full file URI of the destination storage (the one of the other Azure account)
    # The format of the URI file is the following:
    # https://contosostorageaccount.blob.core.windows.net/backupscontainer/backupdatabasefile.bacpac
}
$exportRequest = New-AzureRmSqlDatabaseExport @exportParams
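The export runs asynchronously; the request object returned above can be polled for progress, for example:
# $exportRequest is the object returned by New-AzureRmSqlDatabaseExport above;
# its OperationStatusLink can be polled until the export reports completion.
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink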
Unfortunately, this command is not enabled for Hyperscale, so I get the following error message:
New-AzureRmSqlDatabaseExport : 40822: This feature is not available for the selected database's edition (Hyperscale).
I used the same command with a database that was not Hyperscale and it worked perfectly.
Finally, I think I will have to perform the manual process for at least a few months, until Microsoft releases the update for Hyperscale.
Database copy is currently not available for Azure SQL Hyperscale but you may see it in public preview in a few months.
I am attempting to back up and restore a database located in Azure SQL Database via Azure blob storage. To do this I ran Export Data-Tier Application... on the selected database and successfully stored it in a blob container as a BACPAC file. Now I am trying to do the reverse and run Import Data-Tier Application... to check the backup process functions correctly; however, I receive the following error:
'The server principal "username" is not able to access database
"Database A" under the current security context'
The database it is referencing is the first listed database on the server in which I am trying to create a new database, and I don't have permissions for it; each time I give myself permission to access it, the error just moves down the list to the next database and blocks me again. I don't understand why I need permissions for databases A, B, C... to create a new database which is a copy of D, especially when I have the db_manager role and so shouldn't have an issue with database creation.
Does this error indicate I am doing something wrong with the backup import, or is this a known issue and I need to have permissions for all databases in a server where I wish to import a backup to?
Please update SQL Server Management Studio (SSMS) to the latest version and try to export/import with the latest SSMS.
You can achieve the same using the Azure portal. Open the appropriate database server page and then, on the toolbar, select Import database.
Select the storage account and the container for the BACPAC file and then select the BACPAC file from which to import. In addition, specify the new database size (usually the same as origin) and provide the destination SQL Server credentials.
You can also try with PowerShell.
# Edition and ServiceObjectiveName must be a valid pair (e.g. Standard/S2 or Premium/P6).
$importRequest = New-AzSqlDatabaseImport `
    -ResourceGroupName "<your_resource_group>" `
    -ServerName "<your_server>" `
    -DatabaseName "<your_database>" `
    -DatabaseMaxSizeBytes "<database_size_in_bytes>" `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey $(Get-AzStorageAccountKey -ResourceGroupName "<your_resource_group>" -StorageAccountName "<your_storage_account>").Value[0] `
    -StorageUri "https://myStorageAccount.blob.core.windows.net/importsample/sample.bacpac" `
    -Edition "Standard" `
    -ServiceObjectiveName "S2" `
    -AdministratorLogin "<your_server_admin_account_user_id>" `
    -AdministratorLoginPassword $(ConvertTo-SecureString -String "<your_server_admin_account_password>" -AsPlainText -Force)
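The import also runs asynchronously, so you can check on it afterwards with the request object captured above:
# Poll the import operation using the request object returned by New-AzSqlDatabaseImport.
Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink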