I am trying to come up with a solution/tool that can cycle through all our Azure DB servers and generate a report of users/permissions for each DB. Anyone have any ideas?
You can use the SMO library to loop through all Azure SQL databases on a logical Azure SQL server, as explained here. You can then run the following query on each database:
SELECT DISTINCT pr.principal_id, pr.name, pr.type_desc,
pr.authentication_type_desc, pe.state_desc, pe.permission_name
FROM sys.database_principals AS pr
JOIN sys.database_permissions AS pe
ON pe.grantee_principal_id = pr.principal_id;
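If you would rather wire this up in PowerShell than in SMO directly, here is a minimal sketch. It assumes the Az.Sql and SqlServer modules, an existing login via Connect-AzAccount, and that a single SQL admin credential (prompted below) works on every server; the output file name is a placeholder:
# Sketch: enumerate every logical server in the subscription, every database on each server,
# run the permissions query, and export one combined CSV report.
$permissionQuery = @"
SELECT DISTINCT pr.principal_id, pr.name, pr.type_desc,
       pr.authentication_type_desc, pe.state_desc, pe.permission_name
FROM sys.database_principals AS pr
JOIN sys.database_permissions AS pe
    ON pe.grantee_principal_id = pr.principal_id;
"@
$cred = Get-Credential    # SQL admin (or auditing) login assumed to exist on every server
$report = foreach ($srv in Get-AzSqlServer) {
    foreach ($db in Get-AzSqlDatabase -ResourceGroupName $srv.ResourceGroupName -ServerName $srv.ServerName |
            Where-Object { $_.DatabaseName -ne "master" }) {
        Invoke-Sqlcmd -ServerInstance $srv.FullyQualifiedDomainName -Database $db.DatabaseName `
                      -Credential $cred -Query $permissionQuery |
            Select-Object @{ n = "Server"; e = { $srv.ServerName } },
                          @{ n = "Database"; e = { $db.DatabaseName } }, *
    }
}
$report | Export-Csv -Path "permissions-report.csv" -NoTypeInformation
Note that the machine running this needs to be allowed through each server's firewall, and Invoke-Sqlcmd is just a stand-in here; the SMO approach from the linked article works the same way if you prefer to execute the query through an SMO Database object.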
I'm trying to configure an Azure pipeline where I copy a production database to create a "pre-prod" environment.
After creating that database I need to run some queries in the freshly created database. The problem is that the database is not available right away. As the process is automatic, I need to know how long I have to wait.
I put in a wait step of 5 minutes, but sometimes that is not enough.
Thanks in advance
How about using a simple check of DB availability through the Az module or the CLI?
do {
    Start-Sleep -Seconds 120
    $status = "Offline"
    try {
        $status = (Get-AzSqlDatabase -ResourceGroupName "resourcegroup01" -ServerName "server01" -DatabaseName "MyDataBase").Status
    }
    catch {
        "Database not available yet"
    }
} while ($status -ne "Online")
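If you prefer the CLI route mentioned above, an equivalent sketch (assuming az login has already run and the same placeholder resource group, server, and database names) polls az sql db show until the status comes back as Online:
# Sketch of the same polling loop using the Azure CLI instead of the Az module.
do {
    Start-Sleep -Seconds 120
    $status = az sql db show --resource-group "resourcegroup01" --server "server01" `
                             --name "MyDataBase" --query "status" --output tsv 2>$null
} while ($status -ne "Online")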
You can also try the Azure portal's query editor to run a query against your Azure SQL Database and check DB availability.
The query editor is a tool in the Azure portal for running SQL queries against your database in Azure SQL Database or data warehouse in Azure Synapse Analytics.
Note: The query editor uses ports 443 and 1443 to communicate. Ensure you have enabled outbound HTTPS traffic on these ports. You also need to add your outbound IP address to the server's allowed firewall rules to access your databases and data warehouses.
For more query editor considerations, please refer to this.
I currently have an Azure SQL data warehouse and I'd like to enable result set caching, so that intensive queries run faster, with the following code:
ALTER DATABASE [myDB]
SET RESULT_SET_CACHING ON;
However, no matter how I try to run this query I get the following error:
Msg 5058, Level 16, State 12, Line 3
Option 'RESULT_SET_CACHING' cannot be set in database 'myDB'.
I am running the query based on Azure's documentation here: https://learn.microsoft.com/en-us/sql/t-sql/statements/alter-database-transact-sql-set-options?view=azure-sqldw-latest
I have tried running this query both in the master database and in the underlying one called myDB. I have also tried using commands such as:
USE master
GO
To no avail. Has anyone had success in enabling caching on Azure? Please let me know!
Screenshot of error and command below:
https://i.stack.imgur.com/mEJIy.png
I tested this command and it works well in my ADW dwleon; see the screenshot below:
Please make sure:
Log in to your Azure SQL data warehouse with the SQL Server admin account.
Run this command in the master DB.
Summary of the document:
To set the RESULT_SET_CACHING option, a user needs the server-level principal login (the one created by the provisioning process) or must be a member of the dbmanager database role.
Enable result set caching for a database:
--Run this command when connecting to the MASTER database
ALTER DATABASE [database_name]
SET RESULT_SET_CACHING ON;
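A quick way to confirm the option actually took effect is to read it back from sys.databases while connected to master. A minimal sketch, assuming the SqlServer module's Invoke-Sqlcmd and placeholder server, database, and credential values:
# Sketch: verify result set caching is on by querying sys.databases from master.
Invoke-Sqlcmd -ServerInstance "yourserver.database.windows.net" -Database "master" `
              -Credential (Get-Credential) `
              -Query "SELECT name, is_result_set_caching_on FROM sys.databases WHERE name = 'database_name';"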
Hope this helps.
I am getting the error "No default service level objective found of edition 'GeneralPurpose'" in SSMS when creating a database in Azure SQL.
Please download the latest SQL Server Management Studio version from here. Version 18.0 has many fixes related to Azure Managed Instances.
It is a limitation of the free subscription you are using at this time: "Free Trial subscriptions can provision Basic, Standard S0 through S3 databases, up to 100 eDTU Basic or Standard elastic pools and DW100 through DW400 data warehouses."
You can also try to create the database using T-SQL as shown below.
CREATE DATABASE Testdb
( EDITION = 'Standard', SERVICE_OBJECTIVE = 'S3' );
GO
In my case it was because I had the wrong connection string in my app settings (.NET).
To find your connection string, go to your DB on Azure and, in the Overview, find "Connection strings".
A user is getting the below error while running a bulk insert command in Azure SQL Database. I am using Azure SQL Database and not SQL Server. Most of the commands related to granting bulk insert permission do not work in Azure SQL Database.
Error
You do not have permission to use the bulk load statement.
Commands tried in Azure SQL Database to add the user
EXEC sp_addrolemember 'db_ddladmin', 'testuser';
ALTER SERVER ROLE [bulkadmin] ADD MEMBER testuser
GRANT ADMINISTER BULK OPERATIONS TO testuser
Error
Msg 40520, Level 16, State 1, Line 5
Securable class 'server' not supported in this version of SQL Server.
Your help is highly appreciated.
In Azure SQL Database, grant ADMINISTER DATABASE BULK OPERATIONS to the principal in the context of the desired database:
GRANT ADMINISTER DATABASE BULK OPERATIONS TO testuser;
The user will also need INSERT permissions on the target table. These Azure SQL Database permissions are detailed in the BULK INSERT documentation under the permissions section.
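Putting both pieces together, here is a minimal sketch (Invoke-Sqlcmd from the SqlServer module; the server, database, table, and user names are placeholders) that runs the grants in the target database itself rather than in master:
# Sketch: grant both permissions needed for BULK INSERT in the user database, not master.
$grants = @"
GRANT ADMINISTER DATABASE BULK OPERATIONS TO testuser;
GRANT INSERT ON OBJECT::dbo.TargetTable TO testuser;
"@
Invoke-Sqlcmd -ServerInstance "server01.database.windows.net" -Database "MyDataBase" `
              -Credential (Get-Credential) -Query $grants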
On Azure it works on tables in the database in question only. It does not work on temp tables. So if you are bulk loading in parallel and want to use temp tables, you are in a corner.
GRANT CONTROL TO testuser;
Nothing else is needed; just execute this in the content DB (not master).
Full steps:
In master:
CREATE LOGIN login1 WITH password='1231!#ASDF!a';
In the content DB:
CREATE USER user1 FROM LOGIN login1;
GRANT CONTROL TO user1; -- this is for bulk insert to work
I am trying to insert data into Cosmos DB, but while trying to verify the connection string I am getting the error shown in the figure.
After my test, I reproduced your issue when I tried to configure the AccountName to be my Cosmos DB Table API account.
Based on this official statement, the UI-based Data Migration tool (dtui.exe) is not currently supported for Table API accounts.
You could try to use the command-line Azure Cosmos DB Data Migration tool (dt.exe).
When I connected to the Cosmos DB SQL API, everything was fine.
Hope it helps you.
You should also pay attention to the domain in your "AccountEndpoint" string. The Azure portal gives the "AccountEndpoint" in the "Connection String" section of the "Azure Cosmos DB account" service, and it looks like https://$YOUR_ACCOUNTNAME.table.cosmos.azure.com:443/. If you try to work with this endpoint in dt.exe, you'll get the described error. If you replace "table.cosmos.azure.com:443/" with "documents.azure.com:443/", you'll pass the verification successfully. This trick with domains and a non-informative error wasted an hour of my only life.