Windows Azure SQL dynamic scale did not work - azure

I'm trying to scale my Azure SQL DB with PHP. All the other SQL statements work fine, but when I send
ALTER DATABASE db1_abcd_efgh MODIFY (EDITION = 'Web', MAXSIZE=5GB);
I get this error:
User must be in the master database.
My server URL is
xaz25jze9d.database.windows.net
and the database is named
db1_abcd_efgh
function skale_a_m() {
    $host = "tcp:xaz25jze9d.database.windows.net,1433\sqlexpress";
    $user = "db_user";
    $pwd = "xxxxx?!";
    $db = "master"; //I have tried out db1_abcd_efgh at this point
    try {
        $conn = new PDO("sqlsrv:Server= $host ; Database = $db ", $user, $pwd);
        $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    } catch (Exception $e) {
    }
    $string = 'use master; ALTER DATABASE db1_a_m MODIFY (EDITION =\'Web\', MAXSIZE=5GB)';
    $stmt = $conn->query($string);
}
Now I have modified my function like this:
function skale_a_m() {
    $serverName = "tcp:yq6ipq11b4.database.windows.net,1433";
    $userName = 'db_user@yq6ipq11b4';
    $userPassword = 'xxxxx?!';
    $connectionInfo = array("UID" => $userName, "PWD" => $userPassword, "MultipleActiveResultSets" => true);
    $conn = sqlsrv_connect($serverName, $connectionInfo);
    if ($conn === false) {
        echo "Failed to connect...";
    }
    $string = "ALTER DATABASE master MODIFY (EDITION ='Web', MAXSIZE=5GB)";
    $stmt = sqlsrv_query($conn, $string);
}
Now I get no errors, but the DB did not scale?

According to ALTER DATABASE (Windows Azure SQL Database), the ALTER DATABASE statement has to be issued when connected to the master database.
With PDO, this can be achieved by a connection string such as:
"sqlsrv:server=tcp:{$server}.database.windows.net,1433; Database=master"
Sample code:
<?php
function scale_database($server, $username, $password, $database, $maxsize) {
    try {
        $conn = new PDO("sqlsrv:server=tcp:{$server}.database.windows.net,1433; Database=master", $username, $password);
        $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        $conn->setAttribute(constant('PDO::SQLSRV_ATTR_DIRECT_QUERY'), true);
        $conn->exec("ALTER DATABASE {$database} MODIFY (MAXSIZE={$maxsize}GB)");
        $conn = null;
    }
    catch (Exception $e) {
        die(print_r($e));
    }
}
scale_database("yourserver", "youruser", "yourpassword", "yourdatabase", "5");
?>
Note: It's not necessary to set the edition; it will be set according to the max size.
To test the sample code, configure it with your details (server name, login, password and database to be scaled) and execute it with PHP configured with the Microsoft Drivers 3.0 for PHP for SQL Server.
After that, refresh (Ctrl+F5) the Windows Azure Management Portal and you should see the new max size reflected on the Scale tab of the database.
You can also verify that it worked by using a tool to connect to the scaled database (not to the master database) and issuing this command:
SELECT CONVERT(BIGINT,DATABASEPROPERTYEX ('yourdatabase', 'MAXSIZEINBYTES'))/1024/1024/1024 AS 'MAXSIZE IN GB'

$string = 'use master; ALTER DATABASE db1_a_m MODIFY (EDITION =\'Web\', MAXSIZE=5GB)'
I'm pretty sure SQL Azure does not support switching databases with the USE command.
Try connecting directly to the master database in your connection string, and remove the USE master statement from the start of your query (a combined sketch follows at the end of this answer).
$host = "tcp:xaz25jze9d.database.windows.net,1433\sqlexpress";
That also looks wrong to me. You shouldn't have a named instance (\sqlexpress) at the end of your server address for Azure SQL, as far as I know.
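Putting both points together, a minimal sketch (reusing the server name and placeholder credentials from the question, so not a definitive fix) might look like this:
<?php
// No instance name in the host, connect straight to master, no USE statement.
$host = "tcp:xaz25jze9d.database.windows.net,1433";
$user = "db_user";
$pwd  = "xxxxx?!";

$conn = new PDO("sqlsrv:Server=$host;Database=master", $user, $pwd);
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// As in the earlier answer, direct query mode lets ALTER DATABASE run as-is.
$conn->setAttribute(constant('PDO::SQLSRV_ATTR_DIRECT_QUERY'), true);

// Alter the user database while connected to master.
$conn->exec("ALTER DATABASE db1_abcd_efgh MODIFY (EDITION = 'Web', MAXSIZE=5GB)");
?>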

Related

Login error for admin SQL Server during terraform plan

I'm building an Azure infrastructure with Terraform. I need to create a specific DB user for each database on the server. To create the users I use the provider "betr-io/mssql" with the following script:
resource "mssql_login" "sql_login" {
server {
host = "${var.sql_server_name}.database.windows.net"
# host = azurerm_mssql_server.sqlserver.fully_qualified_domain_name
login {
username = var.sql_admin_user
password = var.sql_admin_psw
}
}
login_name = var.sql_dbuser_username
password = var.sql_dbuser_password
depends_on = [azurerm_mssql_server.sqlserver, azurerm_mssql_database.sqldb]
}
resource "mssql_user" "sql_user" {
server {
host = "${var.sql_server_name}.database.windows.net"
# host = azurerm_mssql_server.sqlserver.fully_qualified_domain_name
login {
username = var.sql_admin_user
password = var.sql_admin_psw
}
}
username = var.sql_dbuser_username
password = var.sql_dbuser_password
database = var.sql_db_name
roles = var.sql_dbuser_roles
depends_on = [azurerm_mssql_server.sqlserver, azurerm_mssql_database.sqldb, mssql_login.sql_login]
}
What terraform plan gives me is this error:
Error: unable to read user [sqldb-dev].[dbuser]: login error: mssql: Login failed for user 'usr-admin'.
with mssql_user.sql_user,
on main.tf line 346, in resource "mssql_user" "sql_user":
346: resource "mssql_user" "sql_user" {
I can't understand where the problem might come from; has anyone had a similar experience?
For completeness of information, the databases are hosted in an elastic pool instance.
The only solution I have found is to destroy the users and recreate them along with the databases.
Unfortunately I haven't found a way to add the DevOps agents to the SQL server whitelist.
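One workaround sometimes used (not something confirmed in this thread) is a server-level firewall rule that allows Azure services, which also covers hosted DevOps agents running inside Azure. A hedged Terraform sketch, assuming the azurerm provider and the server resource name used above:
# Hypothetical sketch: 0.0.0.0 as both start and end IP corresponds to the portal's
# "Allow Azure services and resources to access this server" setting.
resource "azurerm_mssql_firewall_rule" "allow_azure_services" {
  name             = "AllowAzureServices"
  server_id        = azurerm_mssql_server.sqlserver.id
  start_ip_address = "0.0.0.0"
  end_ip_address   = "0.0.0.0"
}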

How do I list databases on SQL Azure server using SMO when not all databases have the user used to connect to the server?

SMO's Server.Databases will throw an exception if one database on the server does not contain the SQL user I used to create the SMO server connection.
The same error does not occur when using a local SQL 2017 instance.
For example:
I have a SQL Azure server named 'myazuresqlserver.database.windows.net'
Create a login on master:
CREATE LOGIN testlogin WITH PASSWORD = 'ThisIsMySecretPassword'
Create 2 databases, DB1 and DB2.
Create the User 'testuser' in DB1 but NOT DB2:
CREATE USER testuser FOR LOGIN testlogin
Then, using C# and SMO I cannot get a list of databases for that server using the credentials for the testlogin login - an exception occurs.
static void Main(string[] args)
{
    string[] names = GetDatabaseNames("myazuresqlserver.database.windows.net");
}

public static string[] GetDatabaseNames(string serverName)
{
    ServerConnection connection = new ServerConnection(serverName, "testlogin", "ThisIsMySecretPassword");
    var server = new Server(connection);
    return (from Database database in server.Databases
            where !database.IsSystemObject && !database.IsDatabaseSnapshot
            select database.Name
           ).ToArray();
}
The exception is:
SqlException: The server principal "testlogin" is not able to access the database "DB2" under the current security context.
Cannot open database "DB2" requested by the login. The login failed. Login failed for user 'testlogin'.
I would have expected no exception to be thrown, and instead Server.Databases to contain only the databases for which the provided credentials were valid - this seems to be the behaviour with a local SQL 2017 instance.
I came across the same issue. It's as if attempting to look at Server.Databases at all causes enumeration to occur, and in doing so database properties are being looked at - properties that the user may not have permission to interrogate. An exception is thrown immediately.
I ended up using a different technique to get a list of databases. I simply fired a SQL query:
SELECT name FROM sys.databases WHERE name NOT IN ('master', 'tempdb', 'model', 'msdb');
For the above query to work, the user must have permission to access the master database. Here is my code block:
Dim Server As Server = New Server(New ServerConnection("myserver.server.net", "testlogin", "password"))
Dim databaseList As New DataTable
databaseList.Columns.Add("Name")

If Server.Edition = "SQL Azure" Then
    Dim SQLReader As SqlClient.SqlDataReader = Nothing
    Try
        SQLReader = Server.ConnectionContext.ExecuteReader("SELECT name FROM sys.databases WHERE name NOT IN ('master', 'tempdb', 'model', 'msdb');")
        Do While SQLReader.Read = True
            databaseList.Rows.Add(Manager.Database.Sanitise(SQLReader, "name").ToString)
        Loop
        SQLReader.Close()
    Catch ex As Exception
        MsgBox(ex.Message)
    Finally
        If Not SQLReader Is Nothing Then
            SQLReader.Close()
        End If
    End Try
End If
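Since the question was in C#, here is a minimal sketch of the same workaround in C# (it assumes the testlogin credentials from the question and that the login is permitted to query sys.databases in master):
using System.Collections.Generic;
using Microsoft.SqlServer.Management.Common;

public static string[] GetDatabaseNamesViaQuery(string serverName)
{
    var connection = new ServerConnection(serverName, "testlogin", "ThisIsMySecretPassword");
    var names = new List<string>();

    // Query sys.databases directly instead of enumerating Server.Databases,
    // which touches per-database properties the login may not have access to.
    using (var reader = connection.ExecuteReader(
        "SELECT name FROM sys.databases WHERE name NOT IN ('master', 'tempdb', 'model', 'msdb');"))
    {
        while (reader.Read())
        {
            names.Add(reader.GetString(0));
        }
    }

    connection.Disconnect();
    return names.ToArray();
}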

connect to local AD domain controller

I've set up a development server with AD and I'm trying to figure out how to connect to it via .NET. I'm working on the same machine that AD is installed on. I've gotten the DC name from AD, and the name of the machine, but the connection just does not work. I'm using the same credentials I used to connect to the server.
Any suggestions?
DirectoryEntry directoryEntry = new DirectoryEntry("LDAP://[dc.computername.com]", "administrator", "[adminpwd]");
Can you connect to the RootDSE container?
DirectoryEntry rootDSE = new DirectoryEntry("LDAP://RootDSE", "administrator", "[adminpwd]");
If that works, you can then read out some of the properties stored in that root container
if (rootDSE != null)
{
    Console.WriteLine("RootDSE Properties:\n\n");

    foreach (string propName in rootDSE.Properties.PropertyNames)
    {
        Console.WriteLine("{0,-20}: {1}", propName, rootDSE.Properties[propName][0]);
    }
}
This will show you some information about what LDAP paths are present in your installation.
Try something like this, using the System.DirectoryServices.Protocols namespace:
// Define your connection
LdapConnection ldapConnection = new LdapConnection("123.456.789.10:389");
try
{
    // Authenticate the username and password
    using (ldapConnection)
    {
        // Pass in the network creds, and the domain.
        var networkCredential = new NetworkCredential(Username, Password, Domain);
        // Since we're using unsecured port 389, set to false. If using port 636 over SSL, set this to true.
        ldapConnection.SessionOptions.SecureSocketLayer = false;
        ldapConnection.SessionOptions.VerifyServerCertificate += delegate { return true; };
        // To force NTLM\Kerberos use AuthType.Negotiate; for non-TLS and unsecured, use AuthType.Basic
        ldapConnection.AuthType = AuthType.Basic;
        ldapConnection.Bind(networkCredential);
    }
}
catch (LdapException ldapException)
{
    // Authentication failed; the exception will dictate why
}

Multiple databases(datacontext) on same server without MS DTC

I'm using EF 5.0 with SQL Server 2008. I have two databases on the same server instance. I need to update tables in both databases and want the updates to be in the same transaction, so I used TransactionScope. Below is the code:
public void Save()
{
    var MSObjectContext = ((IObjectContextAdapter)MSDataContext).ObjectContext;
    var AWObjectContext = ((IObjectContextAdapter)AwContext).ObjectContext;

    using (var scope = new TransactionScope(TransactionScopeOption.Required,
        new TransactionOptions
        {
            IsolationLevel = IsolationLevel.ReadUncommitted
        }))
    {
        MSObjectContext.SaveChanges(SaveOptions.DetectChangesBeforeSave);
        AWObjectContext.SaveChanges(SaveOptions.DetectChangesBeforeSave);
        scope.Complete();
    }
}
When I use the above code, the transaction gets promoted to DTC. After searching on the internet I found that this happens because of two different connection strings / connections. But what I don't understand is: if I write a stored procedure on one database which updates a table in a different database (on the same server), no DTC is required. Then why do EF or TransactionScope promote this to DTC? Is there any other workaround for this?
Please advise
Thanks in advance
Sai
With plain DbConnections, you can prevent DTC escalation for multiple databases on the same server by using the same connection string (with any database you like) and manually changing the database on the opened connection object, like so:
using (var tx = new TransactionScope())
{
    using (var conn = new SqlConnection(connectStr))
    {
        conn.Open();
        new SqlCommand("INSERT INTO atest VALUES (1)", conn).ExecuteNonQuery();
    }

    using (var conn = new SqlConnection(connectStr))
    {
        conn.Open();
        conn.ChangeDatabase("OtherDB");
        new SqlCommand("INSERT INTO btest VALUES (2)", conn).ExecuteNonQuery();
    }

    tx.Complete();
}
This will not escalate to DTC, but it would, if you used different values for connectStr.
I'm not familiar with EF and how it manages connections and contexts, but using the above insight, you might be able to avoid DTC escalation by doing a conn.ChangeDatabase(..) and then creating your context like new DbContext(conn, ...).
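As a rough, untested sketch of what that could look like with EF (it assumes MSDataContext and AwContext expose the DbContext(DbConnection existingConnection, bool contextOwnsConnection) constructor, which is not shown in the question):
using (var tx = new TransactionScope())
{
    using (var conn = new SqlConnection(connectStr))
    {
        conn.Open();

        // First context works against the database named in the connection string.
        using (var msContext = new MSDataContext(conn, false))
        {
            // ... make changes ...
            msContext.SaveChanges();
        }

        // Reuse the same open connection for the second database.
        conn.ChangeDatabase("OtherDB");

        using (var awContext = new AwContext(conn, false))
        {
            // ... make changes ...
            awContext.SaveChanges();
        }
    }
    tx.Complete();
}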
But please note that even with a shared connect string, as soon as you have more than one connection open at the same time, the DTC will get involved, like in this modified example:
using (var tx = new TransactionScope())
{
    using (var conn = new SqlConnection(mssqldb))
    {
        conn.Open();
        new SqlCommand("INSERT INTO atest VALUES (1)", conn).ExecuteNonQuery();

        using (var conn2 = new SqlConnection(mssqldb))
        {
            conn2.Open();
            conn2.ChangeDatabase("otherdatabase");
            new SqlCommand("INSERT INTO btest VALUES (2)", conn2).ExecuteNonQuery();
        }
    }
    tx.Complete();
}

Impersonate google user with a service account

I'm using google-api-php-client 0.6.1 and I'd like to know: is there a way to impersonate a concrete user with a service account? My application needs to store some files in its Google Drive, so I decided to use a service account with .p12 key authentication. It works great, but all files are stored under the service account, so I can't manage them. I'd like the documents to be stored in a certain account (the one that was used to create the API project and the service account itself). I was trying to use this code:
$KEY_FILE = <p12 key file path>;
$key = file_get_contents($KEY_FILE);

$auth = new Google_AssertionCredentials(
    $SERVICE_ACCOUNT_NAME,
    array('https://www.googleapis.com/auth/drive'),
    $key);
$auth->prn = '<certainuser@gmail.com>';

$client = new Google_Client();
$client->setUseObjects(true);
$client->setAssertionCredentials($auth);

return new Google_DriveService($client);
but I got "Error refreshing the OAuth2 token, message: '{ "error" : "access_denied" }'"
Don't use $auth->prn; use $auth->sub. This works for me:
// Create a new google client. We need this for all API access.
$client = new Google_Client();
$client->setApplicationName("Google Group Test");

$client_id = '...';
$service_account_name = '...';
$key_file_location = '...';

if (isset($_SESSION['service_token'])) {
    $client->setAccessToken($_SESSION['service_token']);
}

$key = file_get_contents($key_file_location);

// https://www.googleapis.com/auth/admin.directory.group,
// https://www.googleapis.com/auth/admin.directory.group.readonly,
// https://www.googleapis.com/auth/admin.directory.group.member,
// https://www.googleapis.com/auth/admin.directory.group.member.readonly,
// https://www.googleapis.com/auth/apps.groups.settings,
// https://www.googleapis.com/auth/books
$cred = new Google_Auth_AssertionCredentials(
    $service_account_name,
    array(
        Google_Service_Groupssettings::APPS_GROUPS_SETTINGS,
        Google_Service_Directory::ADMIN_DIRECTORY_GROUP,
        Google_Service_Directory::ADMIN_DIRECTORY_GROUP_READONLY,
        Google_Service_Directory::ADMIN_DIRECTORY_GROUP_MEMBER,
        Google_Service_Directory::ADMIN_DIRECTORY_GROUP_MEMBER_READONLY,
        Google_Service_Books::BOOKS,
    ),
    $key,
    'notasecret'
);

//
// Very important step: the service account must also declare the
// identity (via email address) of a user with admin privileges that
// it would like to masquerade as.
//
// See: http://stackoverflow.com/questions/22772725/trouble-making-authenticated-calls-to-google-api-via-oauth
//
$cred->sub = '...';

$client->setAssertionCredentials($cred);
if ($client->getAuth()->isAccessTokenExpired()) {
    $client->getAuth()->refreshTokenWithAssertion($cred);
}
$_SESSION['service_token'] = $client->getAccessToken();
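Applied to the original Drive scenario, the same idea might look roughly like this (a sketch against the newer client library; the key path, service account name, and impersonated address are placeholders, not values from the question):
<?php
// Hypothetical values; replace with your own service account details.
$service_account_name = '1234567890@developer.gserviceaccount.com';
$key = file_get_contents('/path/to/key.p12');

$cred = new Google_Auth_AssertionCredentials(
    $service_account_name,
    array('https://www.googleapis.com/auth/drive'),
    $key
);

// Impersonate the real user so files end up in that account's Drive.
$cred->sub = 'certainuser@gmail.com';

$client = new Google_Client();
$client->setAssertionCredentials($cred);

$drive = new Google_Service_Drive($client);
// ... use $drive->files to create files owned by the impersonated user.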
