Trigger from local SQL database to Azure SQL DB

I have a local and an Azure database.
When a row is inserted in the local database, I would like it to be inserted in the Azure one as well. I have created a trigger on the local database for that task:
USE [db_local]
GO
CREATE TRIGGER [dbo].[trigInverse] ON [dbo].[db_local]
AFTER INSERT
AS
BEGIN
    INSERT INTO [serverazure.DATABASE.WINDOWS.NET].[db_azure].[Access].[Companies] ([CompanyName])
    SELECT NAME
    FROM inserted;
END
However, when I try to insert a row, the error shown in picture1 appears.
I cannot tell from that error message what the parameter is, or how to set a correct value for it.
I did some tests to find the cause: I put the same trigger between two local databases and tried adding rows to the source database, and it works.
In the linked server node, these are the settings:

You can use Azure SQL Data Sync. Download and install the SQL Azure Data Sync Agent on your local server.
Then set up your Azure database as described here:
Getting Started with Azure SQL Data Sync
It will sync your databases every 5 minutes.
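If you would rather keep the trigger-based approach, the linked server itself can be defined roughly as below. This is a minimal sketch, not the asker's actual settings: the server, database, and credential names are placeholders, and Azure SQL Database only accepts SQL authentication for linked server logins.

-- Sketch: define a linked server pointing at an Azure SQL database.
-- All names and credentials below are placeholders, not values from the question.
EXEC sp_addlinkedserver
    @server     = N'serverazure.DATABASE.WINDOWS.NET',
    @srvproduct = N'',
    @provider   = N'SQLNCLI',   -- on newer installs, MSOLEDBSQL may be the available provider
    @datasrc    = N'serverazure.database.windows.net',
    @catalog    = N'db_azure';

-- Map local logins to a SQL authentication login on the Azure server.
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'serverazure.DATABASE.WINDOWS.NET',
    @useself     = 'FALSE',
    @locallogin  = NULL,
    @rmtuser     = N'azure_sql_user',
    @rmtpassword = N'********';

-- Allow remote procedure calls through the linked server.
EXEC sp_serveroption N'serverazure.DATABASE.WINDOWS.NET', 'rpc out', 'true';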

Related

Issues with Azure DevOps CI/CD Pipelines for an Azure SQL Always Encrypted database

I am setting up Azure DevOps CI/CD pipelines for an Azure SQL database that uses Always Encrypted.
Example: Table1 consists of 5 columns, of which Column1 and Column2 are encrypted.
The Always Encrypted setting is enabled in the connection string.
The dacpac file is created successfully without any issues, and I am able to view Table1.
The issue appears while inserting data into Table1 with transaction data.
Error message: Encryption scheme mismatch for columns/variables
The same code works fine if I execute the dacpac manually in SSMS.
The error appears if I execute the dacpac through SSDT or the CI/CD pipelines.
Please let me know your thoughts about this issue.
A CI/CD pipeline with a dacpac is usually complex when Always Encrypted is enabled. Please check whether the points below can narrow down the issue.
Usually the certificate for the Column Master Key is stored on the client machine, not on the SQL Server machine. If that is the case, you are not able to insert data into a table with an Always Encrypted column until you do the Column Master Key configuration.
(You may already know this, but just for your information: the mismatch error in SSMS can be solved this way.)
According to permissions-for-publishing-a-dac-package-if-always-encrypted:
To publish a DAC package if Always Encrypted is set up in the DACPAC or/and in the target database, you might need some or all of the below permissions, depending on the differences between the schema in the DACPAC and the target database schema:
ALTER ANY COLUMN MASTER KEY, ALTER ANY COLUMN ENCRYPTION KEY, VIEW ANY COLUMN MASTER KEY DEFINITION, VIEW ANY COLUMN ENCRYPTION KEY DEFINITION
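If the deployment identity lacks these, a grant along the following lines may help. This is a sketch; [deploy_user] is a placeholder for whatever database user your pipeline connects as.

-- Sketch: grant the Always Encrypted-related permissions to the deployment identity.
-- [deploy_user] is a placeholder, not a name from the question.
GRANT ALTER ANY COLUMN MASTER KEY TO [deploy_user];
GRANT ALTER ANY COLUMN ENCRYPTION KEY TO [deploy_user];
GRANT VIEW ANY COLUMN MASTER KEY DEFINITION TO [deploy_user];
GRANT VIEW ANY COLUMN ENCRYPTION KEY DEFINITION TO [deploy_user];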
Also note that Azure SQL is a PaaS service, which means it receives updates transparently and relatively often, with a new compatibility level. Try updating your SSDT version. Always Encrypted is supported in all editions of SQL Server Database V12.
Always Encrypted uses two types of cryptographic keys: column encryption keys (CEKs) and column master keys (CMKs). See developing databases using always encrypted.
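For reference, the two key objects are created with T-SQL along these lines. This is a minimal sketch: the key names, certificate path, and encrypted value are placeholders, and the ENCRYPTED_VALUE literal is normally generated by SSMS or PowerShell rather than written by hand.

-- Sketch: a column master key backed by a certificate in the Windows certificate store.
-- The key path (certificate thumbprint) is a placeholder.
CREATE COLUMN MASTER KEY [MyCMK]
WITH (
    KEY_STORE_PROVIDER_NAME = N'MSSQL_CERTIFICATE_STORE',
    KEY_PATH = N'CurrentUser/My/0123456789ABCDEF0123456789ABCDEF01234567'
);

-- Sketch: a column encryption key protected by that master key.
-- ENCRYPTED_VALUE below is a placeholder; tooling generates the real value.
CREATE COLUMN ENCRYPTION KEY [MyCEK]
WITH VALUES (
    COLUMN_MASTER_KEY = [MyCMK],
    ALGORITHM = 'RSA_OAEP',
    ENCRYPTED_VALUE = 0x016E000001630075
);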
Please make sure variable declaration and value assignment are performed on the same line.
Example:
DECLARE @OPERATION_ID int = 4
DECLARE @PARAMETER_NAME varchar(100) = 'xyz'
Try storing the value to be inserted in a variable or result set in the application, and then insert the data from that result set into SQL Server.
Also see
azure-encryption-server-side-client-side-azure-key-vault
create-and-store-column-master-keys-always-encrypted
ci-with-a-sql-always-encrypted-column

Has anyone been able to successfully execute SET RESULT_SET_CACHING on Azure SQL Warehouse?

I currently have an Azure SQL data warehouse and I'd like to enable result set caching so that intensive queries run faster, using the following code:
ALTER DATABASE [myDB]
SET RESULT_SET_CACHING ON;
However, no matter how I try to run this query I get the following error:
Msg 5058, Level 16, State 12, Line 3
Option 'RESULT_SET_CACHING' cannot be set in database 'myDB'.
I am running the query based on Azure's documentation here: https://learn.microsoft.com/en-us/sql/t-sql/statements/alter-database-transact-sql-set-options?view=azure-sqldw-latest
I have tried running this query both in the master database and in the underlying one called myDB. I have also tried using commands such as:
USE master
GO
to no avail. Has anyone had success enabling caching on Azure? Please let me know!
Screenshot of error and command below:
https://i.stack.imgur.com/mEJIy.png
I tested this, and the command works well in my ADW dwleon.
Please make sure:
Log in to your Azure SQL Data Warehouse with the SQL Server admin account.
Run this command in the master database.
Summary of the document:
To set the RESULT_SET_CACHING option, a user needs the server-level principal login (the one created by the provisioning process) or must be a member of the dbmanager database role.
Enable result set caching for a database:
--Run this command when connecting to the MASTER database
ALTER DATABASE [database_name]
SET RESULT_SET_CACHING ON;
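To confirm the setting took effect, a quick check like the one below should work. This is a sketch under the assumption that you query sys.databases from the same server; is_result_set_caching_on is the relevant column.

-- Sketch: confirm result set caching is now enabled for the database.
-- Run while connected to master; 'database_name' is a placeholder.
SELECT name, is_result_set_caching_on
FROM sys.databases
WHERE name = 'database_name';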
Hope this helps.

Writing to Azure Table Storage - different behaviour locally and in the cloud

I have a simple Azure Function that periodically writes some data into Azure Table Storage.
var storageAccount = new CloudStorageAccount(new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials("mystorage","xxxxx"),true);
var tableClient = storageAccount.CreateCloudTableClient();
myTable = tableClient.GetTableReference("myData");
TableOperation insertOperation = TableOperation.Insert(data);
myTable.ExecuteAsync(insertOperation);
The code runs well locally in Visual Studio, and all data is written correctly into the Azure-hosted Table Storage.
But if I deploy this code 1:1 to Azure as an Azure Function, the code also runs well without any exception, and logging shows that it runs through every line of code.
But no data is written to the Table Storage - same name, same credentials, same code.
Is Azure blocking this connection (Azure Function in Azure > Azure Table Storage) in some way, in contrast to local Azure Function > Azure Table Storage?
Is Azure blocking this connection (Azure Function in Azure > Azure Table Storage) in some way, in contrast to local Azure Function > Azure Table Storage?
No, it's not Azure blocking the connection or anything of that sort.
You have to await the table operation you are doing with ExecuteAsync, because otherwise control moves on in the program before that method has completed. Change your last line of code to
await myTable.ExecuteAsync(insertOperation);
Take a look at the compiler warning this produces: "Because this call is not awaited, execution of the current method continues before the call is completed."
The problem was the RowKey:
I used DateTime.Now for the RowKey (since auto-increment values are not provided by Table Storage).
My local format was "1.1.2019 18:19:20", while the server's format was "1/1/2019 ...".
And "/" is not allowed in the RowKey string (Table Storage disallows the characters /, \, #, and ? in PartitionKey and RowKey values).
Now, with the DateTime string formatted correctly, everything works fine.

Receiving CREATE USER error when attempting to import BACPAC from blob storage into SSMS

I am attempting to back up and restore a database located in Azure SQL Database via Azure Blob Storage. To do this I have run Export Data-Tier Application... on the selected database and successfully stored it in a blob container as a BACPAC file. Now I am trying to do the reverse with Import Data-Tier Application... to check that the backup process functions correctly; however, I receive the following error during the process:
Could not import package.
Error SQL72014: .Net SqlClient Data Provider: Msg 15063, Level 16, State 1, Line 1 The login already has an account under a different user name.
Error SQL72045: Script execution error. The executed script: CREATE USER [username] FOR LOGIN [username];
(Microsoft.SqlServer.Dac)
This results in the Importing database, Importing package schema and data into database, and Updating database operations failing, and in the creation of an empty database.
I'm unsure where the login or creating a user becomes relevant in importing the database. Do you know why this error is occurring?
Run this query on master. It will give you a list of logins that exist at the server level.
SELECT A.name as userName, B.name as login, B.Type_desc, default_database_name, B.*
FROM sys.sysusers A
FULL OUTER JOIN sys.sql_logins B
ON A.sid = B.sid
WHERE islogin = 1 and A.sid is not null
Run this on the database you want to export as a BACPAC for later import into your SQL Server instance:
SELECT DB_NAME(DB_ID()) as DatabaseName, * FROM sys.sysusers
You need to remove, in the database, the users whose logins you saw exist at the server level (in the master database). After that, try to export the database as a BACPAC and import it into your SQL Server instance.
If you don't want to remove those logins/users from your current Azure SQL database, copy it as a new Azure SQL database, remove the logins there, export it, and then drop the copied database when finished. A sketch of the cleanup is below.
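A minimal sketch of that cleanup, assuming a database user named [appUser] that is mapped to a server-level login; [appUser] is a placeholder, not a name from the question.

-- Sketch: run inside the (copied) database you plan to export.
-- Drops the user tied to a server-level login so the BACPAC imports cleanly.
DROP USER [appUser];

-- After importing the BACPAC on the target server, the user can be recreated
-- there and mapped to the matching login.
CREATE USER [appUser] FOR LOGIN [appUser];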
If you want to restore the bacpac in Azure, use the Import option on the portal instead of SSMS.
Download the latest SSMS for the best user experience.

Can't access database in Azure Portal - "This resource was not found"

I have an SQL database in Azure that I use with my API. I can access the database from SQL Server Management Studio, and it seems to be alright - I can select data, make modifications, and whatever. However, I can't access it from the Azure Portal.
It doesn't appear when I list all resources in the Azure Portal.
When I select the database server in the Azure Portal, I can see it in the list of available databases. When I click the specific database, I get the following error message:
"This resource was not found, it may have been deleted. /subscriptions/aee8966e-5891-40fb-8fff-8c359f43baee/resourceGroups/TestResourceGroup/providers/Microsoft.Sql/servers/testdbserver/databases/testdb"
Any ideas?
Update 2018-05-16:
Jerry Liu suggested renaming the database, so I logged into SQL Server Management Studio, and this is what I found:
I wasn't allowed to rename any of the databases on the server using right click --> Rename. The option is disabled.
ALTER DATABASE carbonate_prod_180508 Modify Name = carbonate_renamed;
Running this command, I get the error message "The source database 'carbonate_prod_180508' does not exist.", but I do see the database listed in the Object Explorer.
I also tried renaming it using the Azure CLI. I can find the database using "az sql db list" with the resource group name and server name, but when I run "az sql db rename" I get an error message stating the database was not found.
If you have done some operation like delete-and-recreate with the same name, or just moved the database to another resource group, please wait a while for the operation to complete.
Or, if you have done nothing related, please try to rename your database in Management Studio and visit the portal to check whether it's "back".
How to rename a SQL DB
Two methods for you to try:
1. Click on the database name twice slowly, or click once and press F2, just like renaming a file locally on Windows.
2. Create a query on the master database to execute the alter operation, as sketched below.
Wait for about 30 seconds and refresh the portal.
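A minimal sketch of the second method, run while connected to master; both database names are placeholders.

-- Sketch: rename an Azure SQL database from a query window connected to master.
ALTER DATABASE [old_database_name]
MODIFY NAME = [new_database_name];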
If none of them works, I recommend you open a new support request to the Azure team. It's the last blade in your Azure SQL Server panel.