I was using Logic Apps to refresh my reporting tables. Now I am trying to use an Elastic Database Job to do that instead, because the queries are long-running.
I understand the concepts of target members, jobs, etc. However, I have an issue with the credentials. In my case, I want to refresh my tables daily from views that I created in the same database.
When should I use job_credential and when refresh_credential? I think in my case I need to use refresh_credential, but I'm not sure. I wanted to test first, so I created the 'CreateTableTest' job that exists in the documentation.
When I query SELECT * FROM sys.symmetric_keys, I see I have a credential with a key_guid of 'xxx'. So I created this:
CREATE DATABASE SCOPED CREDENTIAL refresh_credential WITH IDENTITY = 'refresh_credential', SECRET = 'xxx';
Error: Failed to connect to the target database: Login failed for user 'refresh_credential'.
How can I solve it?
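For what it's worth, judging from Microsoft's Elastic Jobs tutorial, the SECRET of a database scoped credential should be the password of a real SQL login on the target server, not a key_guid from sys.symmetric_keys; that mismatch would explain the login failure. In that tutorial, refresh_credential is what the agent uses to connect to master and enumerate the databases in a target group, while job_credential is what it uses to log in to each target database and run the job's T-SQL, so a job that refreshes tables needs both. A rough sketch, with placeholder passwords:
-- On the target server, in master: logins whose names and passwords
-- will back the two database scoped credentials.
CREATE LOGIN refresh_credential WITH PASSWORD = '<password1>';
CREATE LOGIN job_credential WITH PASSWORD = '<password2>';
-- Still in master: a user for the refresh credential, so the agent
-- can enumerate the databases.
CREATE USER refresh_credential FROM LOGIN refresh_credential;
-- In each target database: a user for the job credential, with enough
-- rights to refresh the reporting tables.
CREATE USER job_credential FROM LOGIN job_credential;
ALTER ROLE db_datareader ADD MEMBER job_credential;
ALTER ROLE db_datawriter ADD MEMBER job_credential;
-- In the job database: the secrets are the login passwords above.
CREATE DATABASE SCOPED CREDENTIAL refresh_credential WITH IDENTITY = 'refresh_credential', SECRET = '<password1>';
CREATE DATABASE SCOPED CREDENTIAL job_credential WITH IDENTITY = 'job_credential', SECRET = '<password2>';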
I followed this guide https://www.mssqltips.com/sqlservertip/5242/adding-users-to-azure-sql-databases/ to create a new user for my back-end API with restricted permissions, for basic security reasons, but I can't make the back-end connect to the server. Every time it tries to connect I get
System.Data.SqlClient.SqlException: 'Login failed for user 'xxxx'.'
I am able to log in as this new user via SSMS by setting the target database in the login window options.
The back-end can connect just fine with the default connection string supplied by the Azure Portal, which uses the server admin login. Changing the username and password to the new user's, while keeping the Initial Catalog set to my desired database, does not work.
I would assume the back-end would be able to access it, since the Initial Catalog property of the connection string is set to the database the contained user was created in. But nothing is working.
This is the connection string used by my back-end:
Server=tcp:xxx.database.windows.net,1433;Initial Catalog=dbName;Persist Security Info=False;User ID=newUser;Password=newUserPw;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
I tried many guides, but none worked before I found this one, which seems very knowledgeable about creating Azure SQL users; even so, no luck so far.
These are the commands I used to create the user on the DB I need it to connect to (of course with my own values):
-- select your db in dropdown and create a contained user
CREATE USER [test]
WITH PASSWORD = 'SuperSecret!',
DEFAULT_SCHEMA = dbo;
-- add user to role(s) in db
ALTER ROLE db_datareader ADD MEMBER [test];
ALTER ROLE db_datawriter ADD MEMBER [test];
Does anyone know what's going on? I don't want to have to use my admin login on my back-end.
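One thing worth double-checking first (the user name below is a placeholder): that the user really was created as a contained database user rather than a login-based one, since only a contained user can authenticate directly against the database named in Initial Catalog. For example:
-- Run in the target database; authentication_type_desc should read
-- 'DATABASE' for a contained user.
SELECT name, type_desc, authentication_type_desc
FROM sys.database_principals
WHERE name = 'test';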
Turns out the straight answer is to set
Persist Security Info=True;
in the connection string.
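With that change, the connection string above becomes:
Server=tcp:xxx.database.windows.net,1433;Initial Catalog=dbName;Persist Security Info=True;User ID=newUser;Password=newUserPw;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;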
When a username and password are included in the connection string, security-sensitive information such as the password is not returned as part of the connection if the connection is open or has ever been in an open state, unless you set Persist Security Info to true.
Reference: Persist Security Info, SqlConnectionStringBuilder
I am trying to build a small Web API that uses an Azure SQL Database. The database functionality is exposed via stored procedures, and in order to call them I am using Entity Framework for .NET, version 6.4.4, as it gives me access to the .edmx data model. I have configured EF properly so that I can see the SPs, and I've added one of them to my model. In my controller, I execute this piece of code:
var dbContext = new MyDbContext();
var result = dbContext.MyStoredProc();
I get an exception when the MyStoredProc() method gets called: a 'Login failed for user' error.
For now, the connection string is configured to use SQL Authentication, so the user listed in the exception is the user configured in the connection string.
I can call the same SP with no problems if I use SQL Server Management Studio, in a session opened with the same user name and password as configured in the connection string.
Why is my application throwing that exception? Are there specific privileges that I must grant in order for the connection to the database to be opened from my Web API?
Any help would be appreciated.
Cheers,
Eddie
In projects that use Entity Framework, a 'Login failed for user' error usually means the connection string is wrong. In that case, check the connectionStrings section in App.config.
To help other forum users, it is recommended to refer to the following tutorial.
Entity Framework Tutorial
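For reference, a context generated from an .edmx model expects an EntityClient-style connection string rather than a plain SQL one. In App.config it looks roughly like this (the model name, server, and credentials below are placeholders):
<connectionStrings>
  <add name="MyDbContext"
       connectionString="metadata=res://*/MyModel.csdl|res://*/MyModel.ssdl|res://*/MyModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;Server=tcp:xxx.database.windows.net,1433;Initial Catalog=dbName;User ID=myUser;Password=myPw;Encrypt=True;&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>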
I am using the Node library to integrate my application with BigQuery. I plan to accept a projectId, email, and private key from the user, and then validate the credentials by making a call to the getDataset operation with a limit of 1. This should ensure that all 3 parameters passed by the user are proper.
But then I realized that even if I pass a different (valid) project ID, my call to getDataset still succeeds: the operation gets datasets from that project. So I am wondering whether the service account is not linked to the project. Any idea how I can validate these three parameters?
A service account key has some attributes inside it, including project_id, private_key, client_email and many others. This page shows how to configure the credentials to use the client libraries.
Basically, the first step is creating a service account and downloading a JSON key (I suppose you have already completed this step).
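The downloaded key itself is a JSON file that looks roughly like this (all values below are placeholders), so the project and the service account's email are baked into the key:
{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "my-sa@my-project.iam.gserviceaccount.com",
  "client_id": "..."
}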
Then you need to set an environment variable on your system so that your application can access the credentials.
For Linux/Mac you can do that by running:
export GOOGLE_APPLICATION_CREDENTIALS="[PATH]"
For Windows (using CMD):
set GOOGLE_APPLICATION_CREDENTIALS=[PATH]
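As for validating the three user-supplied parameters against each other: nothing stops a service account from being granted access to other projects, which is why getDataset can succeed with a different project ID. If the intent is that all three values come from the same key, one cheap check (a sketch under that assumption; names are illustrative) is to compare the supplied projectId with the project embedded in the service account email, then build the client from explicit credentials and make the limited call:
// Sketch using the @google-cloud/bigquery package; the client is built
// from the user-supplied values instead of GOOGLE_APPLICATION_CREDENTIALS.
const { BigQuery } = require('@google-cloud/bigquery');

async function validateCredentials(projectId, clientEmail, privateKey) {
  // A service account email embeds its home project:
  // <name>@<project_id>.iam.gserviceaccount.com
  const homeProject = clientEmail.split('@')[1].split('.')[0];
  if (homeProject !== projectId) {
    throw new Error("projectId does not match the service account's project");
  }
  const bigquery = new BigQuery({
    projectId,
    credentials: { client_email: clientEmail, private_key: privateKey },
  });
  // Verifies the email/key pair is genuine and can reach the project.
  await bigquery.getDatasets({ maxResults: 1 });
}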
I am trying to write a query that pulls from federated tables in BQ. In BQ I can run the query and get results. However, when I run the same query in Domo, I get the error: Domo is ready, but received a Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found. Please contact the data provider for support.
I have read all over the place that I need to change the scope to do this. I am not a developer though, so I am not sure exactly how to go about this in BQ.
Does anyone have step by step instructions for how to do this?
Thanks!
When you created the federated table, you granted an account access to the underlying file (e.g., in Google Drive).
So when you run the query in the BQ console, it uses your credentials.
When you run it from Domo, it may use a different account (probably some service account). So, to make everything work, you should grant that account proper access to your Drive file, essentially by sharing the document with it.
I needed to move an Azure project to a new account. My Azure project consists of a SQL database and a mobile service connected to that database.
I moved the database by backing it up into a .bacpac file and importing it into the new Azure account. I recreated the mobile service manually. Because the mobile service needed a different name, I changed the schema on the imported tables by running:
ALTER SCHEMA [NewSchema]
TRANSFER [OldSchema].[TableName]
Now, after the migration, SELECT works. In the Azure portal I can see that all the tables have rows, and the mobile service can read from the database. The problem, however, is with inserts and with the service's custom APIs.
Those still seem to operate with the old schema in mind:
Insert: Error: [Microsoft][SQL Server Native Client 10.0][SQL Server]Invalid object name '[OldSchema].[TableName]'. (SqlState: 42S02, Code: 208)
CustomAPI: Error: [Microsoft][SQL Server Native Client 10.0][SQL Server]Invalid object name '[OldSchema].[TableName]'.
There are no references to the old schema that I can find anywhere. Not in the codebase, not in the database.
I even went and executed
SELECT * FROM sys.database_principals
ALTER USER <userName> WITH DEFAULT_SCHEMA = [NewSchema]
for all users.
I am out of ideas. What am I missing? Where is it referencing the old schema?
Thank you for your help!
EDIT: I have just tried printing SELECT SCHEMA_NAME() within the custom API I am trying to execute. It printed the correct (new) schema name. When I printed SELECT CURRENT_USER, the correct user was printed, with the correct default schema. I am now more confused than ever.
Can you check your database triggers? There may be a trigger that runs on insert/update/delete that still has a reference to your old schema.
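A quick way to hunt for such leftovers is to search every module definition in the database, since that covers triggers as well as procedures, views, and functions ('OldSchema' below is a placeholder):
-- Lists every module whose definition still mentions the old schema.
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
       OBJECT_NAME(m.object_id) AS object_name,
       o.type_desc
FROM sys.sql_modules AS m
JOIN sys.objects AS o ON o.object_id = m.object_id
WHERE m.definition LIKE '%OldSchema%';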