I am trying to connect to Snowflake using the Python connector, but I am running into a problem when connecting with a specific role. Even though I have access to that role in the Snowflake web application, I am not able to connect with that same role using the Python connector; I can only connect with the PUBLIC role.
I am using the following script:
import snowflake.connector

conn = snowflake.connector.connect(
    user=USER,
    password=PASSWORD,
    role=ROLE,
    account=ACCOUNT,
    warehouse=WAREHOUSE,
    database=DATABASE,
    schema=SCHEMA,
    autocommit=False
)
I am getting the following error:
DatabaseError: 250001 (08001): Failed to connect to DB:
account_name.east-us-2.azure.snowflakecomputing.com:443. Role
'ANALYST_ROLE' specified in the connect string does not exist or not
authorized. Contact your local system administrator, or attempt to
login with another role, e.g. PUBLIC.
Here are some hints:
In Snowflake, role names are case-sensitive identifiers. You may have a quoting issue in your original code: check your single quotes, double quotes, and upper-/lowercase spelling, or simply use the original role name without any quotation marks.
Maybe the role does not have enough privileges to operate on your database/schemas, which would also produce "not authorized".
You can also try granting the role to your user manually first: Account -> Roles
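To illustrate the casing rules, here is a small sketch (the role names below are placeholders): an unquoted identifier is stored in uppercase and matched case-insensitively, while a quoted identifier preserves its exact case and must be spelled exactly that way later.

-- Unquoted: stored as ANALYST_ROLE; analyst_role and ANALYST_ROLE both resolve to it
CREATE ROLE analyst_role;

-- Quoted: stored exactly as "Analyst_Role" and must be given with that exact spelling
CREATE ROLE "Analyst_Role";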
I caught the error: I had made a very slight mistake (a single-character typo) in my role name. Because the name is very long, I could not see it. I am able to connect now.
The role name mentioned here is fake for security reasons, so don't read anything into it.
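For anyone debugging a similar mismatch, one way to catch such a typo is to connect without forcing a role and list the roles the user can actually assume. A minimal sketch, assuming the snowflake-connector-python package and the same USER/PASSWORD/ACCOUNT variables as above:

import snowflake.connector

# Connect without specifying a role; the user's default role is used
conn = snowflake.connector.connect(user=USER, password=PASSWORD, account=ACCOUNT)
cur = conn.cursor()

# CURRENT_AVAILABLE_ROLES() returns a JSON array of the roles this user may assume
cur.execute("SELECT CURRENT_AVAILABLE_ROLES()")
print(cur.fetchone()[0])  # compare character by character with the role you pass
conn.close()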
I'm pretty new to Azure (and SQL for that matter). I've been trying to configure the Elastic Jobs agent with a few specific jobs that would run queries against some of my databases on the server.
Right now I am targeting a test database where I want to execute a simple SELECT query. However, I can't create the job step because of a "can't reference the credential" error.
I'm not sure why the error is popping up. I have followed the Use T-SQL to create and manage Elastic Database Jobs article and created all of the credentials and logins as described there.
The one exception is that the master key already existed, so I didn't create that, and I also did not create a separate server for my agent host DB as suggested in some of the tutorials. My agent host DB sits on the same server as my target databases, but I would not think that would be an issue.
I have successfully created a target group and a target group member, which is the specific database on this server that I want to query. I have also created the job I want to use.
The problem happens when I try to run this:
DECLARE @step_id1 INT, @job_version1 INT;
EXEC jobs.sp_add_jobstep
    @job_name = N'Job1',
    @step_id = @step_id1 OUTPUT,
    @step_name = N'Step1',
    @command = N'select * from table',
    @credential_name = N'agentjobuser',
    @target_group_name = N'TestTarget'
I am at a loss here; I have no idea why it's saying that the credential doesn't exist. I am using the SQL Server admin login, so I should definitely have the permissions for it.
I tried to repro this and got the same error.
When the SQL Server username is given as the credential-name parameter in sp_add_jobstep, the same error is reproduced:
Cannot reference the credential 'user', because it does not exist or you do not have permission.
When the name of the database scoped credential created for the SQL Server user is given as the value of the @credential_name parameter, the step is created successfully without error.
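In other words, @credential_name must be the name of a database scoped credential in the job database, not a server login or user name. A minimal sketch under that assumption (myjobcred and its identity/secret are placeholders):

-- In the job agent database (requires an existing master key)
CREATE DATABASE SCOPED CREDENTIAL myjobcred
    WITH IDENTITY = 'jobuser', SECRET = '<strong password>';

-- Reference the credential, not the login, in the job step
EXEC jobs.sp_add_jobstep
    @job_name = N'Job1',
    @step_name = N'Step1',
    @command = N'select * from table',
    @credential_name = N'myjobcred',
    @target_group_name = N'TestTarget';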
I am working through this MS Azure article: Connect to and manage Azure Synapse Analytics workspaces in Azure Purview. But in the Grant permission to use credentials for external tables section of the article, when I run the following query, I get the error shown below:
SQL:
GRANT REFERENCES ON DATABASE SCOPED CREDENTIAL::[mySynapseWorkspace] TO [myPurviewAccountName];
Error:
mismatched input 'SCOPED' expecting ':'
Question: What may be a cause of the error, and how can we fix it?
I learned from a similar logged issue that you should be providing your credential name and username instead.
Try,
GRANT REFERENCES ON DATABASE SCOPED CREDENTIAL::[your_credential_name] TO [your_username];
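If you are unsure what the credential is actually called, you can list the database scoped credentials first (a small sketch; sys.database_scoped_credentials is the standard catalog view):

-- List existing database scoped credentials to find the exact name
SELECT name, credential_identity
FROM sys.database_scoped_credentials;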
I am trying to integrate Azure SQL Server with Zapier. To do so I need to set permissions for the Zapier IP address.
When I run the following query:
GRANT SELECT, INSERT ON db_name.test TO "zapier"#"54.86.9.50" IDENTIFIED BY "Password";
I get the following error
SQL Error [102] [S0001]: Incorrect syntax near 'zapier'
I have already tried [username]@[ip address] and that did not work either.
Could someone help me resolve this? Thanks a lot!
Hi, since it is reporting a syntax error, try using single quotes around 'zapier':
GRANT SELECT, INSERT ON db_name.test TO 'zapier'@'54.86.9.50' IDENTIFIED BY 'Password';
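Note that the 'user'@'host' ... IDENTIFIED BY form is MySQL grant syntax, which Azure SQL (T-SQL) does not accept. On Azure SQL Database you would typically allow the Zapier IP through a firewall rule and grant permissions to a contained database user instead; a hedged sketch, with the user name and password as placeholders:

-- Allow the Zapier IP through the database-level firewall (run in the user database)
EXECUTE sp_set_database_firewall_rule N'zapier', '54.86.9.50', '54.86.9.50';

-- Create a contained database user and grant table-level permissions
CREATE USER zapier WITH PASSWORD = 'Password';
GRANT SELECT, INSERT ON OBJECT::dbo.test TO zapier;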
I am building a Node.js server to run queries against BigQuery. For security reasons, I want this server to be read-only. For example, if I submit a query with a DROP, INSERT, ALTER, etc. statement, it should be rejected. However, something like SELECT * FROM DATASET.TABLE LIMIT 10 should be allowed.
To solve this problem, I decided to use a service account with "jobUser" level access. According to BQ documentation, that should allow me to run queries, but I shouldn't be able to "modify/delete tables".
So I created such a service account using the Google Cloud Console UI, and I pass its key file to the BigQuery Client Library (for Node.js) as the keyFilename parameter in the code below.
// Load the service account key path from the .env file
require( 'dotenv' ).config()
const BigQuery = require( '@google-cloud/bigquery' );

// Query goes here
const query = `
    SELECT *
    FROM \`dataset.table0\`
    LIMIT 10
`

// Creates a client
const bigquery = new BigQuery({
    projectId: process.env.BQ_PROJECT,
    keyFilename: process.env.BQ_SERVICE_ACCOUNT
});

// Use standard SQL and skip the query cache
const query_options = {
    query : query,
    useLegacySql : false,
    useQueryCache : false
}

// Run query and log results
bigquery
    .query( query_options )
    .then( console.log )
    .catch( console.log )
I then ran the above code against my test dataset/table in BigQuery. However, running this code results in the following error message (fyi: exemplary-city-194015 is my project ID for my test account):
{ ApiError: Access Denied: Project exemplary-city-194015: The user test-bq-jobuser#exemplary-city-194015.iam.gserviceaccount.com does not have bigquery.jobs.create permission in project exemplary-city-194015.
What is strange is that my service account (test-bq-jobuser#exemplary-city-194015.iam.gserviceaccount.com) has the 'Job User' role and the Job User role does contain the bigquery.jobs.create permission. So that error message doesn't make sense.
In fact, I tested all possible access control levels (dataViewer, dataEditor, ... , admin) and I get error messages for every role except the "admin" role. So either my service account isn't correctly configured or @google-cloud/bigquery has some bug. I don't want to use a service account with 'admin' level access because that allows me to run DROP TABLE-esque queries.
Solution:
I created a service account and assigned it a custom role with the bigquery.jobs.create and bigquery.tables.getData permissions, and that seemed to work. I can run basic SELECT queries, but DROP TABLE and other write operations fail, which is what I want.
As the error message shows, your service account doesn't have permission to create a BigQuery job.
You need to grant it roles/bigquery.user or roles/bigquery.jobUser access; see BigQuery Access Control Roles. As that reference shows, dataViewer and dataEditor cannot create jobs/queries, but admin can, and you don't need that.
To grant the required roles, you can follow the instructions in Granting Access to a Service Account for a Resource.
From the command line using gcloud, run:
gcloud projects add-iam-policy-binding $BQ_PROJECT \
    --member serviceAccount:$SERVICE_ACCOUNT_EMAIL \
    --role roles/bigquery.user
Where BQ_PROJECT is your project ID and SERVICE_ACCOUNT_EMAIL is your service account email/ID
Or, from the Google Cloud Platform console, search for or add your service account email/ID and give it the required roles.
I solved my own problem. To run queries you need both the bigquery.jobs.create and bigquery.tables.getData permissions. The JobUser role has the former but not the latter. I created a custom role with both permissions and assigned my service account to it, and now it works. I did this using the Google Cloud Console UI (IAM -> Roles -> +Add, then IAM -> IAM -> <set service account to custom role>).
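The same custom role can also be created and bound from the command line (a sketch; the role ID readonly_query and its title are placeholders):

# Create a custom role with just the two permissions needed for read-only queries
gcloud iam roles create readonly_query \
    --project $BQ_PROJECT \
    --title "Read-only BigQuery queries" \
    --permissions bigquery.jobs.create,bigquery.tables.getData

# Bind the service account to the custom role
gcloud projects add-iam-policy-binding $BQ_PROJECT \
    --member serviceAccount:$SERVICE_ACCOUNT_EMAIL \
    --role projects/$BQ_PROJECT/roles/readonly_query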
How can I build a connection string for connecting to an Azure SQL database using an Azure AD account?
Currently, I am using the following, but it does not seem to be correct: it does not use the authentication type Active Directory Password.
Part of the PowerShell script I am using:
server=$Server.database.windows.net;initial catalog=$Databasename;Authentication=Active Directory Password;Integrated Security=False;uid=$Username;password=$Password
I really appreciate your help.
I managed to resolve the issue; it was actually the order of the properties.
The following connection string worked:
Data Source=$Server.database.windows.net;initial catalog=Master;User ID=$Username;password=$Password;authentication=Active Directory Password
However, this does not work:
Data Source=$Server.database.windows.net;initial catalog=Master;authentication=Active Directory Password;User ID=$Username;password=$Password
The only difference is the position of the "Authentication" property.
I never thought the order of properties mattered in a connection string.
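For completeness, a minimal PowerShell sketch using the working string (assuming the machine's System.Data.SqlClient is recent enough to understand the Authentication keyword):

$connectionString = "Data Source=$Server.database.windows.net;Initial Catalog=Master;User ID=$Username;Password=$Password;Authentication=Active Directory Password"

# Open the connection and run a trivial query to verify the login works
$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = "SELECT SUSER_SNAME()"
Write-Output $command.ExecuteScalar()
$connection.Close()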
In the documentation example, they use this format:
Data Source=n9lxnyuzhv.database.windows.net;
Authentication=Active Directory Password;
Initial Catalog=testdb;
UID=bob@contoso.onmicrosoft.com;
PWD=MyPassWord!;
So try using PWD instead of Password.