I am trying to connect to an AWS Redshift cluster in one account from AWS Glue set up in another account. I have set up the cross-account connectivity via an IAM role trust entity, and it works fine.
I am able to get the Redshift cluster credentials via STS. But after creating the boto3 client for Redshift with the temporary credentials, the connection attempt fails with the error below:
InterfaceError: ('communication error', TimeoutError(110, 'Connection timed out')).
Below is my setup and the relevant code:
assume_role_response = _get_sts_credentials()
if 'Credentials' in assume_role_response:
    assumed_session = boto3.Session(
        aws_access_key_id=assume_role_response['Credentials']['AccessKeyId'],
        aws_secret_access_key=assume_role_response['Credentials']['SecretAccessKey'],
        aws_session_token=assume_role_response['Credentials']['SessionToken'])
    client = assumed_session.client('redshift')
    logger.info('Getting temp Redshift credentials')
    try:
        redshift_temp_credentials = client.get_cluster_credentials(
            DbUser=redshift_user_name,
            DbName=redshift_database,
            ClusterIdentifier=redshift_cluster_id,
            AutoCreate=False)
        print('temp username is {} and password is {}'.format(
            redshift_temp_credentials['DbUser'],
            redshift_temp_credentials['DbPassword']))
        connection = redshift_connector.connect(
            host=redshift_host,
            database=redshift_database,
            user=redshift_temp_credentials['DbUser'],
            password=redshift_temp_credentials['DbPassword'])
        return connection
    except Exception:
        logger.exception('Failed to connect to Redshift')
        raise
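The timeout itself usually points at networking (the Glue connection's security group, subnet routing, or the cluster's publicly-accessible setting) rather than at the credentials, since GetClusterCredentials already succeeded. Separately, redshift_connector can perform the GetClusterCredentials call itself when given IAM credentials. A minimal sketch of wiring the assumed-role credentials into that path, assuming redshift_connector's documented iam=True options (cluster_identifier, db_user, access_key_id, secret_access_key, session_token, region); the helper only assembles the keyword arguments:

```python
def build_redshift_iam_kwargs(assume_role_response, cluster_identifier,
                              database, db_user, region):
    # Map an STS assume-role response onto redshift_connector's IAM options.
    # With iam=True, redshift_connector calls GetClusterCredentials itself,
    # so no separate boto3 'redshift' client is needed.
    creds = assume_role_response['Credentials']
    return {
        'iam': True,
        'cluster_identifier': cluster_identifier,
        'database': database,
        'db_user': db_user,
        'region': region,
        'access_key_id': creds['AccessKeyId'],
        'secret_access_key': creds['SecretAccessKey'],
        'session_token': creds['SessionToken'],
    }

# Usage sketch (needs network access to the cluster, so not run here):
# connection = redshift_connector.connect(
#     **build_redshift_iam_kwargs(assume_role_response, redshift_cluster_id,
#                                 redshift_database, redshift_user_name, 'us-east-1'))
```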
I'm using MongoDB Atlas to host my MongoDB database, and I want to use the MONGODB-AWS authentication mechanism. When I try it locally with my personal IAM user it works as it should; however, when it runs in production I get the error MongoError: bad auth : aws sts call has response 403. I run my Node.js application inside an AWS EKS cluster and have added the NodeInstanceRole used in EKS to MongoDB Atlas. I use fromNodeProviderChain() from AWS SDK v3 to get my secret access key and access key ID, and I have verified that I do get credentials.
Code to get the MongoDB URI:
import { fromNodeProviderChain } from '@aws-sdk/credential-providers'

async function getMongoUri(config) {
  const provider = fromNodeProviderChain()
  const awsCredentials = await provider()
  const accessKeyId = encodeURIComponent(awsCredentials.accessKeyId)
  const secretAccessKey = encodeURIComponent(awsCredentials.secretAccessKey)
  const clusterUrl = config.MONGODB_CLUSTER_URL
  return `mongodb+srv://${accessKeyId}:${secretAccessKey}@${clusterUrl}/?authSource=%24external&authMechanism=MONGODB-AWS`
}
Do I have to add some STS permissions for the node instance role or are the credentials I get from fromNodeProviderChain() not the same as the node instance role?
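One thing worth checking: a role (such as the EKS node role) yields temporary credentials that include a session token, and MONGODB-AWS requires that token to be passed along via authMechanismProperties=AWS_SESSION_TOKEN:..., otherwise Atlas's STS call can fail. A Python sketch of the URI shape (the question's code is Node, but the URI format is the same; all names here are illustrative):

```python
from urllib.parse import quote_plus

def build_mongodb_aws_uri(access_key_id, secret_access_key, session_token, cluster_url):
    # Build a MONGODB-AWS connection string. Temporary credentials (from an
    # assumed role) must also carry the session token, URL-encoded.
    uri = ('mongodb+srv://{user}:{pwd}@{host}/?authSource=%24external'
           '&authMechanism=MONGODB-AWS').format(
        user=quote_plus(access_key_id),
        pwd=quote_plus(secret_access_key),
        host=cluster_url)
    if session_token:
        uri += '&authMechanismProperties=AWS_SESSION_TOKEN:' + quote_plus(session_token)
    return uri
```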
I am currently looking to create a pipeline that pushes data from a google sheet to an Azure Database table whenever the sheet gets edited.
Currently I am testing an Apps Script I am writing to connect to the database I set up to store the information; however, I keep getting errors, and I am not sure what the issue in my connection string could be, as the username and password are both correct.
I also have every IPv4 range allowed on the allow list in my firewall settings.
I have been working off of the example that Google provided, and this is the code that is currently giving me the error:
Failed with an error Failed to establish a database connection. Check connection string, username and password.
var conn = Jdbc.getConnection('jdbc:sqlserver://{server}.database.windows.net:1433;databaseName={database};user={username}@{server};password={password};');
I created an Azure SQL database in the Azure portal, and I added my client IP address to the server.
I tried to connect with the connection string below:
jdbc:sqlserver://<serverName>.database.windows.net:1433;database=<DatabaseName>;user=<userName>@<serverName>;password={your_password_here};encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;
Image for reference:
I got an error, so I tried the code below:
function AcessaVendas() {
  var URL = 'jdbc:sqlserver://<serverName>.database.windows.net:1433';
  var USER = '<userName>';
  var PASS = '<password>';
  var conn = Jdbc.getConnection(URL, USER, PASS);
}
I also tried with the code below:
function myFunction() {
  var conn = Jdbc.getConnection('jdbc:sqlserver://<serverName>.database.windows.net:1433;databaseName=<databaseName>;user=<userName>@<serverName>;password=<password>;');
}
At first I got the error below:
"Failed to establish a database connection. Check connection string, username and password."
I added a firewall rule with the IP address range below:
"64.18.0.0 - 255.255.255.255"
After that, my Azure SQL database connected successfully.
Image for reference:
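The moving parts of the working connection string are easy to get wrong, in particular the user being <userName>@<serverName>. Purely to make the expected shape explicit (the actual script stays in Apps Script), here is a small Python sketch that assembles the same format:

```python
def build_azure_sql_jdbc_url(server, database, user, password):
    # Azure SQL expects the JDBC login in the form user@servername.
    return ('jdbc:sqlserver://{s}.database.windows.net:1433;'
            'databaseName={d};user={u}@{s};password={p};'
            'encrypt=true;trustServerCertificate=false;'
            'hostNameInCertificate=*.database.windows.net;'
            'loginTimeout=30;').format(s=server, d=database, u=user, p=password)
```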
I'm trying to create an AWS client for IoT, following this article: How can I publish to a MQTT topic in a Amazon AWS Lambda function?
client = boto3.client('iot-data', region_name='us-east-1')
However, I need to set a profile so that boto3 picks the correct credentials from my ~/.aws/credentials file.
The articles that describe how to do this (How to choose an AWS profile when using boto3 to connect to CloudFront) use a Session instead of creating a client directly. However, iot-data is not a "resource" that you can get from a Session:
boto_session = boto3.Session(profile_name='my-profile')
boto_client = boto_session.resource('iot-data', region_name='us-west-1')
When I try the above I get the error:
Consider using a boto3.client('iot-data') instead of a resource for 'iot-data'
And we've achieved full catch-22 status. How can I get an appropriate IoT client using an AWS profile?
IoTDataPlane does not have a resource interface. You can only use a client with IoTDataPlane:
boto_session.client('iot-data', region_name='us-west-1')
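Putting it together, a sketch in Python (the profile name, region, and topic are placeholders; the publish call needs real AWS credentials, so only the payload helper is exercised here):

```python
import json

def encode_payload(message):
    # The IoT Data Plane publish() call takes a bytes payload,
    # so JSON-encode a dict first.
    return json.dumps(message).encode('utf-8')

def publish_with_profile(profile_name, region, topic, message):
    # Profile-scoped Session, then an 'iot-data' *client* (no resource exists).
    # Requires a configured ~/.aws/credentials profile, so it is not run here.
    import boto3
    session = boto3.Session(profile_name=profile_name)
    client = session.client('iot-data', region_name=region)
    return client.publish(topic=topic, qos=1, payload=encode_payload(message))
```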
I am trying to use my AD account to connect to Azure SQL using Java 8, the JDBC driver, and my access token.
When I use Data Studio with my AD account, I can connect successfully to the Azure SQL DB.
But when I use my Java program, it gives me this error:
Request processing failed; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user ''
My code abstract:
SQLServerDataSource ds = new SQLServerDataSource();
ds.setServerName("NAME.database.windows.net");
ds.setDatabaseName("db-name");
ds.setAccessToken(accessToken);
ds.setEncrypt(true);
ds.setTrustServerCertificate(true);

try (Connection connection = ds.getConnection();
     Statement stmt = connection.createStatement();
     ResultSet rs = stmt.executeQuery("SELECT SUSER_SNAME()")) {
    if (rs.next()) {
        System.out.println("dbResults => You have successfully logged on as: " + rs.getString(1));
        res = rs.getString(1);
    }
}
After discussion in the comments, we found that the scope used when getting the access token needed to change.
"User.Read.All" was specified, which is the short form of "https://graph.microsoft.com/User.Read.All".
This means a Microsoft Graph API access token is returned, which won't work with Azure SQL DB.
Changing the scope to "https://database.windows.net/.default" resolved the issue.
This gets an access token for Azure SQL DB with the static permissions that the app registration has on Azure SQL DB.
Documentation: https://learn.microsoft.com/en-us/azure/active-directory/develop/v2-permissions-and-consent
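A quick way to confirm which resource a token was issued for is to decode its (unverified) payload and inspect the aud claim: a Graph token carries a Graph audience, while a token Azure SQL accepts has aud "https://database.windows.net/". A small debugging sketch, assuming a standard three-segment JWT:

```python
import base64
import json

def jwt_payload(token):
    # Decode the middle (payload) segment of a JWT to inspect claims
    # such as 'aud'. Debugging aid only: performs no signature verification.
    payload_b64 = token.split('.')[1]
    padded = payload_b64 + '=' * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))
```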
Here is the architecture:
I have MongoDB running on an EC2 instance. When I SSH to the EC2 instance and run mongo dbname -u username -p password, I am able to insert into the collection, so the user should have permission in dbname to perform insert on collection.
On the other side I have an AWS Lambda with its VPC configured. When using mongodb.MongoClient to connect, I am authenticated and client.isConnected() returns true, but db.collection().insertOne(...) raises an error saying that I'm not authorized to perform the action insert on dbname.
The connection string used to connect contains the same username and password I use to connect to the mongo shell.
I tried using the same options for the connection to MongoDB but used mongoose instead, and it didn't raise any errors.
What is the connection string used?
Can you try to simulate what the Lambda is doing locally? Can you try connecting with a local mongo client to your DB using the same connection string that your Lambda uses?
Are you connecting to the right DB? Is your user set up on your custom DB or on the default admin DB? Are your update rights set up on the right DB?
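A common cause of "authenticated but not authorized" is the authSource mismatch the last comment hints at: the user exists in admin but the URI authenticates against dbname, or vice versa. The rule the driver follows can be sketched in Python (the question's code is Node, but the URI semantics are the same):

```python
from urllib.parse import urlparse, parse_qs

def auth_source(connection_string, default='admin'):
    # Return the database MongoDB authenticates against for this URI:
    # explicit authSource wins; otherwise the URI's path database; else admin.
    parsed = urlparse(connection_string)
    query = parse_qs(parsed.query)
    if 'authSource' in query:
        return query['authSource'][0]
    path_db = parsed.path.lstrip('/')
    return path_db or default
```

If the user was created in admin, the URI should carry ?authSource=admin even when the path database is dbname.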