Using the AWS SDK for Node.js, I'm trying to use the credentials and permissions granted by the IAM role attached to the EC2 instance that my Node application runs on.
According to the SDK documentation, that can be done using the EC2MetadataCredentials class to assign the configuration properties for the SDK.
In the file where I use the SDK to access a DynamoDB table, I have this configuration code:
import AWS from 'aws-sdk'

AWS.config.region = 'us-east-1'
AWS.config.credentials = new AWS.EC2MetadataCredentials({
  httpOptions: { timeout: 5000 },
  maxRetries: 10,
  retryDelayOptions: { base: 200 }
})

const dynamodb = new AWS.DynamoDB({
  endpoint: 'https://dynamodb.us-east-1.amazonaws.com',
  apiVersion: '2012-08-10'
})
However, when I try to visit the web application, I always get an error saying:
Uncaught TypeError: d.default.EC2MetadataCredentials is not a constructor
Uncaught TypeError: _awsSdk2.default.EC2MetadataCredentials is not a constructor
Even though that is the exact usage from the documentation! Is there something small that I'm missing?
Update:
Removing the credentials and region definitions from the file results in another error:
Error: Missing region|credentials in config
I don't know if this is still relevant for you, but you do need to configure the EC2MetadataCredentials as it is not in the default ProviderChain (search for new AWS.CredentialProviderChain([ in node_loader.js in the SDK).
It seems you might have an old version of aws-sdk, as this code works for me:
import AWS from 'aws-sdk';
...
AWS.config.credentials = new AWS.EC2MetadataCredentials();
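For reference, a minimal sketch (assuming AWS SDK for JavaScript v2) of appending EC2MetadataCredentials to the default provider chain, so the instance role is tried after environment variables and shared credentials files:

const AWS = require('aws-sdk');

// Append EC2MetadataCredentials to the default providers so it is tried
// last, when nothing else resolves.
const chain = new AWS.CredentialProviderChain([
  ...AWS.CredentialProviderChain.defaultProviders,
  () => new AWS.EC2MetadataCredentials({ httpOptions: { timeout: 5000 } })
]);

chain.resolvePromise()
  .then((credentials) => { AWS.config.credentials = credentials; })
  .catch((err) => console.error('Could not resolve credentials:', err));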
I was facing a similar issue where the AWS SDK was not fetching credentials. According to the documentation, the SDK should fetch them automatically:
If you configure your instance to use IAM roles, the SDK automatically selects the IAM credentials for your application, eliminating the need to manually provide credentials.
I was able to solve the issue by manually fetching the credentials and providing them directly wherever required (for MongoDB Atlas in my case):
var AWS = require("aws-sdk");

AWS.config.getCredentials(function(err) {
  if (err) {
    // credentials not loaded
    console.log(err.stack);
  } else {
    console.log("Access key:", AWS.config.credentials.accessKeyId);
  }
});
Source: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/global-config-object.html
Although why the SDK is not doing it automatically is still a mystery to me. I will update the answer once I figure it out.
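For illustration, a minimal sketch of what "providing them directly" can look like for MongoDB Atlas; the cluster URL here is a placeholder, not a value from my setup:

var AWS = require("aws-sdk");

AWS.config.getCredentials(function(err) {
  if (err) return console.log(err.stack);
  // Build a MONGODB-AWS connection string from the resolved credentials.
  var accessKeyId = encodeURIComponent(AWS.config.credentials.accessKeyId);
  var secretAccessKey = encodeURIComponent(AWS.config.credentials.secretAccessKey);
  var uri = "mongodb+srv://" + accessKeyId + ":" + secretAccessKey +
    "@cluster0.example.mongodb.net/?authSource=%24external&authMechanism=MONGODB-AWS";
  // Pass `uri` to the MongoDB driver from here.
});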
Related
I'm using MongoDB Atlas to host my MongoDB database and I want to use the MONGODB-AWS authentication mechanism for authentication. When I try it locally with my personal IAM user it works as it should, however when it runs in production I get the error MongoError: bad auth : aws sts call has response 403. I run my Node.js application inside an AWS EKS cluster and I have added the NodeInstanceRole used in EKS to MongoDB Atlas. I use fromNodeProviderChain() from AWS SDK v3 to get my secret access key and access key ID and have verified that I do indeed get credentials.
Code to get the MongoDB URI:
import { fromNodeProviderChain } from '@aws-sdk/credential-providers'

async function getMongoUri(config) {
  const provider = fromNodeProviderChain()
  const awsCredentials = await provider()
  const accessKeyId = encodeURIComponent(awsCredentials.accessKeyId)
  const secretAccessKey = encodeURIComponent(awsCredentials.secretAccessKey)
  const clusterUrl = config.MONGODB_CLUSTER_URL
  return `mongodb+srv://${accessKeyId}:${secretAccessKey}@${clusterUrl}/?authSource=%24external&authMechanism=MONGODB-AWS`
}
Do I have to add some STS permissions for the node instance role or are the credentials I get from fromNodeProviderChain() not the same as the node instance role?
I want to trigger a GCP Cloud Function from a simple Node.js app running locally.
Reading the documentation, it should be simple:
run gcloud auth application-default login to write ADC to the file used by client libraries.
use google-auth-library to get an HTTP client to use to trigger the function.
/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// Example: https://my-cloud-run-service.run.app/books/delete/12345
// const url = 'https://TARGET_HOSTNAME/TARGET_URL';
// Example (Cloud Functions): https://project-region-projectid.cloudfunctions.net/myFunction
const targetAudience = 'https://<REGION>-<PROJECTID>.cloudfunctions.net/<FUNCTIONNAME>';

const { GoogleAuth } = require('google-auth-library');
const auth = new GoogleAuth();
const payload = { "prop1": "prop1Value" };

async function request() {
  const client = await auth.getIdTokenClient(targetAudience);
  const resp = await client.request({ url: targetAudience, method: 'POST', data: payload });
  console.info(`Resp status: ${resp.status}; resp.data: ${resp.data}`);
}

(async () => {
  await request();
})();
My understanding was that google-auth-library would pick up the ADC from the file written by gcloud auth application-default login and everything would work.
My user has permission to invoke GCP functions, as I can trigger the function using curl with the header -H "Authorization: bearer $(gcloud auth print-identity-token)".
However, when I run this, it doesn't get past the line:
const client = await auth.getIdTokenClient(targetAudience);
Failing with:
Cannot fetch ID token in this environment, use GCE or set the GOOGLE_APPLICATION_CREDENTIALS environment variable to a service account credentials JSON file.
Using the PubSub library works fine, so I expect ADC does work; I'm just not sure what I'm missing when trying to trigger the GCP function.
Am I using the google-auth-library correctly here ?
Thanks
As mentioned in the thread:
gcloud auth activate-service-account --key-file is only for "you" running gcloud commands; it won't be picked up by "applications" that need GOOGLE_APPLICATION_CREDENTIALS. As you can see from Invoke a Google Cloud Run from java or How to call Cloud Run from outside of Cloud Run/GCP?, you either need to have the JSON key file of a Service Account, or have to be running inside a GCE/GKE/Cloud Run/App Engine/GCF instance.
For this to work on your local environment, I recommend logging in with the gcloud auth application-default login command (this command is meant to work as if you've set GOOGLE_APPLICATION_CREDENTIALS locally).
If that doesn't work, as a last resort you can refactor your code to pick up the identity token from an environment variable (if set) while working locally, such as:
$ export ID_TOKEN="$(gcloud auth print-identity-token -q)"
$ ./your-app
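A hedged sketch of that fallback (assuming Node 18+ for the global fetch; the function name and wiring are illustrative, not from the original post):

const { GoogleAuth } = require('google-auth-library');

async function invoke(url, payload) {
  // Local fallback: use the identity token exported in the environment.
  if (process.env.ID_TOKEN) {
    const resp = await fetch(url, {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${process.env.ID_TOKEN}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(payload)
    });
    return resp.json();
  }
  // Otherwise rely on ADC (works on GCE/GKE/Cloud Run/GCF, or with
  // GOOGLE_APPLICATION_CREDENTIALS pointing at a service account key).
  const auth = new GoogleAuth();
  const client = await auth.getIdTokenClient(url);
  const resp = await client.request({ url, method: 'POST', data: payload });
  return resp.data;
}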
To know more about how the code does this with a JSON key file, refer to the link and the similar implementation there.
For more information, you can refer to a similar thread, which suggests:
Give the default service account access rights to the Workspace resource(s) you're attempting to access.
Use the JSON key file you already set up locally to have the Cloud Function run as the same user as when you run locally.
Essentially, do a hybrid where you create a new service account that ONLY has the permissions you want (instead of using the default service account or your personal user, both of which might have far more permissions than desired for safety/security), use a key file to run the Cloud Function under that identity, and only give the desired permissions to that service account.
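A minimal sketch of that last option, assuming you've downloaded a key for the dedicated invoker service account (the file path is a placeholder):

const { GoogleAuth } = require('google-auth-library');

// Load the dedicated service account key explicitly instead of relying on ADC.
const auth = new GoogleAuth({ keyFilename: '/path/to/invoker-sa-key.json' });

async function callFunction(targetAudience, payload) {
  const client = await auth.getIdTokenClient(targetAudience);
  const resp = await client.request({ url: targetAudience, method: 'POST', data: payload });
  return resp.data;
}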
I have a Node.js application running on EC2. After a certain operation, I want to stop the EC2 instance.
I am using this function to stop it:
const AWS = require("aws-sdk");

const stopInstance = () => {
  // set the region
  AWS.config.update({
    accessKeyId: "MY ACCESS KEY",
    secretAccesskey: "SECRET KEY",
    region: "us-east-1"
  });
  // create an ec2 object
  const ec2 = new AWS.EC2();
  // setup instance params
  const params = {
    InstanceIds: [
      'i-XXXXXXXX'
    ]
  };
  ec2.stopInstances(params, function(err, data) {
    if (err) {
      console.log(err, err.stack); // an error occurred
    } else {
      console.log(data); // successful response
    }
  });
}
When I run it from EC2, it gives the error:
UnauthorizedOperation: You are not authorized to perform this operation.
But when I run the same code with the same key and secret from my local machine, it works perfectly.
Permissions I have (screenshot of the attached IAM policy not reproduced here).
This will be down to the permissions of the IAM user being passed into the script.
Firstly, this error message indicates that an IAM user/role was successfully used in the request but lacked the required permissions, so invalid credentials can be ruled out.
Assuming a key and secret are being passed in successfully (they look hard-coded), you would be looking at further restrictions within the policy (such as a Condition or resource restriction).
If the key and secret are not hard-coded but instead passed in as environment variables, do some debugging to output the string values and validate these are what you expect. If they do not get passed into the SDK, then it may be falling back to an instance role that is attached.
As a point of improvement, generally when interacting with the AWS SDK/CLI from within AWS (i.e. on an EC2 instance) you should use an IAM role over an IAM user, as this leads to fewer API credentials being managed/rotated. An IAM role will rotate temporary credentials for you every few hours.
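As a sketch of that approach (assuming a recent SDK v2 where instance metadata credentials are part of the default provider chain, and an instance profile with ec2:StopInstances attached), no keys appear in code at all:

const AWS = require("aws-sdk");

// No accessKeyId/secretAccessKey here: on EC2 the SDK can resolve temporary
// credentials for the attached instance role from instance metadata.
const ec2 = new AWS.EC2({ region: "us-east-1" });

ec2.stopInstances({ InstanceIds: ["i-XXXXXXXX"] }, function(err, data) {
  if (err) console.log(err, err.stack);
  else console.log(data);
});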
If the same credentials work on your local machine, then it's probably not a permission issue, but just to further isolate the issue you can run sts get-caller-identity to check which credentials are actually being used.
https://docs.aws.amazon.com/cli/latest/reference/sts/get-caller-identity.html
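The same check can be done from inside the Node app; a small sketch using the SDK v2 STS client:

const AWS = require("aws-sdk");

// Prints the ARN of whichever identity the SDK actually resolved.
new AWS.STS().getCallerIdentity({}, function(err, data) {
  if (err) console.log(err, err.stack);
  else console.log("Running as:", data.Arn);
});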
If this does not help, create a new user, try giving it full admin access, and then use its credentials to see if the issue gets resolved. This will confirm whether or not we are facing a permission issue.
When I try to run Firebase functions with the Cloud Vision API and test the functions, I get this error:
ERROR: { Error: 7 PERMISSION_DENIED: Cloud Vision API has not been used in project 563584335869 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/vision.googleapis.com/overview?project=563584335869 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
I do not recognize this project number and I have already enabled the API with the project that I am using. I set the GOOGLE_APPLICATION_CREDENTIALS using the project with the enabled API. What is it that I'm doing wrong?
For those of you who are still having this issue, here is what worked for me:
const client = new vision.ImageAnnotatorClient({
  keyFilename: 'serviceAccountKey.json'
})
This error message is usually thrown when the application is not being authenticated correctly, for several possible reasons such as missing files, invalid credential paths, or incorrect environment variable assignments, among other causes.
Based on this, I recommend you validate that the credential file and file path are correctly assigned, and follow the Obtaining and providing service account credentials manually guide to explicitly specify your service account file directly in your code; this way you can set it permanently and verify that you are passing the service credentials correctly. Additionally, you can take a look at this link, which contains a useful step-by-step guide to using Firebase functions with the Vision API, including the Vision object authentication code for Node.js.
Example of passing the path to the service account key in code:
// Imports the Google Cloud client library.
const Storage = require('@google-cloud/storage');

// Instantiates a client. Explicitly use service account credentials by
// specifying the private key file. All clients in google-cloud-node have this
// helper, see https://github.com/GoogleCloudPlatform/google-cloud-node/blob/master/docs/authentication.md
const storage = new Storage({
  keyFilename: '/path/to/keyfile.json'
});

// Makes an authenticated API request.
storage
  .getBuckets()
  .then((results) => {
    const buckets = results[0];
    console.log('Buckets:');
    buckets.forEach((bucket) => {
      console.log(bucket.name);
    });
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });
API has not been used in project 563584335869
If you click that link printed in the console, it guides you to this URL:
https://console.developers.google.com/apis/api/vision.googleapis.com/overview?project=firebase-cli
So that project ID means you used the project credentials of 'firebase-cli', not your own.
If you tried to set the environment variable, you can find the credentials file in your directory:
~/.config/firebase/...credentials.json
Sometimes it is not replaced after you try to override it. But you can set your credentials in code.
You can find the way to obtain a credential here:
https://cloud.google.com/iam/docs/creating-managing-service-account-keys
And the credential format is like this one:
{
  "type": "service_account",
  "project_id": ...,
  "private_key_id": ...,
  "private_key": ...,
  "client_email": ...,
  "client_id": ...,
  "auth_uri": ...,
  "token_uri": ...,
  "auth_provider_x509_cert_url": ...,
  "client_x509_cert_url": ...
}
I've faced exactly the same error you have when I used another Google API, and resolved it that way, by including the credentials inside the code:
const textToSpeech = require("@google-cloud/text-to-speech")
const keyfile = require(".././my-project.json")

const config = {
  projectId: keyfile.project_id,
  keyFilename: require.resolve(".././my-project.json")
};

const TTS_Client = new textToSpeech.TextToSpeechClient(config)
I'm using Google Cloud Storage and have a few buckets that contain objects which are not shared publicly (example shown in a screenshot, not reproduced here). Yet I was able to retrieve a file without supplying any service account keys or authentication tokens from a local server using Node.js.
I can't access the files from the browser via these URL formats (which is good):
https://www.googleapis.com/storage/v1/b/mygcstestbucket/o/20180221164035-user-IMG_0737.JPG
https://storage.googleapis.com/mygcstestbucket/20180221164035-user-IMG_0737.JPG
However, when I tried retrieving the file from Node.js without credentials, surprisingly it could download the file to disk. I checked process.env to make sure there were no GOOGLE_APPLICATION_CREDENTIALS or any pem keys, and even did a gcloud auth revoke --all on the command line just to make sure I was logged out, and still I was able to download the file. Does this mean the files in my GCS bucket are not properly secured? Or am I somehow authenticating myself with the GCS API in a way I'm not aware of?
Any guidance or direction would be greatly appreciated!!
// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');

// Your Google Cloud Platform project ID
const projectId = [projectId];

// Creates a client
const storage = new Storage({
  projectId: projectId
});

// The name for the new bucket
const bucketName = 'mygcstestbucket';
var userBucket = storage.bucket(bucketName);

app.get('/getFile', function(req, res) {
  let fileName = '20180221164035-user-IMG_0737.JPG';
  var file = userBucket.file(fileName);
  const options = {
    destination: `${__dirname}/temp/${fileName}`
  };
  file.download(options, function(err) {
    if (err) return console.log('could not download due to error: ', err);
    console.log('File completed');
    res.json("File download completed");
  });
});
Client libraries use Application Default Credentials to authenticate with Google APIs. So when you don't explicitly use a specific service account via GOOGLE_APPLICATION_CREDENTIALS, the library will use the Default Credentials. You can find more details in this documentation.
Based on your sample, I'd assume the Application Default Credentials were used for fetching those files.
Lastly, you could always run echo $GOOGLE_APPLICATION_CREDENTIALS (or the equivalent for your OS) to confirm whether you've pointed the variable at a service account's key file.
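If you want to check from inside the app instead, a small sketch (using google-auth-library, which the Storage client builds on) that prints what ADC resolves to:

const { GoogleAuth } = require('google-auth-library');

async function whoAmI() {
  const auth = new GoogleAuth();
  // getCredentials() exposes the resolved credential body; service accounts
  // have a client_email, user credentials generally do not.
  const credentials = await auth.getCredentials();
  console.log('ADC resolved to:', credentials.client_email || 'user credentials');
}

whoAmI().catch(console.error);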
Create a new service account in GCP for the project and download the JSON key file. Then set the environment variables like the following (PowerShell):
$env:GCLOUD_PROJECT="YOUR PROJECT ID"
$env:GOOGLE_APPLICATION_CREDENTIALS="YOUR_PATH_TO_JSON_ON_LOCAL"
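Those are PowerShell commands; on macOS/Linux the equivalents would be:
export GCLOUD_PROJECT="YOUR PROJECT ID"
export GOOGLE_APPLICATION_CREDENTIALS="YOUR_PATH_TO_JSON_ON_LOCAL"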