How to authenticate with tokens in Node.js to a private bucket in Cloud Storage

In Python, I usually get the application default credentials, grab the access token, and refresh it so I can authenticate against a private environment.
Code in Python:
# imports needed for the snippet below
import google.auth
import google.auth.transport.requests
import google.oauth2.credentials
from google.cloud import storage

# getting the credentials and project details for the GCP project
credentials, your_project_id = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
# getting the request object
auth_req = google.auth.transport.requests.Request()
print(f"Checking Authentication : {credentials.valid}")
print('Refreshing token ....')
credentials.refresh(auth_req)
# check for valid credentials
print(f"Checking Authentication : {credentials.valid}")
access_token = credentials.token
credentials = google.oauth2.credentials.Credentials(access_token)
storage_client = storage.Client(project='itg-ri-consumerloop-gbl-ww-dv', credentials=credentials)
I am entirely new to Node.js, and I am trying to do the same thing.
My goal later is to create an App Engine application that exposes an image found in a private bucket, so credentials are a must.
How is it done?

For authentication, you could rely on the application default credentials that are present within the GCP platform (GAE, Cloud Functions, VM, etc.). Then you could just run the following piece of code from the documentation:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('albums');
const file = bucket.file('my-existing-file.png');
In most circumstances, there is no need to explicitly use authentication packages, since they are already used under the hood by the google-cloud/storage package in Node.js. The same holds for the google-cloud-storage package in Python. It can help to look at the source code of both packages on GitHub; for me, this really helped in understanding the authentication mechanism.
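If you do want the explicit token flow from the Python snippet above, here is a minimal Node.js sketch, assuming the google-auth-library package; the bucket and file names are placeholders:
const {GoogleAuth} = require('google-auth-library');
const {Storage} = require('@google-cloud/storage');

async function main() {
  // Application default credentials, same scope as in the Python example.
  const auth = new GoogleAuth({scopes: ['https://www.googleapis.com/auth/cloud-platform']});
  // getAccessToken() refreshes the underlying token when needed.
  const accessToken = await auth.getAccessToken();
  console.log(`Got access token: ${accessToken !== null}`);

  // In practice the raw token is rarely needed: the Storage client
  // resolves the same default credentials on its own.
  const storage = new Storage({projectId: 'itg-ri-consumerloop-gbl-ww-dv'});
  // Placeholder bucket and object names below.
  const [contents] = await storage.bucket('my-private-bucket').file('my-image.png').download();
  console.log(`Downloaded ${contents.length} bytes`);
}

main().catch(console.error);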
When I develop code on my own laptop, that interacts with google cloud storage, I first tell the gcloud SDK what my credentials are and on which GCP project I am working. I use the following commands for this:
gcloud config set project [PROJECT_ID]
gcloud auth application-default login
You could also set GOOGLE_APPLICATION_CREDENTIALS as an environment variable that points to a credentials file. Then within your code, you can pass the project name when initializing the client. This can be helpful if you are running your code outside of GCP, for example on another server.
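A small sketch of that setup; the path and project id are placeholders:
// Set before starting the app, e.g.:
//   export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({
  projectId: 'my-project-id', // placeholder: pass your project explicitly
  // keyFilename: '/path/to/service-account.json', // alternative to the env variable
});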

Related

Application Default Credentials: HTTP-trigger a GCP function from a local Node.js application

I want to trigger a GCP cloud function from a simple nodejs app running locally.
Reading the documentation it should be simple:
run gcloud auth application-default login to write the ADC to the file used by client libraries.
use google-auth-library to get an HTTP client with which to trigger the function.
/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// Example: https://my-cloud-run-service.run.app/books/delete/12345
// const url = 'https://TARGET_HOSTNAME/TARGET_URL';
// Example (Cloud Functions): https://project-region-projectid.cloudfunctions.net/myFunction
const targetAudience = 'https://<REGION>-<PROJECTID>.cloudfunctions.net/<FUNCTIONNAME>';
const {GoogleAuth} = require('google-auth-library');
const auth = new GoogleAuth();
const payload = {prop1: 'prop1Value'};

async function request() {
  const client = await auth.getIdTokenClient(targetAudience);
  const resp = await client.request({url: targetAudience, method: 'POST', data: payload});
  console.info(`Resp status: ${resp.status}; resp.data: ${JSON.stringify(resp.data)}`);
}

(async () => {
  await request();
})();
My understanding was that the google-auth-library would pick up the ADC from the file setup from running gcloud auth application-default login and everything would work.
My user has permission to invoke GCP functions, as I can trigger the function using curl with the header -H "Authorization: Bearer $(gcloud auth print-identity-token)"
However when I run this, it doesn't get past the line:
const client = await auth.getIdTokenClient(targetAudience);
Failing with:
Cannot fetch ID token in this environment, use GCE or set the GOOGLE_APPLICATION_CREDENTIALS environment variable to a service account credentials JSON file.
Using the PubSub library works fine, so I expect ADC does work; I'm just not sure what I am missing when trying to trigger the GCP function.
Am I using the google-auth-library correctly here?
Thanks
As mentioned in the thread:
gcloud auth activate-service-account --key-file is only for "you" running gcloud commands; it won't be picked up by "applications" that need GOOGLE_APPLICATION_CREDENTIALS. As you can see from Invoke a Google Cloud Run from java or How to call Cloud Run from outside of Cloud Run/GCP?, you either need the JSON key file of the Service Account, or you have to be running inside a GCE/GKE/Cloud Run/App Engine/GCF instance.
For this to work in your local environment, I recommend logging in with the gcloud auth application-default login command (this command is meant to work as if you've set GOOGLE_APPLICATION_CREDENTIALS locally).
If that doesn't work, as a last resort you can refactor your code to pick up the identity token from an environment variable (if set) while working locally, for example:
$ export ID_TOKEN="$(gcloud auth print-identity-token -q)"
$ ./your-app
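One way to implement that fallback, as a sketch: read ID_TOKEN from the environment when it is set, and otherwise fall back to google-auth-library's ID token client (which works inside GCP):
const {GoogleAuth} = require('google-auth-library');

async function getIdToken(targetAudience) {
  // Local fallback: token exported beforehand with
  //   export ID_TOKEN="$(gcloud auth print-identity-token -q)"
  if (process.env.ID_TOKEN) {
    return process.env.ID_TOKEN;
  }
  // On GCE/GKE/Cloud Run/App Engine/GCF this works out of the box.
  const auth = new GoogleAuth();
  const client = await auth.getIdTokenClient(targetAudience);
  return client.idTokenProvider.fetchIdToken(targetAudience);
}
The returned token can then be sent as an Authorization: Bearer header when calling the function.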
To know more about how the code does it with a JSON key file, refer to the link and the similar implementation there.
For more information, you can refer to a similar thread, which suggests:
Give the default service account access rights to the Workspace resource(s) you're attempting to access.
Use the JSON key file you already set up locally, to have the Cloud Function run as the same user as when you run locally.
Essentially, do a hybrid where you create a new service account that ONLY has the permissions you want (instead of using the default service account or your personal user, both of which might have far more permissions than desired for safety/security), use a key file to run the Cloud Function under that identity, and give only the desired permissions to that service account.
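A sketch of that hybrid in Node.js, assuming a key file for the dedicated least-privilege service account (the file name is a placeholder):
const {GoogleAuth} = require('google-auth-library');

// Key file of the dedicated invoker service account (placeholder name).
const auth = new GoogleAuth({keyFilename: './invoker-sa-key.json'});

async function invoke(targetAudience, payload) {
  // The key file is used to mint an ID token for the function's URL.
  const client = await auth.getIdTokenClient(targetAudience);
  return client.request({url: targetAudience, method: 'POST', data: payload});
}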

How can I integrate a credential JSON file with GCP BigQuery using Node.js?

I have a JSON credential file. I want to integrate with GCP BigQuery and access it using this credential file with Node.js.
How can I do that?
How can I integrate with GCP BigQuery using the credential file in Node.js?
How can I test the result of the integration, to check whether it is valid or not?
You probably want the keyFilename attribute, unless I've misunderstood your question.
This GCP doc talks about authenticating using a service account key file.
So if your credentials file lived in /var/my_credentials.json (dumb path but whatever), your Node.js code would look something like this:
const {BigQuery} = require('@google-cloud/bigquery');
const options = {
  keyFilename: '/var/my_credentials.json',
  projectId: 'my_project',
};
const bigquery = new BigQuery(options);
Also consider: keep the contents of that credentials file in Google Secret Manager and use gcloud secrets versions access latest, dumping the output into a temporary JSON file local to the script, then remove the temporary file once the script no longer needs it. No need to have credentials floating around on servers.
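A sketch of that pattern, assuming the gcloud CLI is available where the script runs and that a secret named bq-credentials exists (the secret name and project id are assumptions):
const {execSync} = require('child_process');
const fs = require('fs');
const os = require('os');
const path = require('path');
const {BigQuery} = require('@google-cloud/bigquery');

// Pull the key out of Secret Manager into a temporary file.
const tmpKeyFile = path.join(os.tmpdir(), 'bq-key.json');
fs.writeFileSync(tmpKeyFile, execSync('gcloud secrets versions access latest --secret=bq-credentials'));

const bigquery = new BigQuery({keyFilename: tmpKeyFile, projectId: 'my_project'});

// ... run your queries, then remove the temporary key file:
fs.unlinkSync(tmpKeyFile);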

Azure SDK: use CLI creds or Managed Identity

When working with AWS, if you use aws configure to log in, you can use the AWS SDK from your local machine in any programming language without exposing credentials. If anything later runs inside AWS (Lambda, EC2, whatever), the exact same code uses the IAM role assigned to the resource, without any configuration.
I am trying to get the same to work with Azure. I thought that Azure.Identity.DefaultAzureCredential would do this, but I can't even run my code locally:
var blobServiceClient = new BlobServiceClient(storageUri, new DefaultAzureCredential());
var containerClient = await blobServiceClient.CreateBlobContainerAsync("test-container");
How can I get a BlobServiceClient that authenticates using the CLI creds on my local machine, and a managed identity when running inside an App Service?
In your scenario, the DefaultAzureCredential you used is the best choice along with the BlobServiceClient, but it does not use CLI credentials to authenticate.
To make it work, just set the environment variables AZURE_CLIENT_ID, AZURE_TENANT_ID, and AZURE_CLIENT_SECRET of your service principal. In Azure, it uses MSI to authenticate.
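Since this page is about Node.js: the same pattern with the JavaScript SDKs would look roughly like this, assuming the @azure/identity and @azure/storage-blob packages (the account URL is a placeholder):
const {DefaultAzureCredential} = require('@azure/identity');
const {BlobServiceClient} = require('@azure/storage-blob');

// Locally, DefaultAzureCredential picks up AZURE_CLIENT_ID, AZURE_TENANT_ID
// and AZURE_CLIENT_SECRET; inside Azure it falls back to the managed identity.
const blobServiceClient = new BlobServiceClient(
  'https://mystorageaccount.blob.core.windows.net', // placeholder account URL
  new DefaultAzureCredential()
);

async function main() {
  await blobServiceClient.getContainerClient('test-container').createIfNotExists();
}

main().catch(console.error);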
If you want to use CLI credentials to authenticate, there is AzureServiceTokenProvider. It can also access Azure Storage, but you cannot use it along with BlobServiceClient; you need to get the access token for the resource https://storage.azure.com:
var azureServiceTokenProvider2 = new AzureServiceTokenProvider();
string accessToken = await azureServiceTokenProvider2.GetAccessTokenAsync("https://storage.azure.com").ConfigureAwait(false);
then use the access token to call the Storage REST API. I think the first option is more convenient; which one to use is up to you.

How to keep google-cloud-auth.json securely in app.yaml as an environment variable?

I'm new to deployment/securing keys, and I'm not sure how to securely store the google-cloud-auth.json (the auth file required for creating the API client) outside of source code to prevent leaking credentials.
I've currently secured my API keys and tokens in my app.yaml file, specifying them as environment variables, which work as expected and are shown below.
runtime: nodejs10
env_variables:
  SECRET_TOKEN: "example"
  SECRET_TOKEN2: "example2"
However, my google-cloud-auth.json is kept as its own file, since the parameter used for creating the client requires a path string.
const {BigQuery} = require('@google-cloud/bigquery');
...
const file = "./google-cloud-auth.json";
// Creates a BigQuery client
const bigquery = new BigQuery({
  projectId: projectId,
  datasetId: datasetId,
  tableId: tableId,
  keyFilename: file
});
According to the Setting Up Authentication for Server to Server Production Applications:
GCP client libraries will make use of the ADC (Application Default Credentials) to find the credentials meant to be used by the app.
What ADC basically does is check whether the GOOGLE_APPLICATION_CREDENTIALS env variable is set to the path of a service account file.
In case the env variable is not set, ADC will use the default service account provided by App Engine.
With this information I can suggest a couple of solutions to provide these credentials safely:
If you require a specific service account, set the path to its key file with the GOOGLE_APPLICATION_CREDENTIALS variable. This section explains how to do that.
If you are not a fan of moving credential files around, I would suggest trying the default service account provided by App Engine.
I just created a new project and deployed a basic app by mixing these 2 guides:
BigQuery Client Libraries
Quickstart for Node.js in the App Engine Standard Environment
My app.yaml had nothing more than the runtime: nodejs10 line, and I was still able to query through the BigQuery client library, using the default service account.
This account comes with the Project Editor role, and you can add any additional roles you need.
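For reference, a minimal sketch of what that test app ran: no key file anywhere, the client silently falls back to the App Engine default service account (the query below targets a public dataset):
const {BigQuery} = require('@google-cloud/bigquery');

// No credentials passed: ADC resolves the App Engine default service account.
const bigquery = new BigQuery();

async function main() {
  const [rows] = await bigquery.query(
    'SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` LIMIT 5'
  );
  rows.forEach((row) => console.log(row.name));
}

main().catch(console.error);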

Google Cloud Vision reverse image search fails on Azure App Service because GOOGLE_APPLICATION_CREDENTIALS file cannot be found

I am attempting to perform a Google reverse image search using Google Cloud Vision on an Azure app service web app.
I have generated a googleCred.json, which the Google client libraries use in order to construct API requests. Google expects it to be available from an environment variable named GOOGLE_APPLICATION_CREDENTIALS.
The Azure App Service that runs the web app has settings that mimic environment variables for the Google client libraries. Following the documentation, I have successfully set the variable.
Furthermore, the googleCred.json file has been uploaded to the App Service, following the documentation on using FTP and FileZilla to upload the file.
Also, the file permissions are as open as they can be.
However, when I access the web app in the cloud, I get the following error message:
Error reading credential file from location D:\site\wwwroot\Statics\googleCred.json: Could not find a part of the path 'D:\site\wwwroot\Statics\googleCred.json'. Please check the value of the Environment Variable GOOGLE_APPLICATION_CREDENTIALS
What am I doing wrong? How can I successfully use the Google Cloud Vision API on an Azure web app?
This error message is usually thrown when the application is not being authenticated correctly, due to several reasons such as missing files, invalid credential paths, or incorrect environment variable assignments, among other causes.
Based on this, I recommend you validate that the credential file and file path are correctly assigned, and follow the Obtaining and providing service account credentials manually guide in order to explicitly specify your service account file directly in your code. This way, you will be able to set it permanently and verify that you are passing the service credentials correctly.
Passing the path to the service account key in code example:
// Imports the Google Cloud client library.
const {Storage} = require('@google-cloud/storage');

// Instantiates a client. Explicitly use service account credentials by
// specifying the private key file. All clients in google-cloud-node have this
// helper, see https://github.com/GoogleCloudPlatform/google-cloud-node/blob/master/docs/authentication.md
const storage = new Storage({
  keyFilename: '/path/to/keyfile.json'
});

// Makes an authenticated API request.
storage
  .getBuckets()
  .then((results) => {
    const buckets = results[0];
    console.log('Buckets:');
    buckets.forEach((bucket) => {
      console.log(bucket.name);
    });
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });
I'm writing here since I can't comment, but at a quick glance: is the "D:" in the path necessary? I assume you uploaded the file to the App Service, so try this value for the path: "\site\wwwroot\Statics\googleCred.json"
