How can I integrate a credential JSON file with GCP BigQuery in Node.js?

I have a JSON credential file, and I want to use it to connect to and query GCP BigQuery from Node.js.
How can I integrate with GCP BigQuery using this credential file in Node.js?
And how can I test whether the integration is valid?

You probably want the keyFilename attribute, unless I've misunderstood your question.
This GCP doc talks about authenticating using a service account key file.
So if your credentials file lived in /var/my_credentials.json (dumb path but whatever), your Node.js code would look something like this:
const {BigQuery} = require('@google-cloud/bigquery');

const options = {
  keyFilename: '/var/my_credentials.json',
  projectId: 'my_project',
};

const bigquery = new BigQuery(options);
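To answer the testing part of your question: a minimal smoke test is to run a trivial query against the client; if the key file is invalid or lacks permissions, the call rejects with an auth error. A sketch, reusing the bigquery client from above:

async function testConnection() {
  // SELECT 1 touches no tables, so it succeeds exactly when authentication works.
  const [rows] = await bigquery.query('SELECT 1 AS ok');
  console.log('BigQuery auth OK:', rows);
}

testConnection().catch((err) => console.error('BigQuery auth failed:', err));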
Also consider keeping the contents of that credentials file in Google Secret Manager: fetch it with gcloud secrets versions access latest --secret=YOUR_SECRET_NAME (secret name is a placeholder), dump the output into a temporary JSON file local to the script, then remove the temporary file once the script no longer needs it. No need to have credentials floating around on servers.
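If you'd rather not shell out to gcloud, the same idea works in-process with the @google-cloud/secret-manager client (which itself authenticates via ADC). A rough sketch, assuming a secret named bigquery-key (hypothetical) that holds the JSON key file's contents:

const fs = require('fs');
const os = require('os');
const path = require('path');
const {BigQuery} = require('@google-cloud/bigquery');
const {SecretManagerServiceClient} = require('@google-cloud/secret-manager');

async function bigQueryFromSecret() {
  const secrets = new SecretManagerServiceClient();
  // 'my_project' and 'bigquery-key' are placeholders.
  const [version] = await secrets.accessSecretVersion({
    name: 'projects/my_project/secrets/bigquery-key/versions/latest',
  });

  // Dump the key into a temporary file, readable by the owner only.
  const tmpFile = path.join(os.tmpdir(), 'bq-key.json');
  fs.writeFileSync(tmpFile, version.payload.data, {mode: 0o600});

  try {
    const bigquery = new BigQuery({keyFilename: tmpFile, projectId: 'my_project'});
    const [rows] = await bigquery.query('SELECT 1');
    console.log(rows);
  } finally {
    fs.unlinkSync(tmpFile); // remove the temporary key file when done
  }
}

bigQueryFromSecret().catch(console.error);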

Related

Setting up Google Drive API on NodeJS using a service account

I'm trying to connect to the Google Drive API with a NodeJS server using a service account. The goal is for the server to be able to authenticate as the service account, retrieve relevant files from a drive, and send them back to the user, without the user needing to log in to Google directly. This would allow me to control file access through my web app instead of having to manually share and unshare files through Drive. From my understanding of the Google Drive API, this should all be possible. The problem is that I can't even figure out how to authenticate my server. The server runs on an AWS EC2 instance. To clarify, I do not want the user to have to authenticate using the frontend interface.
I've followed the quickstart guide and set up a service account & key as instructed here, but upon creating the key as instructed in the second link, it doesn't look like I have the correct credentials.json file. The JSON file I get after generating a key on the Google Developer Console has the following object keys (values intentionally removed):
type, project_id, private_key_id, private_key, client_email, client_id, auth_uri, token_uri, auth_provider_x509_cert_url, client_x509_cert_url
The quickstart guide suggests that this file should contain client_secret and redirect_uris within an installed object (const {client_secret, client_id, redirect_uris} = credentials.installed;).
Attempting to run the quickstart's index.js file throws an error, since installed does not exist within my credentials.json. Where can I generate the necessary credentials file? Or am I on the wrong track completely?
Posts like this reference a similar issue on an older version of the quickstart documentation, but the solutions here don't help since there isn't a client_secret key in my credentials file.
Looking at the keys you show from your credentials.json file, I understand that it is the credential file of a service account. Your script, however, is for OAuth2, and an OAuth2 script cannot be used with a service account. I believe this is the reason for your current issue.
In order to use the Drive API with the service account, how about the following sample script?
Sample script:
Before you use this script, please set credentialFilename to the key file of your service account, including its path.
const { google } = require("googleapis");
const credentialFilename = "credentials.json";
const scopes = ["https://www.googleapis.com/auth/drive.metadata.readonly"];
const auth = new google.auth.GoogleAuth({keyFile: credentialFilename, scopes: scopes});
const drive = google.drive({ version: "v3", auth });
// This is a simple sample script for retrieving the file list.
drive.files.list(
  {
    pageSize: 10,
    fields: "nextPageToken, files(id, name)",
  },
  (err, res) => {
    if (err) return console.log("The API returned an error: " + err);
    const files = res.data.files;
    console.log(files);
  }
);
When this script is run, it retrieves the file list from the Google Drive of the service account itself, so please modify it for your actual situation.
This sample script uses https://www.googleapis.com/auth/drive.metadata.readonly as the scope; please adjust that for your actual situation as well.
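Note also that the service account's Drive starts out empty; to let the service account see your files, share them with its address, which is the client_email value in the key file (one of the keys you listed). A quick way to print it, assuming the same credentials.json as above:

// Prints the address to share Drive files with.
const {client_email} = require("./credentials.json");
console.log("Share files with:", client_email);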
Reference:
Google APIs Node.js Client

How to authenticate with tokens in Nodejs to a private bucket in Cloud Storage

In Python, what I usually do is get the application default credentials, get the access token, and then refresh it so I can authenticate to a private environment.
Code in Python:
import google.auth
import google.auth.transport.requests
import google.oauth2.credentials
from google.cloud import storage

# getting the credentials and project details for gcp project
credentials, your_project_id = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])

# getting request object
auth_req = google.auth.transport.requests.Request()

print(f"Checking Authentication : {credentials.valid}")
print('Refreshing token ....')
credentials.refresh(auth_req)

# check for valid credentials
print(f"Checking Authentication : {credentials.valid}")

access_token = credentials.token
credentials = google.oauth2.credentials.Credentials(access_token)

storage_client = storage.Client(project='itg-ri-consumerloop-gbl-ww-dv', credentials=credentials)
I am entirely new to NodeJS, and I am trying to do the same thing.
My goal is eventually to create an App Engine application that exposes an image from a private bucket, so credentials are a must.
How is it done?
For authentication, you could rely on the default application credentials that are present within the GCP platform (GAE, Cloud Functions, VM, etc.). Then you could just run the following piece of code from the documentation:
const {Storage} = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('albums');
const file = bucket.file('my-existing-file.png');
In most circumstances, there is no need to use authentication packages explicitly, since they are already invoked underneath by the @google-cloud/storage package in Node.js. The same holds for the google-cloud-storage package in Python. It can help to look at the source code of both packages on GitHub; for me, this really helped in understanding the authentication mechanism.
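That said, if you want the explicit token flow from your Python snippet, google-auth-library (the package the client libraries use underneath) exposes it. A rough Node.js equivalent, assuming ADC is configured:

const {GoogleAuth} = require('google-auth-library');

async function getAccessToken() {
  // Mirrors google.auth.default(scopes=[...]) from the Python snippet.
  const auth = new GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const client = await auth.getClient();
  // getAccessToken() refreshes the token if needed, like credentials.refresh().
  const {token} = await client.getAccessToken();
  console.log('Access token:', token);
  return token;
}

getAccessToken().catch(console.error);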
When I develop code on my own laptop, that interacts with google cloud storage, I first tell the gcloud SDK what my credentials are and on which GCP project I am working. I use the following commands for this:
gcloud config set project [PROJECT_ID]
gcloud auth application-default login
You could also set GOOGLE_APPLICATION_CREDENTIALS as an environment variable that points to a credentials file. Then within your code, you could pass the project name when initializing the client. This can be helpful if you are running your code outside of GCP, on another server for example.
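For example, a minimal sketch that passes both explicitly (the project ID and key file path below are placeholders):

const {Storage} = require('@google-cloud/storage');

const storage = new Storage({
  projectId: 'my-project-id',             // placeholder: your project ID
  keyFilename: './service-account.json',  // placeholder: path to your key file
});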

How to keep google-cloud-auth.json securely in app.yaml as an environmental variable?

I'm new to deployment/securing keys, and I'm not sure how to securely store the google-cloud-auth.json (auth required for creating the API client) outside of source code to prevent leaking credentials.
I've currently secured my API keys and tokens in my app.yaml file by specifying them as environment variables, which works as expected, as shown below.
runtime: nodejs10
env_variables:
  SECRET_TOKEN: "example"
  SECRET_TOKEN2: "example2"
However my google-cloud-auth.json is kept as its own file since the parameter used for creating the client requires a path string.
const {BigQuery} = require('@google-cloud/bigquery');
...
const file = "./google-cloud-auth.json";

// Creates a BigQuery client
const bigquery = new BigQuery({
  projectId: projectId,
  datasetId: datasetId,
  tableId: tableId,
  keyFilename: file
});
According to the Setting Up Authentication for Server to Server Production Applications guide:
GCP client libraries use ADC (Application Default Credentials) to find the credentials meant to be used by the app.
ADC first checks whether the GOOGLE_APPLICATION_CREDENTIALS environment variable is set to the path of a service account key file.
If the environment variable is not set, ADC falls back to the default service account provided by App Engine.
With this information I can suggest a couple of solutions to provide these credentials safely:
If you require a specific service account, set the path to its key file in the GOOGLE_APPLICATION_CREDENTIALS environment variable; this section explains how to do that, and a sketch follows after these two suggestions.
If you are not a fan of moving credential files around, I would suggest trying the default service account provided by App Engine.
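As a rough illustration of the first option, you can also point ADC at the file from code, before the client is created (the path is a placeholder):

// Sketch: set the ADC variable in-process, then let the client pick it up.
process.env.GOOGLE_APPLICATION_CREDENTIALS = './google-cloud-auth.json';

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery(); // finds the key file via ADC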
I just created a new project and deployed a basic app by mixing these 2 guides:
BigQuery Client Libraries
Quickstart for Node.js in the App Engine Standard Environment
My app.yaml had nothing more than the runtime: nodejs10 line, and I was still able to query through the BigQuery client library using the default service account.
This account comes with the Project Editor role, and you can add any additional roles you need.
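For reference, a minimal sketch of a client built with no explicit credentials, along the lines of that test; listing datasets (a call I'm adding for illustration) exercises the default service account's permissions:

const {BigQuery} = require('@google-cloud/bigquery');

// No options: App Engine's default service account is picked up via ADC.
const bigquery = new BigQuery();

bigquery.getDatasets()
  .then(([datasets]) => datasets.forEach((d) => console.log(d.id)))
  .catch(console.error);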

Google Cloud Vision reverse image search fails on Azure App Service because GOOGLE_APPLICATION_CREDENTIALS file cannot be found

I am attempting to perform a Google reverse image search using Google Cloud Vision on an Azure app service web app.
I have generated a googleCred.json, which the Google client libraries use in order to construct API requests. Google expects it to be available from an environment variable named GOOGLE_APPLICATION_CREDENTIALS.
The Azure App Service that runs the web app has settings that mimic environment variables for the Google client libraries. Following the documentation, I have successfully set the GOOGLE_APPLICATION_CREDENTIALS variable in those settings.
Furthermore, the googleCred.json file has been uploaded to the App Service via FTP with FileZilla, per the documentation I followed.
Also, the file permissions are as open as they can be.
However, when I access the web app in the cloud, I get the following error message:
Error reading credential file from location D:\site\wwwroot\Statics\googleCred.json: Could not find a part of the path 'D:\site\wwwroot\Statics\googleCred.json'. Please check the value of the Environment Variable GOOGLE_APPLICATION_CREDENTIALS
What am I doing wrong? How can I successfully use the Google Cloud Vision API on an Azure web app?
This error message is usually thrown when the application is not authenticated correctly, for reasons such as missing files, invalid credential paths, or incorrect environment variable assignments.
Based on this, I recommend validating that the credential file and file path are correctly assigned. You can also follow the Obtaining and providing service account credentials manually guide to specify your service account file directly in your code; that way you set it permanently and can verify that you are passing the service credentials correctly.
Passing the path to the service account key in code example:
// Imports the Google Cloud client library.
const {Storage} = require('@google-cloud/storage');

// Instantiates a client. Explicitly use service account credentials by
// specifying the private key file. All clients in google-cloud-node have this
// helper, see https://github.com/GoogleCloudPlatform/google-cloud-node/blob/master/docs/authentication.md
const storage = new Storage({
  keyFilename: '/path/to/keyfile.json'
});

// Makes an authenticated API request.
storage
  .getBuckets()
  .then((results) => {
    const buckets = results[0];
    console.log('Buckets:');
    buckets.forEach((bucket) => {
      console.log(bucket.name);
    });
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });
I'm writing this as an answer since I can't comment, but at a quick glance: is the "D:" in the path necessary? I assume you uploaded the file to the App Service, so try this value for the path: "\site\wwwroot\Statics\googleCred.json".
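Either way, a quick runtime check can confirm what the variable holds and whether that path actually exists on the App Service instance. A minimal sketch:

const fs = require('fs');

const credPath = process.env.GOOGLE_APPLICATION_CREDENTIALS;
console.log('GOOGLE_APPLICATION_CREDENTIALS =', credPath);
console.log('File exists?', Boolean(credPath) && fs.existsSync(credPath));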

Google Cloud Storage access without providing credentials?

I'm using Google Cloud Storage and have a few buckets containing objects that are not shared publicly. Yet I was able to retrieve a file from a local server using NodeJS without supplying any service account keys or authentication tokens.
I can't access the files from browser via the url formats (which is good):
https://www.googleapis.com/storage/v1/b/mygcstestbucket/o/20180221164035-user-IMG_0737.JPG
https://storage.googleapis.com/mygcstestbucket/20180221164035-user-IMG_0737.JPG
However, when I retrieved the file from NodeJS without credentials, it surprisingly downloaded to disk. I checked process.env to make sure there was no GOOGLE_APPLICATION_CREDENTIALS entry or any pem keys, and even ran gcloud auth revoke --all on the command line to make sure I was logged out, yet I was still able to download the file. Does this mean the files in my GCS bucket are not properly secured? Or am I somehow authenticating with the GCS API in a way I'm not aware of?
Any guidance or direction would be greatly appreciated!!
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Your Google Cloud Platform project ID
const projectId = [projectId];

// Creates a client
const storage = new Storage({
  projectId: projectId
});

// The name for the new bucket
const bucketName = 'mygcstestbucket';
var userBucket = storage.bucket(bucketName);

app.get('/getFile', function(req, res){
  let fileName = '20180221164035-user-IMG_0737.JPG';
  var file = userBucket.file(fileName);
  const options = {
    destination: `${__dirname}/temp/${fileName}`
  }
  file.download(options, function(err){
    if(err) return console.log('could not download due to error: ', err);
    console.log('File completed');
    res.json("File download completed");
  })
})
Client libraries use Application Default Credentials (ADC) to authenticate to Google APIs. So when you don't explicitly point GOOGLE_APPLICATION_CREDENTIALS at a specific service account, the library uses the default credentials. You can find more details in this documentation.
Based on your sample, I'd assume Application Default Credentials were used to fetch those files.
Lastly, you can always run echo $GOOGLE_APPLICATION_CREDENTIALS (or the equivalent for your OS) to confirm whether a service account path is set in the variable.
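If you want the same check from inside Node.js, google-auth-library can report what ADC actually resolved. A small diagnostic sketch:

const {GoogleAuth} = require('google-auth-library');

async function inspectAdc() {
  const auth = new GoogleAuth();
  const client = await auth.getClient();
  // The credential class hints at the source: JWT for a key file, Compute for
  // a metadata-server identity, UserRefreshClient for gcloud user login.
  console.log('ADC credential type:', client.constructor.name);
  console.log('ADC project:', await auth.getProjectId());
}

inspectAdc().catch(console.error);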
Create a new service account in GCP for the project and download its JSON key file. Then set the environment variables as follows (PowerShell syntax):
$env:GCLOUD_PROJECT="YOUR PROJECT ID"
$env:GOOGLE_APPLICATION_CREDENTIALS="YOUR_PATH_TO_JSON_ON_LOCAL"
