I'm trying to connect to the Google Drive API with a NodeJS server using a service account. The goal is for the server to be able to authenticate as the service account, retrieve relevant files from a drive, and send them back to the user, without the user needing to log in to Google directly. This would allow me to control file access through my web app instead of having to manually share and unshare files through Drive. From my understanding of the Google Drive API, this should all be possible. The problem is that I can't even figure out how to authenticate my server. The server runs on an AWS EC2 instance. To clarify, I do not want the user to have to authenticate using the frontend interface.
I've followed the quickstart guide and set up a service account & key as instructed here, but upon creating the key as instructed in the second link, it doesn't look like I have the correct credentials.json file. The JSON file I get after generating a key on the Google Developer Console has the following object keys (values intentionally removed):
type, project_id, private_key_id, private_key, client_email, client_id, auth_uri, token_uri, auth_provider_x509_cert_url, client_x509_cert_url
The quickstart guide suggests that this file should contain client_secret and redirect_uris within an installed object (const {client_secret, client_id, redirect_uris} = credentials.installed;).
Attempting to run this index.js quickstart file causes an error to be thrown, since installed does not exist within credentials.json. Where can I generate the necessary credentials file? Or am I on the wrong track completely?
Posts like this reference a similar issue on an older version of the quickstart documentation, but the solutions there don't help, since there isn't a client_secret key in my credentials file.
From the keys you list for your credentials.json file, I understand that it is the credential file of a service account. If my understanding is correct, the script you show is for OAuth2, and in that case it cannot be used with a service account. I think this is the reason for your current issue.
In order to use the Drive API with a service account, how about the following sample script?
Sample script:
Before you use this script, please set credentialFilename to the path of your service account's credential file.
const { google } = require("googleapis");

const credentialFilename = "credentials.json";
const scopes = ["https://www.googleapis.com/auth/drive.metadata.readonly"];

const auth = new google.auth.GoogleAuth({ keyFile: credentialFilename, scopes: scopes });
const drive = google.drive({ version: "v3", auth });

// This is a simple sample script for retrieving the file list.
drive.files.list(
  {
    pageSize: 10,
    fields: "nextPageToken, files(id, name)",
  },
  (err, res) => {
    if (err) return console.log("The API returned an error: " + err);
    const files = res.data.files;
    console.log(files);
  }
);
When this script is run, the file list is retrieved from the Google Drive of the service account itself; a service account only sees files that have been shared with its client_email address, so please modify this for your actual situation. This sample also uses https://www.googleapis.com/auth/drive.metadata.readonly as the scope; please adjust that as needed.
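Since the goal in the question is also to send file contents back to the user, a follow-up sketch might look like the following. This is only a hedged example: it assumes the broader https://www.googleapis.com/auth/drive.readonly scope, and "FILE_ID" is a placeholder for the ID of a file that has been shared with the service account's client_email address.

const { google } = require("googleapis");

const auth = new google.auth.GoogleAuth({
  keyFile: "credentials.json",
  // Reading file contents requires more than the drive.metadata.readonly scope.
  scopes: ["https://www.googleapis.com/auth/drive.readonly"],
});
const drive = google.drive({ version: "v3", auth });

// "FILE_ID" is a placeholder for a file shared with the service account.
drive.files.get(
  { fileId: "FILE_ID", alt: "media" },
  { responseType: "stream" },
  (err, res) => {
    if (err) return console.error("The API returned an error: " + err);
    res.data.pipe(process.stdout); // e.g. pipe into an Express response instead
  }
);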
Reference:
Google APIs Node.js Client
Related
I'm calling the Google Analytics Reporting API using google-api-nodejs-client to show the number of visits inside a blog.
This blog is hosted inside Google App Engine Standard Environment.
In development, I'm authenticating my API calls using Application Default Credentials. I downloaded the JSON credentials file from the service account I created exclusively for analytics purposes, set its path in the GOOGLE_APPLICATION_CREDENTIALS environment variable, and everything worked. I'm able to get the data from Google Analytics and display it on the website.
But this is not working in production. I suppose getClient() is not getting the credentials in that environment.
Things to note: 1) I did not upload the downloaded JSON credentials file from the service account (I think it would be counterintuitive and unsafe to do that, and from what I understood in the docs, GCP is able to handle API authentication automatically).
const {google} = require("googleapis");

async function main () {
  // This method looks for the GCLOUD_PROJECT and GOOGLE_APPLICATION_CREDENTIALS
  // environment variables.
  const auth = await google.auth.getClient({
    // Scope of the analytics reporting,
    // with only reading access.
    scopes: 'https://www.googleapis.com/auth/analytics.readonly',
  });

  // Create the analytics reporting object
  const analyticsreporting = await google.analyticsreporting({
    version: 'v4',
    auth: auth,
  });

  // Fetch the analytics reporting
  const res = await analyticsreporting.reports.batchGet({...});
  return res.data;
}
I've already run out of options. Can someone help me with this?
This is a problem with the default scopes and application default credentials. By default, if you don't create a new service account, you are going to get 'application default credentials' from the GCE metadata service:
https://cloud.google.com/docs/authentication/production#auth-cloud-implicit-nodejs
Those credentials usually only have the cloud-platform scope, and the set of scopes cannot be changed (as of today). To make this work, you have a few options.
You could create a new service account, download the service account key, and use the keyFile property in the getClient method options to reference the key. If you do it this way, the scopes you pass into getClient will be respected.
You could play with the scopes available to the service account under which your GAE application is running. I haven't personally tried that, but it theoretically should be possible.
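To illustrate the first option, here is a rough sketch; the key file path is a placeholder, and it follows the keyFile option of getClient described above:

const {google} = require('googleapis');

async function getAnalyticsReporting() {
  // Reference the downloaded service account key explicitly so that the
  // requested scope is respected instead of the GCE default credentials.
  const auth = await google.auth.getClient({
    keyFile: '/path/to/service-account-key.json', // placeholder path
    scopes: 'https://www.googleapis.com/auth/analytics.readonly',
  });

  // Create the analytics reporting object with the explicit credentials.
  return google.analyticsreporting({version: 'v4', auth});
}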
Best of luck!
I am attempting to perform a Google reverse image search using Google Cloud Vision on an Azure app service web app.
I have generated a googleCred.json, which the Google client libraries use in order to construct API requests. Google expects it to be available from an environment variable named GOOGLE_APPLICATION_CREDENTIALS.
The Azure app service that runs the web app has settings that mimic environment variables for the Google client libraries. The documentation is here, and I have successfully set the variable.
Furthermore, the googleCred.json file has been uploaded to the app service. Here is the documentation I followed to use FTP and FileZilla to upload the file.
Also, the file permissions are as open as they can be.
However, when I access the web app in the cloud, I get the following error message:
Error reading credential file from location D:\site\wwwroot\Statics\googleCred.json: Could not find a part of the path 'D:\site\wwwroot\Statics\googleCred.json'. Please check the value of the Environment Variable GOOGLE_APPLICATION_CREDENTIALS
What am I doing wrong? How can I successfully use the Google Cloud Vision API on an Azure web app?
This error message is usually thrown when the application is not being authenticated correctly, for reasons such as missing files, invalid credential paths, or incorrect environment variable assignments, among other causes.
Based on this, I recommend you validate that the credential file and file path are being correctly assigned, and follow the Obtaining and providing service account credentials manually guide to explicitly specify your service account file directly in your code; this way, you can set it permanently and verify that you are passing the service credentials correctly.
Example of passing the path to the service account key in code:
// Imports the Google Cloud client library.
const Storage = require('@google-cloud/storage');

// Instantiates a client. Explicitly use service account credentials by
// specifying the private key file. All clients in google-cloud-node have this
// helper, see https://github.com/GoogleCloudPlatform/google-cloud-node/blob/master/docs/authentication.md
const storage = new Storage({
  keyFilename: '/path/to/keyfile.json'
});

// Makes an authenticated API request.
storage
  .getBuckets()
  .then((results) => {
    const buckets = results[0];
    console.log('Buckets:');
    buckets.forEach((bucket) => {
      console.log(bucket.name);
    });
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });
I'm writing here since i can't comment, but at a quick glance, is the "D:" in the path necessary? I assume you uploaded the file to the app service so try with this value for the path "\site\wwwroot\Statics\googleCred.json"
I'm using Google Cloud Storage and have a few buckets that contain objects which are not shared publicly. Example in screenshot below. Yet I was able to retrieve a file without supplying any service account keys or authentication tokens, from a local server using NodeJS.
I can't access the files from browser via the url formats (which is good):
https://www.googleapis.com/storage/v1/b/mygcstestbucket/o/20180221164035-user-IMG_0737.JPG
https://storage.googleapis.com/mygcstestbucket/20180221164035-user-IMG_0737.JPG
However, when I tried retrieving the file from NodeJS without credentials, surprisingly it could download the file to disk. I checked process.env to make sure there was no GOOGLE_APPLICATION_CREDENTIALS variable or any pem keys, and even did a gcloud auth revoke --all on the command line just to make sure I was logged out, and still I was able to download the file. Does this mean that the files in my GCS bucket are not properly secured? Or am I somehow authenticating myself with the GCS API in a way I'm not aware of?
Any guidance or direction would be greatly appreciated!!
// Imports the Google Cloud client library
const Storage = require('@google-cloud/storage');
// Express app used to expose the download endpoint
const express = require('express');
const app = express();

// Your Google Cloud Platform project ID
const projectId = 'your-project-id'; // placeholder

// Creates a client
const storage = new Storage({
  projectId: projectId
});

// The name of the bucket
const bucketName = 'mygcstestbucket';
var userBucket = storage.bucket(bucketName);

app.get('/getFile', function(req, res){
  let fileName = '20180221164035-user-IMG_0737.JPG';
  var file = userBucket.file(fileName);
  const options = {
    destination: `${__dirname}/temp/${fileName}`
  };
  file.download(options, function(err){
    if(err) return console.log('could not download due to error: ', err);
    console.log('File completed');
    res.json("File download completed");
  });
});
Client libraries use Application Default Credentials to authenticate to Google APIs. So when you don't explicitly use a specific service account via GOOGLE_APPLICATION_CREDENTIALS, the library will use the Default Credentials. You can find more details in this documentation.
Based on your sample, I'd assume the Application Default Credentials were used for fetching those files.
Lastly, you could always run echo $GOOGLE_APPLICATION_CREDENTIALS (or the equivalent for your OS) to confirm whether you've pointed the variable at a service account's key file.
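One way to check which identity the client library has actually resolved is to ask the auth library directly. A rough sketch (using google-auth-library; note that getCredentials may behave differently when end-user rather than service account credentials are found):

const {GoogleAuth} = require('google-auth-library');

async function printResolvedIdentity() {
  const auth = new GoogleAuth();
  // Reports the service account email that Application Default Credentials
  // resolved to, which helps confirm whether ambient credentials are in play.
  const credentials = await auth.getCredentials();
  console.log(credentials.client_email || '(no service account email found)');
}

printResolvedIdentity().catch(console.error);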
Create a new service account in GCP for the project and download the JSON key file. Then set the environment variables like the following (PowerShell):
$env:GCLOUD_PROJECT="YOUR PROJECT ID"
$env:GOOGLE_APPLICATION_CREDENTIALS="YOUR_PATH_TO_JSON_ON_LOCAL"
I have an HTTP-triggered function running on Google Cloud Functions, which uses require('googleapis').sheets('v4') to write data into a docs spreadsheet.
For local development I added an account via the Service Accounts section of their developer console. I downloaded the token file (dev-key.json below) and used it to authenticate my requests to the Sheets API as follows:
const google = require('googleapis');
const sheets = google.sheets('v4');

var API_ACCT = require("./dev-key.json");

let apiClient = new google.auth.JWT(
  API_ACCT.client_email, null, API_ACCT.private_key,
  ['https://www.googleapis.com/auth/spreadsheets']
);

exports.myFunc = function (req, res) {
  var newRows = extract_rows_from_my_client_app_request(req);

  sheets.spreadsheets.values.append({
    auth: apiClient,
    // ...
    resource: { values: newRows }
  }, function (e) {
    if (e) res.status(500).json({err:"Sheets API is unhappy"});
    else res.status(201).json({ok:true});
  });
};
After I shared my spreadsheet with my service account's "email address" e.g. local-devserver@foobar-bazbuzz-123456.iam.gserviceaccount.com — it worked!
However, as I go to deploy this to the Google Cloud Functions service, I'm wondering if there's a better way to handle credentials? Can my code authenticate itself automatically without needing to bundle a JWT key file with the deployment?
I noticed that there is a FUNCTION_IDENTITY=foobar-bazbuzz-123456@appspot.gserviceaccount.com environment variable set when my function runs, but I do not know how to use this in the auth value to my googleapis call. The code for google.auth.getApplicationDefault does not use that.
Is it considered okay practice to upload a private JWT token along with my GCF code? Or should I somehow be using the metadata server for that? Or is there a built-in way that Cloud Functions already can authenticate themselves to other Google APIs?
It's common to bundle credentials with a function deployment. Just don't check them into your source control. Cloud Functions for Firebase samples do this where needed. For example, creating a signed URL from Cloud Storage requires admin credentials, and this sample illustrates saving that credential to a file to be deployed with the functions.
"I'm wondering if there's a better way to handle credentials? Can my code authenticate itself automatically without needing to bundle a JWT key file with the deployment?"
Yes. You can use Application Default Credentials instead of how you've done it, but don't use the function getApplicationDefault(), as it has been deprecated since this question was posted.
The link above shows how to make a simple call using the google.auth.getClient API, providing the desired scope and having it decide the credential type automatically. On Cloud Functions this will be a 'Compute' object, as defined in the google-auth-library.
These docs say it well here...
"After you set up a service account, ADC can implicitly find your credentials without any need to change your code, as described in the section above."
Where ADC is Application Default Credentials.
Note that, for Cloud Functions, you use the App Engine service account: YOUR_PROJECT_ID@appspot.gserviceaccount.com, as documented here. That is the one you found via the FUNCTION_IDENTITY env var - this rather tripped me up.
The final step is to make sure that the service account has the required access as you did with your spreadsheet.
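Putting that together with the original function, a hedged sketch of the ADC-based version might look like the following; the spreadsheet ID and range are placeholders, and the spreadsheet still has to be shared with the function's service account email:

const {google} = require('googleapis');
const sheets = google.sheets('v4');

exports.myFunc = async (req, res) => {
  // On Cloud Functions this resolves to the runtime service account
  // (YOUR_PROJECT_ID@appspot.gserviceaccount.com) via Application Default Credentials.
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/spreadsheets'],
  });

  const newRows = [['example']]; // replace with rows extracted from req

  try {
    await sheets.spreadsheets.values.append({
      auth,
      spreadsheetId: 'SPREADSHEET_ID', // placeholder
      range: 'Sheet1!A1',              // placeholder
      valueInputOption: 'RAW',
      resource: { values: newRows },
    });
    res.status(201).json({ ok: true });
  } catch (e) {
    res.status(500).json({ err: 'Sheets API is unhappy' });
  }
};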
I am having trouble accessing the Google Directory API using node. What I am hoping to do is create and remove users from Groups (and create and list groups and their users). In testing I have managed to access most of the APIs without trouble but the Directory has been impossible.
Firstly, is what I am trying to do even possible?
Secondly, if it is possible, here is a sample of my code; what am I missing?
var google = require('googleapis');
var googleAuth = require('google-oauth-jwt');
var request = require('google-oauth-jwt').requestWithJWT();

request({
  url: 'https://www.googleapis.com/admin/directory/v1/groups?domain=mydomainname.com&customer=my_customer',
  jwt: {
    email: 'created-service-account@developer.gserviceaccount.com',
    keyFile: './MyPemFile.pem',
    scopes: [
      'https://www.googleapis.com/auth/admin.directory.orgunit',
      'https://www.googleapis.com/auth/admin.directory.device.chromeos',
      'https://www.googleapis.com/auth/admin.directory.user',
      'https://www.googleapis.com/auth/admin.directory.group',
      'https://www.googleapis.com/auth/drive.readonly'
    ]
  }
}, function (err, res, body) {
  if (err) console.log("Error", err);
  console.log("BODY", JSON.parse(body));
});
I have created a project in the Developer Console and created a new clientId (Service Account). I am then presented with a p12 file, which I convert to a pem file using OpenSSL (the path to this file is given in the keyFile setting above). The clientId email address created is used in the email setting above.
I have granted the project access to the Admin SDK. I have then gone into the Admin Console and, in Security -> Advanced -> Manage API client access, granted the Service Account access to all the scopes requested in the above code.
Hope, this makes sense, it is difficult to describe the full process. Please comment if you have any questions or need clarity on anything.
When running this code I always get a 403, "Not Authorized to access this resource/api".
Am I using the correct methodology? It is difficult to follow the Google documentation, as not all of the help files match the current menu system.
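One detail that differs from the other APIs in this thread: when a service account calls the Admin SDK Directory API, it generally has to impersonate an admin user in the domain through domain-wide delegation, otherwise the API returns exactly this kind of 403. A rough sketch with the googleapis client library instead of google-oauth-jwt (the admin email, key file path, and domain are placeholders) might look like:

const {google} = require('googleapis');

// The 'subject' is the email of an admin user in the domain; with domain-wide
// delegation enabled, the service account acts on that user's behalf.
const auth = new google.auth.JWT({
  email: 'created-service-account@developer.gserviceaccount.com',
  keyFile: './MyPemFile.pem', // a JSON key file also works here
  scopes: ['https://www.googleapis.com/auth/admin.directory.group'],
  subject: 'admin-user@mydomainname.com', // placeholder admin to impersonate
});

const admin = google.admin({version: 'directory_v1', auth});

admin.groups.list({domain: 'mydomainname.com'}, (err, res) => {
  if (err) return console.error('Error', err);
  console.log(res.data.groups);
});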