Pub/Sub with REST API - the request is missing a valid API key - Node.js

I am using the following code to do a test publish to Pub/Sub:
var needle = require('needle');

var data = {
  file: 'ciao',
  content_type: 'image/png'
};

needle.post('https://pubsub.googleapis.com/v1/projects/topic:publish', data, {
    multipart: true
  },
  function (err, resp) {
    if (err)
      console.log('Error: ' + err.message);
    else
      console.log('OK. ' + JSON.stringify(resp.body));
  });
But I get the error
{"error":{"code":403,"message":"The request is missing a valid API key.","status":"PERMISSION_DENIED"}}
Do I need a service account authorized to PubSub? Any hint on how to solve this issue?

You will need to verify the credentials you are using and the permissions those credentials have.
One popular approach is to have a service-account.json file with the credential information and point the GOOGLE_APPLICATION_CREDENTIALS environment variable at it. You can get that file when creating a credential for your Pub/Sub application. Examples of how to create one can be found at this link under Setting up authentication for server to server production applications.
You also need to verify the permissions and roles your service account has. Cloud Pub/Sub has many roles, such as roles/editor or roles/pubsub.editor, which would cover the scope of your test run. You can even use the testing_permissions sample from the official documentation to test your access. For a full list of permissions and roles, please see this site.
For more details you can check the access and authentication page.
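As a minimal sketch of that approach, assuming GOOGLE_APPLICATION_CREDENTIALS points at your service-account.json and using placeholder project and topic names, google-auth-library can mint the OAuth access token that your needle call is missing:
// A sketch, not a drop-in fix: PROJECT_ID and TOPIC_ID are placeholders,
// and GOOGLE_APPLICATION_CREDENTIALS must point at your key file.
const {GoogleAuth} = require('google-auth-library');

async function publishTest() {
  const auth = new GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/pubsub']
  });
  const client = await auth.getClient();
  // Pub/Sub's REST publish endpoint expects JSON with base64-encoded
  // message data, not a multipart body.
  const res = await client.request({
    url: 'https://pubsub.googleapis.com/v1/projects/PROJECT_ID/topics/TOPIC_ID:publish',
    method: 'POST',
    data: {messages: [{data: Buffer.from('ciao').toString('base64')}]}
  });
  console.log('OK. ' + JSON.stringify(res.data));
}

publishTest().catch(err => console.log('Error: ' + err.message));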

Related

Setting up Google Drive API on NodeJS using a service account

I'm trying to connect to the Google Drive API with a NodeJS server using a service account. The goal is for the server to be able to authenticate as the service account, retrieve relevant files from a drive, and send them back to the user, without the user needing to log in to Google directly. This would allow me to control file access through my web app instead of having to manually share and unshare files through Drive. From my understanding of the Google Drive API, this should all be possible. The problem is that I can't even figure out how to authenticate my server. The server runs on an AWS EC2 instance. To clarify, I do not want the user to have to authenticate using the frontend interface.
I've followed the quickstart guide and set up a service account & key as instructed here, but upon creating the key as instructed in the second link, it doesn't look like I have the correct credentials.json file. The JSON file I get after generating a key on the Google Developer Console has the following object keys (values intentionally removed):
type, project_id, private_key_id, private_key, client_email, client_id, auth_uri, token_uri, auth_provider_x509_cert_url, client_x509_cert_url
The quickstart guide suggests that this file should contain client_secret and redirect_uris within some installed object (const {client_secret, client_id, redirect_uris} = credentials.installed;).
Attempting to run this index.js quickstart file causes an error to be thrown, since installed does not exist within credentials.json. Where can I generate the necessary credentials file? Or am I on the wrong track completely?
Posts like this reference a similar issue on an older version of the quickstart documentation, but the solutions here don't help since there isn't a client_secret key in my credentials file.
From the keys you listed, it appears that your credentials.json file is the credential file of a service account, while the script you are running is the OAuth2 flow from the quickstart. A service-account credential file cannot be used with that OAuth2 script, and I believe this is the reason for your current issue.
In order to use Drive API using the service account, how about the following sample script?
Sample script:
Before you use this script, please set credentialFilename to the credential file of the service account, including the path.
const { google } = require("googleapis");

const credentialFilename = "credentials.json";
const scopes = ["https://www.googleapis.com/auth/drive.metadata.readonly"];

const auth = new google.auth.GoogleAuth({ keyFile: credentialFilename, scopes: scopes });
const drive = google.drive({ version: "v3", auth });

// This is a simple sample script for retrieving the file list.
drive.files.list(
  {
    pageSize: 10,
    fields: "nextPageToken, files(id, name)",
  },
  (err, res) => {
    if (err) return console.log("The API returned an error: " + err);
    const files = res.data.files;
    console.log(files);
  }
);
When this script is run, the file list is retrieved from the Google Drive of the service account itself, which is separate from your own Drive; to let the service account see your files, share them with the service account's client_email. So, please modify this for your actual situation.
This sample script uses https://www.googleapis.com/auth/drive.metadata.readonly as the scope. Please modify this for your actual situation.
Reference:
Google APIs Node.js Client

Authenticating a Google Cloud Function as a service account on other Google APIs

I have an HTTP-triggered function running on Google Cloud Functions, which uses require('googleapis').sheets('v4') to write data into a Google Sheets spreadsheet.
For local development I added an account via the Service Accounts section of their developer console. I downloaded the token file (dev-key.json below) and used it to authenticate my requests to the Sheets API as follows:
var google = require('googleapis');
var sheets = google.sheets('v4');

var API_ACCT = require("./dev-key.json");

let apiClient = new google.auth.JWT(
  API_ACCT.client_email, null, API_ACCT.private_key,
  ['https://www.googleapis.com/auth/spreadsheets']
);

exports.myFunc = function (req, res) {
  var newRows = extract_rows_from_my_client_app_request(req);
  sheets.spreadsheets.values.append({
    auth: apiClient,
    // ...
    resource: { values: newRows }
  }, function (e) {
    if (e) res.status(500).json({err: "Sheets API is unhappy"});
    else res.status(201).json({ok: true});
  });
};
After I shared my spreadsheet with my service account's "email address", e.g. local-devserver@foobar-bazbuzz-123456.iam.gserviceaccount.com — it worked!
However, as I go to deploy this to the Google Cloud Functions service, I'm wondering if there's a better way to handle credentials? Can my code authenticate itself automatically without needing to bundle a JWT key file with the deployment?
I noticed that there is a FUNCTION_IDENTITY=foobar-bazbuzz-123456@appspot.gserviceaccount.com environment variable set when my function runs, but I do not know how to use this in the auth value to my googleapis call. The code for google.auth.getApplicationDefault does not use that.
Is it considered okay practice to upload a private JWT token along with my GCF code? Or should I somehow be using the metadata server for that? Or is there a built-in way that Cloud Functions already can authenticate themselves to other Google APIs?
It's common to bundle credentials with a function deployment. Just don't check them into your source control. Cloud Functions for Firebase samples do this where needed. For example, creating a signed URL from Cloud Storage requires admin credentials, and this sample illustrates saving that credential to a file to be deployed with the functions.
"I'm wondering if there's a better way to handle credentials? Can my code authenticate itself automatically without needing to bundle a JWT key file with the deployment?"
Yes. You can use Application Default Credentials instead of the approach you've taken, but don't use the function getApplicationDefault(), as it has been deprecated since this question was posted.
The link above shows how to make a simple call using the google.auth.getClient API, providing the desired scope, and have it decide the credential type needed automatically. On Cloud Functions this will be a 'Compute' object, as defined in the google-auth-library.
These docs say it well here:
"After you set up a service account, ADC can implicitly find your credentials without any need to change your code, as described in the section above."
Where ADC is Application Default Credentials.
Note that, for Cloud Functions, you use the App Engine service account:
YOUR_PROJECT_ID@appspot.gserviceaccount.com, as documented here. That is the one you found via the FUNCTION_IDENTITY env var - this rather tripped me up.
The final step is to make sure that the service account has the required access as you did with your spreadsheet.
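For illustration, here is a minimal sketch of the ADC approach described above, mirroring the question's myFunc; the spreadsheet ID, range, and row values are placeholders, and the runtime service account must have access to the sheet:
// Sketch only: SPREADSHEET_ID and the range are placeholders.
const {google} = require('googleapis');

exports.myFunc = async function (req, res) {
  // On Cloud Functions, ADC resolves to the runtime service account's
  // credentials automatically; no key file is bundled with the deploy.
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/spreadsheets']
  });
  const sheets = google.sheets({version: 'v4', auth});
  try {
    await sheets.spreadsheets.values.append({
      spreadsheetId: 'SPREADSHEET_ID',
      range: 'Sheet1!A1',
      valueInputOption: 'RAW',
      resource: {values: [['some', 'new', 'row']]}
    });
    res.status(201).json({ok: true});
  } catch (e) {
    res.status(500).json({err: 'Sheets API is unhappy'});
  }
};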

Secure Google Cloud Functions http trigger with auth

I am trying out Google Cloud Functions today following this guide: https://cloud.google.com/functions/docs/quickstart
I created a function with an HTTP trigger, and was able to perform a POST request to trigger a function to write to Datastore.
I was wondering if there's a way I can secure this HTTP endpoint? Currently it seems that it will accept a request from anywhere/anyone.
When googling around, I see most results talk about securing things with Firebase. However, I am not using the Firebase service here.
Would my options be either to leave it open and hope no one knows the URL endpoint (security by obscurity), or to implement my own auth check in the function itself?
After looking into this further, and taking a hint from @ricka's answer, I have decided to implement an authentication check for my cloud functions, with a JWT passed as a bearer token in the Authorization header.
Here's the implementation in Node:
const jwt = require('jsonwebtoken');
const jwksClient = require('jwks-rsa');

const client = jwksClient({
  cache: true,
  rateLimit: true,
  jwksRequestsPerMinute: 5,
  jwksUri: "https://<auth0-account>.auth0.com/.well-known/jwks.json"
});

function verifyToken(token, cb) {
  let decodedToken;
  try {
    decodedToken = jwt.decode(token, {complete: true});
  } catch (e) {
    console.error(e);
    cb(e);
    return;
  }
  // Look up the public key matching the token's key ID (kid).
  client.getSigningKey(decodedToken.header.kid, function (err, key) {
    if (err) {
      console.error(err);
      cb(err);
      return;
    }
    const signingKey = key.publicKey || key.rsaPublicKey;
    jwt.verify(token, signingKey, function (err, decoded) {
      if (err) {
        console.error(err);
        cb(err);
        return;
      }
      console.log(decoded);
      cb(null, decoded);
    });
  });
}

function checkAuth(fn) {
  return function (req, res) {
    if (!req.headers || !req.headers.authorization) {
      res.status(401).send('No authorization token found.');
      return;
    }
    // Expect a header of the form "Authorization: Bearer <token>".
    const parts = req.headers.authorization.split(' ');
    if (parts.length != 2) {
      res.status(401).send('Bad credential format.');
      return;
    }
    const scheme = parts[0];
    const credentials = parts[1];
    if (!/^Bearer$/i.test(scheme)) {
      res.status(401).send('Bad credential format.');
      return;
    }
    verifyToken(credentials, function (err) {
      if (err) {
        res.status(401).send('Invalid token');
        return;
      }
      fn(req, res);
    });
  };
}
I use jsonwebtoken to verify the JWT token, and jwks-rsa to retrieve the public key. I use Auth0, so jwks-rsa reaches out to the list of public keys to retrieve them.
The checkAuth function can then be used to safeguard the cloud function as:
exports.get = checkAuth(function (req, res) {
// do things safely here
});
You can see this change on my github repo at https://github.com/tnguyen14/functions-datastore/commit/a6b32704f0b0a50cd719df8c1239f993ef74dab6
The JWT / access token can be retrieved in a number of ways. For Auth0, the API doc can be found at https://auth0.com/docs/api/authentication#authorize-client
Once this is in place, you can trigger the cloud function (if you have yours enabled with http trigger) with something like
curl -X POST -H "Content-Type: application/json" \
  -H "Authorization: Bearer access-token" \
  -d '{"foo": "bar"}' \
  "https://<cloud-function-endpoint>.cloudfunctions.net/get"
I spent a day vexed over this same question three years later, and the Google documentation was, er, not very illustrative. For those that do not want to implement this in code (me), I outline below how to authenticate Cloud Functions using only the GCP Console. Following is an example that authenticates an HTTP trigger to a new service account that is then scheduled to run in Cloud Scheduler. You can extend and generalize this further to suit other needs.
Assumptions:
1. You have already created a Cloud Function that uses HTTP and made it require authentication.
2. Your function works when you do test runs. This is important; you don't want to be solving two or more problems at once later.
3. You know how to get around the GCP web browser console.
Steps
I suggest creating a new service account that will be used for the task of invoking the HTTP Cloud Function. Do this via GCP's "IAM & Admin" page: go to "Service Accounts", then "Create New".
Name your new service account. A service account ID will be auto-generated based on the name you gave; it will look like a GCP service account email address, e.g. your-account-name@yourproject-name.iam.gserviceaccount.com. Copy this for later. Click the "Create" button to finish the new account creation.
On the next page, you need to select a role for the service account. The best practice to just run a function is "Cloud Functions Invoker". Click the "Continue" button. You can skip the 3rd part (granting users access to this service account).
OK, now let's add this new service account to the cloud function that needs to be secured. Go to the Cloud Functions panel and check the box to the left of the name of the function. Then, on the upper right of the same panel, click "Show Info Panel"; notice in the screen that authentication is required. (You must add from here, not the function's "Permissions" page; you can't add new members from there.)
Now add the service account as a new member. Paste the service account email you copied earlier into the blank member field. You must put in the email address; the name alone will not work. For "Role", in the drop-down, once again select "Cloud Functions Invoker". Click "Save".
Within the Cloud Function's properties there are the provided HTTP triggers; copy yours and keep it handy for later.
Now go to Google Cloud Scheduler and select a schedule, or create one if you do not have one already.
With the schedule's box checked, click "Edit". Select "Show More" at the bottom of the initial screen to see all fields. The important fields regarding permissions:
For "URL" - Paste in the trigger url you copied in step 6.
For "Auth Header" select OIDC token. These are managed by the GCP for your project and sufficient for authentication.
For "Service Account" paste in the same one from the steps above.
"Audience" will auto-fill, no need to put anything there.
When done, click "Update" or "Save" depending on your entry point.
Back in the Cloud Scheduler dashboard, run your function by clicking the "Run Now" button. If all went well, it should run and the status "Success" should appear. If not, check the logs to see what happened.
So now you know your authenticated Cloud Function works with the service account that was created for it. From here, you can do all kinds of things in the context of this service account as your projects demand.
As a check, be sure to paste the HTTP trigger URL into your browser to ensure it cannot run; you should get a 403 Forbidden response.
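As an aside, if you later want other Node code (rather than Cloud Scheduler) to invoke the secured function, a minimal sketch with google-auth-library looks like this, assuming Application Default Credentials are available where it runs; the URL is a placeholder:
// Sketch: mint an OIDC identity token for the function URL, the same
// kind of token the Cloud Scheduler "OIDC token" option sends.
const {GoogleAuth} = require('google-auth-library');

async function callSecuredFunction() {
  const url = 'https://REGION-PROJECT.cloudfunctions.net/my-function'; // placeholder
  const auth = new GoogleAuth();
  const client = await auth.getIdTokenClient(url);
  const res = await client.request({url: url, method: 'POST', data: {foo: 'bar'}});
  console.log(res.status);
}

callSecuredFunction().catch(console.error);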
You can set project-wide or per-function permissions outside the function(s), so that only authenticated users can cause the function to fire, even if they try to hit the endpoint.
Here's Google Cloud Platform documentation on setting permissions and authenticating users. Note that, as of writing, I believe using this method requires users to use a Google account to authenticate.
You should not "leave it open and hope no one knows". You can implement your own security check or you may want to try the Google Function Authorizer module (https://www.npmjs.com/package/google-function-authorizer).
It seems like there are currently two ways to secure a Google Cloud Function HTTP endpoint:
1) Use a hard-to-guess function name (ex: my-function-vrf55m6f5Dvkrerytf35)
2) Check for a password/credentials/signed-request within the function itself (using a header or parameter)
Probably best to do both. A sketch of option 2 follows.
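As an illustration only, here is a minimal pre-shared-key check for option 2; the X-Api-Key header name and the FUNCTION_API_KEY environment variable are placeholder conventions, not anything standard:
// Sketch of option 2: require a pre-shared key in a request header.
// X-Api-Key and FUNCTION_API_KEY are placeholders; keep the real secret
// in configuration, not in source code.
exports.get = function (req, res) {
  const expected = process.env.FUNCTION_API_KEY;
  if (!expected || req.get('X-Api-Key') !== expected) {
    res.status(401).send('Unauthorized');
    return;
  }
  // do things safely here
  res.status(200).json({ok: true});
};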
You can create a custom authentication algorithm to verify the client.
Check out the algorithm from: https://security.stackexchange.com/q/210085/22239
For what it's worth, it looks like some upgrades have been made, and Google Cloud Functions now supports two types of authentication and authorization: Identity and Access Management (IAM) and OAuth 2.0. Documentation can be found here.

Managing tokens for a shared Google account for file storage

I am creating an application in Node.js, hosted by Heroku. Using passport.js, I have already implemented sign-in authentication with a Google account, integrated with a MySQL database.
In part of the application, I ask the applicant to upload a set of files. The way I would like to handle the uploading is through the Google Drive APIs. Essentially, the user will be able to select files, which the back end would then upload to a single-account Google Drive. Note: this will be a separate account from any of the applicant's accounts.
While I understand the process of uploading and retrieving the files, I am still unsure how the tokens work. From my research online I know:
Access tokens expire after a certain time
Refresh Tokens do not expire and are to be stored in the database
My question is how to manage these tokens in the backend. My current plan is to use the Google OAuth Playground to get the access tokens for the app under the shared account. Then, every time I need to upload or access a file, I get a new access token using the refresh token, and then use that access token to do my API calls.
However, after doing some implementation testing, I have some confusion. I went through the Google Node.js Quickstart Guide and then modified the code to do a file upload instead of a file read. The modified code is below:
var google = require('googleapis');

function fileUpload(auth) {
  var drive = google.drive({version: 'v2'});
  drive.files.insert({
    auth: auth,
    resource: {
      title: 'Test',
      mimeType: 'text/plain'
    },
    media: {
      mimeType: 'text/plain',
      body: 'TEST'
    }
  }, function (err, response) {
    if (err) {
      console.log('The API returned an error ' + err);
      return;
    } else {
      console.log('Inserted');
    }
  });
}
From my understanding, after the access token has expired, you cannot use it anymore. However, when I ran the code after the access token expired, it was still able to complete the process, and the access token did not change either. Hence, my confusion is how to manage these access tokens: specifically, whether I need to worry about access tokens expiring, or whether they remain valid once they have been used once.
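For what it's worth, the likely explanation is that the googleapis OAuth2 client refreshes an expired access token automatically whenever a refresh token has been set on it, which would account for the behavior above. A minimal sketch under that assumption; the client ID, secret, redirect URI, and stored refresh token are placeholders:
var google = require('googleapis');
var OAuth2 = google.auth.OAuth2;

// Placeholders: your app's OAuth client plus the refresh token you stored
// for the shared account.
var oauth2Client = new OAuth2(CLIENT_ID, CLIENT_SECRET, REDIRECT_URI);
oauth2Client.setCredentials({refresh_token: STORED_REFRESH_TOKEN});

// When the access token is missing or expired, the client transparently
// exchanges the refresh token for a new one before the API call runs,
// so expired access tokens do not need to be managed by hand.
fileUpload(oauth2Client);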

Accessing Google Directory API with NodeJs

I am having trouble accessing the Google Directory API using Node. What I am hoping to do is create and remove users from groups (and create and list groups and their users). In testing I have managed to access most of the APIs without trouble, but the Directory API has been impossible.
Firstly, is what I am trying to do even possible?
Secondly, if it is possible, here is a sample of my code; what am I missing?
var google = require('googleapis');
var googleAuth = require('google-oauth-jwt');
var request = require('google-oauth-jwt').requestWithJWT();

request({
  url: 'https://www.googleapis.com/admin/directory/v1/groups?domain=mydomainname.com&customer=my_customer',
  jwt: {
    email: 'created-service-account@developer.gserviceaccount.com',
    keyFile: './MyPemFile.pem',
    scopes: [
      'https://www.googleapis.com/auth/admin.directory.orgunit',
      'https://www.googleapis.com/auth/admin.directory.device.chromeos',
      'https://www.googleapis.com/auth/admin.directory.user',
      'https://www.googleapis.com/auth/admin.directory.group',
      'https://www.googleapis.com/auth/drive.readonly'
    ]
  }
}, function (err, res, body) {
  if (err) console.log("Error", err);
  console.log("BODY", JSON.parse(body));
});
I have created a project in the Developer Console and created a new client ID (service account). I am then presented with a .p12 file, which I converted to a .pem file using OpenSSL (the file path for this is given in the keyFile setting above). The client ID email address created is used in the email setting above.
I have granted the project access to the Admin SDK. I have then gone into the Admin Console and, in Security -> Advanced -> Manage API client access, granted the service account access to all the scopes requested in the above code.
Hope this makes sense; it is difficult to describe the full process. Please comment if you have any questions or need clarity on anything.
When running this code I always get a 403, "Not Authorized to access this resource/api".
Am I using the correct methodology? It is difficult to follow the Google documentation, as not all of the help files match the current menu system.
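One thing worth checking, though I can't be sure it is the cause here: the Admin SDK Directory API generally requires the service account to impersonate a super-admin user of the domain (domain-wide delegation), not merely hold the scopes. A minimal sketch using googleapis' JWT client under that assumption; the admin address, key file, and domain are placeholders:
var google = require('googleapis');

// Placeholders: service account email, its key file, and the domain
// super-admin user being impersonated via domain-wide delegation.
var authClient = new google.auth.JWT(
  'created-service-account@developer.gserviceaccount.com',
  './MyPemFile.pem',
  null,
  ['https://www.googleapis.com/auth/admin.directory.group'],
  'admin-user@mydomainname.com' // the impersonated admin
);

var admin = google.admin('directory_v1');

authClient.authorize(function (err) {
  if (err) return console.log('Auth error', err);
  admin.groups.list({
    auth: authClient,
    domain: 'mydomainname.com'
  }, function (err, resp) {
    if (err) return console.log('Error', err);
    console.log('Groups', resp);
  });
});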
