Secure Google Cloud Functions HTTP trigger with auth - security

I am trying out Google Cloud Functions today following this guide: https://cloud.google.com/functions/docs/quickstart
I created a function with an HTTP trigger, and was able to perform a POST request to trigger a function to write to Datastore.
I was wondering if there's a way I can secure this HTTP endpoint? Currently it seems that it will accept a request from anywhere/anyone.
When googling around, I see most results talk about securing things with Firebase. However, I am not using the Firebase service here.
Would my options be to either leave it open and hope no one finds the URL endpoint (security by obscurity), or implement my own auth check in the function itself?

After looking into this further, and taking a hint from ricka's answer, I have decided to implement an authentication check for my cloud functions, with a JWT passed as a bearer access token in the Authorization header.
Here's the implementation in Node:
const jwt = require('jsonwebtoken');
const jwksClient = require('jwks-rsa');

const client = jwksClient({
  cache: true,
  rateLimit: true,
  jwksRequestsPerMinute: 5,
  jwksUri: "https://<auth0-account>.auth0.com/.well-known/jwks.json"
});
function verifyToken(token, cb) {
  let decodedToken;
  try {
    decodedToken = jwt.decode(token, {complete: true});
  } catch (e) {
    console.error(e);
    cb(e);
    return;
  }
  if (!decodedToken) {
    // jwt.decode returns null (rather than throwing) on a malformed token
    cb(new Error('Could not decode token.'));
    return;
  }
  client.getSigningKey(decodedToken.header.kid, function (err, key) {
    if (err) {
      console.error(err);
      cb(err);
      return;
    }
    const signingKey = key.publicKey || key.rsaPublicKey;
    jwt.verify(token, signingKey, function (err, decoded) {
      if (err) {
        console.error(err);
        cb(err);
        return;
      }
      console.log(decoded);
      cb(null, decoded);
    });
  });
}
function checkAuth (fn) {
  return function (req, res) {
    if (!req.headers || !req.headers.authorization) {
      res.status(401).send('No authorization token found.');
      return;
    }
    // Expect "Authorization: Bearer <token>"
    const parts = req.headers.authorization.split(' ');
    if (parts.length !== 2) {
      res.status(401).send('Bad credential format.');
      return;
    }
    const scheme = parts[0];
    const credentials = parts[1];
    if (!/^Bearer$/i.test(scheme)) {
      res.status(401).send('Bad credential format.');
      return;
    }
    verifyToken(credentials, function (err) {
      if (err) {
        res.status(401).send('Invalid token');
        return;
      }
      fn(req, res);
    });
  };
}
I use jsonwebtoken to verify the JWT, and jwks-rsa to retrieve the public key. Since I use Auth0, jwks-rsa fetches the signing keys from Auth0's published JWKS endpoint.
The checkAuth function can then be used to safeguard the cloud function as:
exports.get = checkAuth(function (req, res) {
  // do things safely here
});
You can see this change on my github repo at https://github.com/tnguyen14/functions-datastore/commit/a6b32704f0b0a50cd719df8c1239f993ef74dab6
The JWT / access token can be retrieved in a number of ways. For Auth0, the API doc can be found at https://auth0.com/docs/api/authentication#authorize-client
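For example, a client can request a token via Auth0's client-credentials grant (POST to the tenant's /oauth/token endpoint, per the API doc linked above). Here's a minimal sketch; the domain, client ID/secret, and audience values are placeholders, not values from this post:

```javascript
// Sketch: build an Auth0 client-credentials token request.
// All identifiers below are placeholders -- substitute your own tenant values.
function buildTokenRequest(domain, clientId, clientSecret, audience) {
  return {
    url: `https://${domain}/oauth/token`,
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      grant_type: 'client_credentials',
      client_id: clientId,
      client_secret: clientSecret,
      audience: audience
    })
  };
}

// Send this with your HTTP client of choice (https, node-fetch, ...);
// the JSON response contains the access_token to pass as the Bearer token.
const req = buildTokenRequest(
  '<auth0-account>.auth0.com', 'my-client-id', 'my-client-secret',
  'https://my-api-identifier'
);
console.log(req.url); // https://<auth0-account>.auth0.com/oauth/token
```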
Once this is in place, you can trigger the cloud function (if yours is enabled with an HTTP trigger) with something like
curl -X POST -H "Content-Type: application/json" \
-H "Authorization: Bearer access-token" \
-d '{"foo": "bar"}' \
"https://<cloud-function-endpoint>.cloudfunctions.net/get"

I spent a day vexed over this same question three years later, and the Google documentation was, er, not very illustrative. For those who do not want to implement this in code (me), I outline below how to authenticate Cloud Functions using only the GCP Console. The following example authenticates an HTTP trigger to a new service account, which is then scheduled to run in Cloud Scheduler. You can extend and generalize this further to suit other needs.
Assumptions:
1. You have already created a Cloud Function that uses HTTP and made it require authentication.
2. Your function works when you do Test Runs. This is important; you don't want to be solving two or more problems at once later.
3. You know how to get around the GCP web browser console.
Steps
I suggest creating a new service account that will be used for the task of invoking the HTTP Cloud Function. Do this via GCP's "IAM & Admin" page: go to "Service Accounts", then "Create New".
Name your new service account. A service account ID will be auto-generated based on the name you gave; it will look like a GCP service account email, ending in "@yourproject-name.iam.gserviceaccount.com". Copy this for later. Click the "Create" button to finish the new account creation.
On the next page, you need to select a role for the service account. The best practice for just running a function is "Cloud Functions Invoker". Click the "Continue" button. You can skip the third part (granting users access to this service account).
Ok, now let's add this new service account to the cloud function that needs to be secured. Go to the Cloud Functions panel and check the box to the left of the function's name. Then, on the upper right of the same panel, click "Show Info Panel"; notice that it indicates authentication is required. (You must add members from here, not from the function's "Permissions" page; you can't add new members there.)
Now add the service account as a new member: paste the service account email you copied earlier into the member field. You must use the email address; the name alone will not work. For "Role", once again select "Cloud Functions Invoker" from the drop-down. Click Save.
Within the Cloud Function's properties you will find its HTTP trigger URL; copy yours and keep it handy for later.
Now go to Google Cloud Scheduler and select a schedule, or create one if you do not have one already.
With the schedule's box checked, click "Edit", and select "Show More" at the bottom of the initial screen to see all fields. The important fields regarding permissions:
For "URL": paste in the trigger URL you copied earlier.
For "Auth Header": select OIDC token. These tokens are managed by GCP for your project and are sufficient for authentication.
For "Service Account": paste in the same service account email from the steps above.
"Audience" will auto-fill; no need to put anything there.
When done, click "Update" or "Save" depending on your entry point.
Back in the Cloud Scheduler dashboard, run your function by clicking the "Run Now" button. If all went well, it should run and the status "Success" should appear. If not, check the logs to see what happened.
So now you know your authenticated Cloud Function works with the service account that was created for it. From here, you can do all kinds of things in the context of this service account as your projects demand.
As a check, be sure to paste the HTTP trigger URL into your browser to ensure it cannot run anonymously. You should get a 403 Forbidden response.

You can set project-wide or per-function permissions outside the function(s), so that only authenticated users can cause the function to fire, even if they try to hit the endpoint.
Here's Google Cloud Platform documentation on setting permissions and authenticating users. Note that, as of writing, I believe using this method requires users to use a Google account to authenticate.

You should not "leave it open and hope no one knows". You can implement your own security check or you may want to try the Google Function Authorizer module (https://www.npmjs.com/package/google-function-authorizer).

It seems like there are currently 2 ways to secure a Google Cloud Function HTTP endpoint.
1) Use a hard to guess function name (ex: my-function-vrf55m6f5Dvkrerytf35)
2) Check for password/credentials/signed-request within the function itself (using a header or parameter)
Probably best to do both.

You can create a custom authentication algorithm to verify the client.
Check out the algorithm from; https://security.stackexchange.com/q/210085/22239

For what it's worth, it looks like some upgrades have been made, and Google Cloud Functions now support two types of authentication and authorization: Identity and Access Management (IAM) and OAuth 2.0. Documentation can be found here

Related

Pub sub with REST API - the request is missing a valid API Key

I am using the following code to do a test publish to pubsub
var data = {
  file: 'ciao',
  content_type: 'image/png'
};

needle.post('https://pubsub.googleapis.com/v1/projects/topic:publish', data, {
    multipart: true
  },
  function (err, resp) {
    if (err)
      console.log('Error: ' + err.message);
    else
      console.log('OK. ' + JSON.stringify(resp.body));
  });
But I get the error
{"error":{"code":403,"message":"The request is missing a valid API key.","status":"PERMISSION_DENIED"}}
Do I need a service account authorized to PubSub? Any hint on how to solve this issue?
You will need to verify the credentials you are using and the permissions those credentials have.
One popular approach is to have a service-account.json file with the credential information and point the GOOGLE_APPLICATION_CREDENTIALS environment variable at it. You get that file when creating a service account for your Pub/Sub application. Examples of how to create one can be found at this link, under Setting up authentication for server to server production applications.
Now you also need to verify the permissions and roles your credential account has. For Cloud Pub/Sub there are many roles, such as roles/editor or roles/pubsub.editor, which would cover your test run. You can even use a sample called testing_permissions from the official documentation to test your access. For a full list of permissions and roles, please see this site.
For more details, you can check the access and authentication page.
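Incidentally, besides the credentials, the URL in the question is also missing the topic path segment. As a sketch (the project and topic names are placeholders), a well-formed REST publish request looks like this; note that Pub/Sub expects message data to be base64-encoded:

```javascript
// Sketch: build a correct Pub/Sub REST publish request.
// The URL needs both the project and topic segments, and each message's
// data field must be base64-encoded. 'my-project'/'my-topic' are placeholders.
function buildPublishRequest(project, topic, payload, accessToken) {
  return {
    url: `https://pubsub.googleapis.com/v1/projects/${project}/topics/${topic}:publish`,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${accessToken}`
    },
    body: JSON.stringify({
      messages: [
        { data: Buffer.from(JSON.stringify(payload)).toString('base64') }
      ]
    })
  };
}

const req = buildPublishRequest('my-project', 'my-topic',
  { file: 'ciao', content_type: 'image/png' }, 'ya29.placeholder-token');
console.log(req.url);
// https://pubsub.googleapis.com/v1/projects/my-project/topics/my-topic:publish
```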

Azure Authentication Id is not stable

I am using Azure mobile app services with Xamarin Forms.
In my app, I use web social media authentication (Facebook, Twitter, Google) configured in the azure portal.
I am taking the sid gotten from CurrentClient.Id to match it with users in my Easy Tables. However, for some users, after logging in with the same account and same provider, no match is found in my database because the sid is different! I am 100% sure that it is the same account used to login before, yet I get a different sid. How is that possible? Shouldn't it remain the same with every login or what's the whole point of it then?
You are using Azure App Service Authentication for this. There is a stable ID that is available within the JWT that you pass to the service. You can easily get it from the /.auth/me endpoint (see https://learn.microsoft.com/en-us/azure/app-service/app-service-authentication-how-to#validate-tokens-from-providers )
When you GET /.auth/me with the X-ZUMO-AUTH header set to the authenticationToken returned from the login, the user.userId field will be populated with a stable ID. So, the next question is "how do I add this / compare this within the Node.js backend?" Fortunately, the HOW-TO FAQ for Node.js explicitly answers this. Short version is, use context.user.getIdentity() (an async method) to get the identity, then do something with it:
function queryContextFromUserId(context) {
  return context.user.getIdentity().then((data) => {
    context.query.where({ id: data.userId });
    return context.execute();
  });
}

function addUserIdToContext(context) {
  return context.user.getIdentity().then((data) => {
    context.item.id = data.userId;
    return context.execute();
  });
}

table.read(queryContextFromUserId);
table.insert(addUserIdToContext);
table.update(queryContextFromUserId);
table.delete(queryContextFromUserId);
The real question here is "what is in the data block?" It's an object that contains "whatever the /.auth/me endpoint with the X-ZUMO-AUTH header produces", and that is provider dependent.
The mechanism to figure this out:
1. Debug your client application: when the login completes, inspect the client object for the CurrentUser and get the current token.
2. Use Fiddler, Insomnia, or Postman to GET .../.auth/me with an X-ZUMO-AUTH header set to the current token.
3. Repeat for each auth method you have to ensure you have the formats of each one.
You can now use these in your backend.

Azure EasyAuth: Getting Unauthorized error when try to login with a Microsoft account

This has been baffling me for hours now. I have been trying to get EasyAuth working using different providers.
I am using this on Azure Functions, so let's say my function address is
https://xxx.azurewebsites.net
If I want to login into the service using a Google account I send my post request along with token received from Google to the following address
https://xxx.azurewebsites.net/.auth/login/google
This gives me a converted token back.
However if I do the same thing with a Microsoft account using the following details
Request Body:
{ "access_token": "token-string-value" }
Endpoint:
https://xxx.azurewebsites.net/.auth/login/microsoftaccount
It gives me the following error instead of a converted token
401 Unauthorized You do not have permission to view this directory or page.
--
I am using Msal JavaScript library to get my authentication token. Also I am testing these in Postman which makes it easy to understand what the problem is before I deal with the code and other stuff.
-- Update 1.0
This does seem like a bug: even if I try to navigate directly to
https://xxx.azurewebsites.net/.auth/login/microsoftaccount
it shows me an error page. This URL works for the other providers (Google, Facebook, and Twitter); for all of them it redirects the user to the provider's login page.
According to the error page and the address bar contents, the client doesn't exist, which could be referring to the application created on Azure to allow my website to access the API. But everything has been set up correctly.
It would be helpful if someone from Azure Web App Services could take a look at this.
I have created the application and added the application ID and secret in the App Services page.
-- Update 2.0
So after hours of investigation, I managed to get the URL working. Shockingly, it was due to wrong information given on the Azure portal. The link in the Authorization and Authentication section of App Service points to a new platform for registering applications, which is purely for Azure AD based users.
For external users to be able to log in, the application needs to be registered in the following portal:
https://apps.dev.microsoft.com
After registering the application here and adding the details in the App Service blade, the URL to EasyAuth is working.
However this doesn't resolve my issue. I still need a JavaScript library that gives me valid token which I can pass to EasyAuth endpoint.
Strangely, the token obtained from MSAL is not valid for a Microsoft account; it just gives me the same error that my access is unauthorized. This means I probably need to use a different library to get a different token. I'd appreciate it if someone could still help me with this.
Below is a short sample of the code I am using to retrieve the token and pass it to another function, which calls the EasyAuth endpoint and posts the token along.
var applicationConfig = {
  clientID: "xxxx-xxx-xxxx-xxxx",
  authority: "https://login.microsoftonline.com/9fc1061d-5e26-4fd5-807e-bd969d857223",
  graphScopes: ["user.read"],
  graphEndpoint: "https://graph.microsoft.com/v1.0/me"
};

var myMSALObj = new Msal.UserAgentApplication(applicationConfig.clientID,
  applicationConfig.authority, acquireTokenRedirectCallBack,
  { storeAuthStateInCookie: true, cacheLocation: "localStorage" });

function signIn() {
  myMSALObj.loginPopup(applicationConfig.graphScopes).then(function (idToken) {
    // Login success
    acquireTokenPopupAndCallMSGraph();
  }, function (error) {
    console.log(error);
  });
}

function signOut() {
  myMSALObj.logout();
}

function acquireTokenPopupAndCallMSGraph() {
  // Call acquireTokenSilent (iframe) to obtain a token for Microsoft Graph
  myMSALObj.acquireTokenSilent(applicationConfig.graphScopes).then(function (accessToken) {
    // accessToken
  }, function (error) {
    console.log(error);
  });
}
I managed to find what was causing the problem.
So basically only Live Connect SDK generated tokens are valid on
https://xxx.azurewebsites.net/.auth/login/microsoftaccount
We were using MSAL, which generates tokens valid only for Azure Active Directory. I have been in touch with Azure Support and have asked them to update the documentation; it is currently very confusing, as none of this is explained in the EasyAuth documentation.
We decided to go with Azure AD B2C, as it's more reliable and turns out cheaper for us.
In case anyone would like to use EasyAuth with a Microsoft account, the following shows how to get an access token from the Live SDK:
WL.Event.subscribe("auth.login", onLogin);
WL.init({
  client_id: "xxxxxx",
  redirect_uri: "xxxxxx",
  scope: "wl.signin",
  response_type: "token"
});
WL.ui({
  name: "signin",
  element: "signin"
});

function onLogin(session) {
  if (!session.error) {
    var access_token = session.session.access_token;
    mobileClient.login('microsoftaccount', { 'access_token': access_token }, false)
      .then(function () {
        console.log('TODO - could enable/disable functionality etc')
      }, function (error) {
        console.log(`ERROR: ${error}`);
      });
  } else {
    console.log(`ERROR: ${session.error_description}`);
  }
}
Referencing the Live SDK script:
<script src="//js.live.net/v5.0/wl.js"></script>

GCP Consume a REST API after OAuth in Node.js

I am working to implement a Node.js webapp to be deployed on GCP App Engine.
Following the Node.js Bookshelf App sample, I managed to implement a basic user authentication flow using passport-google-oauth20 and retrieve basic profile information. I basically just got rid of what was not needed for my purposes.
My custom code is available at: gist.github.com/vdenotaris/3a6dcd713e4c3ee3a973aa00cf0a45b0.
However, I would now like to consume a GCP Cloud Storage API to retrieve all the storage objects within a given bucket with the logged-in identity.
This should be possible by:
adding a proper scope for the request.
authenticating the REST requests using the user session token obtained via OAuth.
About the post-auth handler, the documentation says:
After you obtain credentials, you can store information about the user. Passport.js automatically serializes the user to the session. After the user's information is in the session, you can make a couple of middleware functions to make it easier to work with authentication.
// Middleware that requires the user to be logged in. If the user is not logged
// in, it will redirect the user to authorize the application and then return
// them to the original URL they requested.
function authRequired (req, res, next) {
  if (!req.user) {
    req.session.oauth2return = req.originalUrl;
    return res.redirect('/auth/login');
  }
  next();
}

// Middleware that exposes the user's profile as well as login/logout URLs to
// any templates. These are available as `profile`, `login`, and `logout`.
function addTemplateVariables (req, res, next) {
  res.locals.profile = req.user;
  res.locals.login = `/auth/login?return=${encodeURIComponent(req.originalUrl)}`;
  res.locals.logout = `/auth/logout?return=${encodeURIComponent(req.originalUrl)}`;
  next();
}
But I do not see where the token is stored. How can I retrieve it, and how do I use it to consume a web service (in my case, GCP Storage)?
I am not at all a Node.js expert, so it would be nice to have a bit more clarity here: could someone explain how to proceed with consuming a REST API using the logged-in user's credentials (and thus their IAM/ACL privileges)?
If you want to access Cloud Storage with a token obtained through OAuth: when the application requires user data, it will present a consent screen asking the user to authorize the app to access some of their data. If the user approves, an access token is generated, which can be attached to the user's requests. This is better explained here.
If you plan to run your application in Google App Engine, there will be a service account prepared with the necessary authentication information, so no further setup is required. Otherwise, you may need to generate service account credentials (generally in JSON format) and point the GOOGLE_APPLICATION_CREDENTIALS environment variable at them.
Here is an example of how to authenticate and consume a REST API with the token that was obtained in the previous step. This, for example, would be a request to list objects stored in a bucket:
GET /storage/v1/b/example-bucket/o HTTP/1.1
Host: www.googleapis.com
Authorization: Bearer [YOUR_TOKEN]

Authenticating a Google Cloud Function as a service account on other Google APIs

I have an HTTP-triggered function running on Google Cloud Functions, which uses require('googleapis').sheets('v4') to write data into a docs spreadsheet.
For local development I added an account via the Service Accounts section of their developer console. I downloaded the token file (dev-key.json below) and used it to authenticate my requests to the Sheets API as follows:
const google = require('googleapis');
const sheets = google.sheets('v4');

var API_ACCT = require("./dev-key.json");

let apiClient = new google.auth.JWT(
  API_ACCT.client_email, null, API_ACCT.private_key,
  ['https://www.googleapis.com/auth/spreadsheets']
);

exports.myFunc = function (req, res) {
  var newRows = extract_rows_from_my_client_app_request(req);
  sheets.spreadsheets.values.append({
    auth: apiClient,
    // ...
    resource: { values: newRows }
  }, function (e) {
    if (e) res.status(500).json({err: "Sheets API is unhappy"});
    else res.status(201).json({ok: true});
  });
};
After I shared my spreadsheet with my service account's "email address", e.g. local-devserver@foobar-bazbuzz-123456.iam.gserviceaccount.com, it worked!
However, as I go to deploy this to the Google Cloud Functions service, I'm wondering if there's a better way to handle credentials? Can my code authenticate itself automatically without needing to bundle a JWT key file with the deployment?
I noticed that there is a FUNCTION_IDENTITY=foobar-bazbuzz-123456@appspot.gserviceaccount.com environment variable set when my function runs, but I do not know how to use this in the auth value of my googleapis call. The code for google.auth.getApplicationDefault does not use it.
Is it considered okay practice to upload a private JWT token along with my GCF code? Or should I somehow be using the metadata server for that? Or is there a built-in way that Cloud Functions already can authenticate themselves to other Google APIs?
It's common to bundle credentials with a function deployment. Just don't check them into your source control. Cloud Functions for Firebase samples do this where needed. For example, creating a signed URL from Cloud Storage requires admin credentials, and this sample illustrates saving that credential to a file to be deployed with the functions.
I'm wondering if there's a better way to handle credentials? Can my code authenticate itself automatically without needing to bundle a JWT key file with the deployment?
Yes. You can use 'Application Default Credentials' instead of how you've done it. But don't use the function getApplicationDefault(), as it has been deprecated since this question was posted.
The link above shows how to make a simple call using the google.auth.getClient API, providing the desired scope, and have it decide the credential type needed automatically. On cloud functions this will be a 'Compute' object, as defined in the google-auth-library.
These docs say it well here...
After you set up a service account, ADC can implicitly find your credentials without any need to change your code, as described in the section above.
Where ADC is Application Default Credentials.
Note that, for Cloud Functions, you use the App Engine service account:
YOUR_PROJECT_ID@appspot.gserviceaccount.com, as documented here. That is the one you found via the FUNCTION_IDENTITY env var; this rather tripped me up.
The final step is to make sure that the service account has the required access as you did with your spreadsheet.
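As a sketch of what that looks like in code, assuming a reasonably recent googleapis package (the exact export shape varies between versions, so treat this as illustrative rather than definitive):

```javascript
// Scope needed for the Sheets calls in the question.
const SCOPES = ['https://www.googleapis.com/auth/spreadsheets'];

// Sketch: let Application Default Credentials pick the runtime identity
// (the Compute credential on Cloud Functions) instead of bundling a key file.
// The require is deferred so nothing network-related runs at load time.
async function getAuthClient() {
  const { google } = require('googleapis');
  return google.auth.getClient({ scopes: SCOPES });
}

// Usage inside the function (assuming `sheets` is google.sheets('v4')):
// const auth = await getAuthClient();
// sheets.spreadsheets.values.append({ auth: auth, /* ... */ }, callback);
```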
