NodeJS: Google Sign-In for server-side apps

Following this documentation I succeeded in performing Google Sign-In for server-side apps and getting access to the user's Google Calendar using Python on the server side. I am failing to do the same with NodeJS.
In a nutshell: with Python I used the auth_code sent from the browser and got the credentials like this:
from oauth2client import client

credentials = client.credentials_from_clientsecrets_and_code(
    CLIENT_SECRET_FILE,
    ['https://www.googleapis.com/auth/drive.appdata', 'profile', 'email'],
    auth_code)
Then I could store the following value in the DB:
gc_credentials_json = credentials.to_json()
And later rebuild the credentials (yes, it uses the refresh_token on its own when needed):
client.Credentials.new_from_json(gc_credentials_json)
So I want to do the same using NodeJS:
easily generate credentials using just CLIENT_SECRET_FILE, scopes and auth_code (just like I did with Python)
get working credentials back from the previously stored value without having to check myself whether the access token is expired - I prefer a ready, community-tested solution
Thank you in advance!

I've implemented it using the google-auth-library package.
Here is the function to retrieve the gcClient:
const {OAuth2Client} = require('google-auth-library');
const {google} = require('googleapis');

// Client configuration downloaded from the Google Cloud console (adjust the path).
const downloadedCredentialsJson = require('./client_secret.json');

const performAuth = async () => {
  const tokens = await parseTokenFromCurrentDB();
  const auth = new OAuth2Client(
    downloadedCredentialsJson.web.client_id,
    downloadedCredentialsJson.web.client_secret
  );
  // Persist refreshed tokens so the stored credentials stay valid.
  auth.on('tokens', async (newTokens) => {
    await updateDBWithNewTokens(newTokens);
  });
  auth.setCredentials(tokens);
  const gcClient = google.calendar({version: 'v3', auth});
  return gcClient;
};
Here is a template for the parseTokenFromCurrentDB function, just to give an idea of its output:
const parseTokenFromCurrentDB = async () => {
  // Put here your code to retrieve from DB the values below
  return {
    access_token,
    token_type,
    refresh_token,
    expiry_date,
  };
};
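For completeness, here is a minimal sketch of the updateDBWithNewTokens counterpart used above (the db.saveGoogleTokens helper is hypothetical, replace it with your own persistence layer); the important detail is merging, so the stored refresh_token is not lost when Google only sends back a fresh access_token:
const updateDBWithNewTokens = async (newTokens) => {
  // Merge into the stored record: Google returns the refresh_token only once,
  // so never overwrite it with an undefined value.
  const stored = await parseTokenFromCurrentDB();
  await db.saveGoogleTokens({ ...stored, ...newTokens }); // hypothetical DB helper
};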
So, using this implementation (here wrapped in my own gc module), one can get the gcClient:
const gcClient = await gc.getGcClient(org);
and use it through my own wrapper functions, e.g.:
const gcInfo = await gc.getInfo(gcClient);
const events = await gc.getEvents(gcClient, calcPeriodInGcFormat());
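For the initial exchange (the Node analogue of Python's credentials_from_clientsecrets_and_code), the same OAuth2Client can trade the auth_code from the browser for tokens. A hedged sketch, assuming the code was obtained via the browser-side Google Sign-In flow (the 'postmessage' redirect URI applies to that flow; adjust it if you use a regular redirect):
const exchangeAuthCode = async (authCode) => {
  const auth = new OAuth2Client(
    downloadedCredentialsJson.web.client_id,
    downloadedCredentialsJson.web.client_secret,
    'postmessage' // redirect URI for the browser sign-in flow; use your real redirect URL otherwise
  );
  const {tokens} = await auth.getToken(authCode);
  await updateDBWithNewTokens(tokens); // persist access_token, refresh_token, expiry_date, token_type
  return tokens;
};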

How do I call Google Analytics Admin API (for GA4) using an OAuth2 client in node.js?

I've noticed that all the node.js code samples for Google Analytics Admin and Google Analytics Data assume a service account and either a JSON file or a GOOGLE_APPLICATION_CREDENTIALS environment variable.
e.g.
const analyticsAdmin = require('@google-analytics/admin');

async function main() {
  // Instantiates a client using default credentials.
  // TODO(developer): uncomment and use the following line in order to
  // manually set the path to the service account JSON file instead of
  // using the value from the GOOGLE_APPLICATION_CREDENTIALS environment
  // variable.
  // const analyticsAdminClient = new analyticsAdmin.AnalyticsAdminServiceClient(
  //   {keyFilename: "your_key_json_file_path"});
  const analyticsAdminClient = new analyticsAdmin.AnalyticsAdminServiceClient();
  const [accounts] = await analyticsAdminClient.listAccounts();
  console.log('Accounts:');
  accounts.forEach(account => {
    console.log(account);
  });
}
I am building a service which allows users to use their own account to access their own data, so using a service account is not appropriate.
I initially thought I might be able to use google-api-nodejs-client -- auth would be handled by building a URL to redirect to and doing the OAuth dance...
Using google-api-nodejs-client:
const {google} = require('googleapis');

const oauth2Client = new google.auth.OAuth2(
  YOUR_CLIENT_ID,
  YOUR_CLIENT_SECRET,
  YOUR_REDIRECT_URL
);

// generate a url that asks permissions for Google Analytics scopes
const scopes = [
  "https://www.googleapis.com/auth/analytics", // View and manage your Google Analytics data
  "https://www.googleapis.com/auth/analytics.readonly", // View your Google Analytics data
];

const url = oauth2Client.generateAuthUrl({
  access_type: 'offline',
  scope: scopes
});
// redirect to `url` in a popup for the oauth dance
After auth, Google redirects to GET /oauthcallback?code={authorizationCode}, so we collect the code and get the token to perform subsequent OAuth2 enabled calls:
// This will provide an object with the access_token and refresh_token.
// Save these somewhere safe so they can be used at a later time.
const {tokens} = await oauth2Client.getToken(code)
oauth2Client.setCredentials(tokens);
// of course we need to handle the refresh token too
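As a rough sketch of how those pieces could sit in an Express callback route (the route path matches the redirect described above; saveTokens is a placeholder for your own storage):
const express = require('express');
const app = express();

// Persist refreshed tokens whenever the client emits them.
oauth2Client.on('tokens', (tokens) => saveTokens(tokens)); // saveTokens is hypothetical

app.get('/oauthcallback', async (req, res) => {
  const {tokens} = await oauth2Client.getToken(req.query.code);
  oauth2Client.setCredentials(tokens);
  await saveTokens(tokens);
  res.redirect('/'); // back into the app; a popup could instead just close itself
});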
This all works fine, but is it possible to plug the OAuth2 client from the google-api-nodejs-client code into the google-analytics-admin code?
👉 It looks like I need to somehow call analyticsAdmin.AnalyticsAdminServiceClient() with the access token I've already retrieved - but how?
The simple answer here is: don't bother with the Node.js libraries for Google Analytics Admin & Google Analytics Data.
Cut out the middleman and build a very simple wrapper yourself that queries the REST APIs directly. Then you will have visibility over the whole process, and any errors made will be your own.
Provided you handle the refresh token correctly, this is likely all you need:
// Node 18+ has a global fetch; on older versions pull it in from the node-fetch package.
const getResponse = async (url, accessToken, options = {}) => {
  const response = await fetch(url, {
    ...options,
    headers: {
      Authorization: `Bearer ${accessToken}`,
    },
  });
  return response;
};
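For example, listing the user's GA4 accounts with that wrapper might look like the following (the v1beta accounts endpoint is taken from the Analytics Admin REST reference, so double-check the API version you target):
const listAccounts = async (accessToken) => {
  const response = await getResponse(
    'https://analyticsadmin.googleapis.com/v1beta/accounts',
    accessToken
  );
  const {accounts = []} = await response.json();
  return accounts;
};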
I use Python but the method could be similar. You should create a Credentials object based on the obtained token:
from google.oauth2.credentials import Credentials
credentials = Credentials(token=YOUR_TOKEN)
Then use it to create the client:
from google.analytics.admin import AnalyticsAdminServiceClient
client = AnalyticsAdminServiceClient(credentials=credentials)
client.list_account_summaries()

UI testing using Cypress with authentication to Azure AD using ADFS

These are my notes on how to UI test an Azure AD single-page app using MSAL.js and ADFS (in our case on-premises), covering the schema associated with the process of token creation and local storage.
From the tutorial: "It uses the ROPC authentication flow to acquire tokens for a test user account, and injects them into browser local storage before running the tests. This way MSAL.js does not attempt to acquire tokens as it already has them in cache."
After watching the awesome video here:
https://www.youtube.com/watch?v=OZh5RmCztrU
...and going through the repo here:
https://github.com/juunas11/AzureAdUiTestAutomation
I was stuck trying to match my use of on-premises ADFS with MSAL.js 2.0 and session storage with that of the above tutorial and code. So if you are using the Azure endpoint ending with /adfs/oauth2/token (as opposed to the OAuth /oauth2/v2.0/token endpoint), then follow the below!!
MOST of the changes I made were from auth.js: https://github.com/juunas11/AzureAdUiTestAutomation/blob/main/UiTestAutomation.Cypress/cypress/support/auth.js
Simply follow the tutorial and copy in that content, then change the following:
const environment = ''; (mine was corporate domain NOT login.windows.net)
for the Account entity (const buildAccountEntity) use:
authorityType: 'ADFS',
...and REMOVE the line: clientInfo: "",
for the Access Token entity (const buildAccessTokenEntity):
...ADD the line: tokenType: 'bearer',
ADD a new function for the Refresh Token (new) entity:
const buildRefreshTokenEntity = (homeAccountId: string, accessToken: string) => {
  return {
    clientId,
    credentialType: 'RefreshToken',
    environment,
    homeAccountId,
    secret: accessToken,
  };
};
Next, I had to MATCH my sessionStorage TOKEN by running the app locally in VS Code, logging in, and reverse-engineering the required KEY-VALUE pairs that were stored (results are in the next code block!).
Specifically: I kept the case-sensitivity of the 'home account' id, blanked out some values, added the RefreshToken part, used Session Storage (not Local Storage), and matched the extended expiry with the same value as expires_in (based on my sample run-through only):
const injectTokens = (tokenResponse: any) => {
  const scopes = ['profile', 'openid'];
  const idToken: JwtPayload = decode(tokenResponse.id_token) as JwtPayload;
  const localAccountId = idToken.sub; // in /oauth2/v2.0/token this would be: idToken.oid || idToken.sid; however we are using /adfs/oauth2/token
  const realm = ''; // in /oauth2/v2.0/token this would be: idToken.tid; however we are using /adfs/oauth2/token
  const homeAccountId = `${localAccountId}`; // .${realm}`;
  const homeAccountIdLowerCase = `${localAccountId}`.toLowerCase(); // .${realm}`;
  const usernameFromToken = idToken.upn; // in /oauth2/v2.0/token this would be: idToken.preferred_username; however we are using /adfs/oauth2/token
  const name = ''; // in /oauth2/v2.0/token this would be: idToken.name; however we are using /adfs/oauth2/token
  const idTokenClaims = JSON.stringify(idToken);

  const accountKey = `${homeAccountIdLowerCase}-${environment}-${realm}`;
  const accountEntity = buildAccountEntity(homeAccountId, realm, localAccountId, idTokenClaims, usernameFromToken, name);

  const idTokenKey = `${homeAccountIdLowerCase}-${environment}-idtoken-${clientId}-${realm}-`;
  const idTokenEntity = buildIdTokenEntity(homeAccountId, tokenResponse.id_token, realm);

  const accessTokenKey = `${homeAccountIdLowerCase}-${environment}-accesstoken-${clientId}-${realm}-${scopes.join(' ')}`;
  const accessTokenEntity = buildAccessTokenEntity(
    homeAccountId,
    tokenResponse.access_token,
    tokenResponse.expires_in,
    tokenResponse.expires_in, // ext_expires_in
    realm,
    scopes,
  );

  const refreshTokenKey = `${homeAccountIdLowerCase}-${environment}-refreshtoken-${clientId}-${realm}`;
  const refreshTokenEntity = buildRefreshTokenEntity(homeAccountId, tokenResponse.access_token);

  // localStorage was not working, needs to be in sessionStorage
  sessionStorage.setItem(accountKey, JSON.stringify(accountEntity));
  sessionStorage.setItem(idTokenKey, JSON.stringify(idTokenEntity));
  sessionStorage.setItem(accessTokenKey, JSON.stringify(accessTokenEntity));
  sessionStorage.setItem(refreshTokenKey, JSON.stringify(refreshTokenEntity));
};
Lastly, in the login function I used the /adfs endpoint, since we use on-premises ADFS and MSAL.js v2.0, and did NOT need the client_secret:
export const login = (cachedTokenResponse: any) => {
  let tokenResponse: any = null;
  let chainable: Cypress.Chainable = cy.visit('/'); // need to visit root to be able to store Storage against this site
  if (!cachedTokenResponse) {
    chainable = chainable.request({
      url: authority + '/adfs/oauth2/token', // was this '/oauth2/v2.0/token',
      method: 'POST',
      body: {
        grant_type: 'password',
        client_id: clientId,
        // client_secret: clientSecret,
        scope: ['profile openid'].concat(apiScopes).join(' '),
        username,
        password,
      },
      form: true,
    });
***... MORE CODE OMITTED***
Finally, I ran it from the VS Code terminal: terminal 1 (yarn start), then terminal 2 (yarn run cypress open).
For TYPESCRIPT use:
rename all files from .js to .ts
update tsconfig to include the cypress type on this line:
"types": ["node", "cypress"],
Now when I run Cypress I can navigate around my site and I am authenticated!! Hope this helped you save an hour or two!!

Google Calendar API and Service Account permission error

I'm trying to integrate the Google Calendar API into my app.
So far I've managed to do this:
Created a new project on Cloud Platform
Enabled Calendar API
Added a new service account with role: Owner
Generated jwt.json
Granted domain-wide delegation for that service account
Shared a calendar with that service account (modify rights)
Enabled in GSuite the option for everyone outside the organisation to modify the events
Now, my code on node.js looks like this:
const { JWT } = require('google-auth-library');
const client = new JWT(
  keys.client_email,
  null,
  keys.private_key,
  ['https://www.googleapis.com/auth/calendar']
);
const url = `https://dns.googleapis.com/dns/v1/projects/${keys.project_id}`;
const rest = await client.request({url});
console.log(rest);
The error I get is:
Sending 500 ("Server Error") response:
Error: Insufficient Permission
Does anyone have any idea? This is getting frustrating.
How about this modification?
I think that in your script, the endpoint and/or scope might not be correct.
Pattern 1:
In this pattern, your endpoint of https://dns.googleapis.com/dns/v1/projects/${keys.project_id} is used.
Modified script:
const { JWT } = require("google-auth-library");
const keys = require("###"); // Please set the filename of credential file of the service account.

async function main() {
  const calendarId = "ip15lduoirvpitbgc4ppm777ag@group.calendar.google.com";
  const client = new JWT(keys.client_email, null, keys.private_key, [
    'https://www.googleapis.com/auth/cloud-platform' // <--- Modified
  ]);
  const url = `https://dns.googleapis.com/dns/v1/projects/${keys.project_id}`;
  const res = await client.request({ url });
  console.log(res.data);
}

main().catch(console.error);
In this case, it is required to enable the Cloud DNS API in the API console. It is also a paid API, so please be careful with this.
I think that the reason for your error message of Insufficient Permission might be this.
Pattern 2:
In this pattern, as a sample situation, the event list is retrieved from the calendar shared with the service account. If the calendar can be used by the service account, the event list is returned. This way, I think you can confirm whether the script works.
Modified script:
const { JWT } = require("google-auth-library");
const keys = require("###"); // Please set the filename of credential file of the service account.

async function main() {
  const calendarId = "###"; // Please set the calendar ID.
  const client = new JWT(keys.client_email, null, keys.private_key, [
    "https://www.googleapis.com/auth/calendar"
  ]);
  const url = `https://www.googleapis.com/calendar/v3/calendars/${calendarId}/events`; // <--- Modified
  const res = await client.request({ url });
  console.log(res.data);
}

main().catch(console.error);
Note:
This modified script supposes that you are using the latest version of google-auth-library-nodejs.
Reference:
JSON Web Tokens in google-auth-library-nodejs
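One extra note: since domain-wide delegation was granted, the service account can also impersonate a specific GSuite user by passing a subject when constructing the JWT client. A sketch (the email address is a placeholder):
const { JWT } = require("google-auth-library");
const keys = require("###"); // credential file of the service account

const delegatedClient = new JWT({
  email: keys.client_email,
  key: keys.private_key,
  scopes: ["https://www.googleapis.com/auth/calendar"],
  subject: "user@yourdomain.com", // the user to impersonate via domain-wide delegation
});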

Authentication Flow: How to use API

This flow is confusing me a bit and I would appreciate any help/diagram/flow-chart that could help me understand this better:
As an example, I would like to access Google's API. The thing is, what I want to access sits on an enterprise account, and to even get to use any of the Google Suite applications I have to log in to my work account (SSO). On top of that, all this needs to be done via VPN.
I've used Puppeteer for this in Node.js, and though it works on my machine, it stops working if I try to host it anywhere else, (I assume) because of the VPN issue. It's clunky and just plain hack-ish because I'm just automating what I normally do in the browser.
What are the best practices in being able to use Google's API? What does the algorithm look like?
You can use the 'googleapis' package on npm:
https://www.npmjs.com/package/googleapis
Here is an example...
const {google} = require('googleapis');
const bigquery = google.bigquery('v2');

async function main() {
  // This method looks for the GCLOUD_PROJECT and GOOGLE_APPLICATION_CREDENTIALS
  // environment variables.
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform']
  });
  const authClient = await auth.getClient();
  const projectId = await auth.getProjectId();
  const request = {
    projectId,
    datasetId: '<YOUR_DATASET_ID>',
    // This is a "request-level" option
    auth: authClient
  };
  const res = await bigquery.datasets.delete(request);
  console.log(res.data);
}

main().catch(console.error);
You can use a proxy by setting the HTTP_PROXY or HTTPS_PROXY environment variables (process.env.HTTP_PROXY / process.env.HTTPS_PROXY) to solve your VPN issue.
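For instance, setting the variables at the top of your entry file (or exporting them before starting Node) should be enough, since gaxios, the HTTP layer used by googleapis, honors them; the proxy URL below is a placeholder:
// Placeholder proxy URL - point this at your corporate proxy.
process.env.HTTPS_PROXY = 'http://proxy.internal.example:8080';
process.env.HTTP_PROXY = 'http://proxy.internal.example:8080';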

Deprecation of Google plus

I got an email from Google saying that all Google+ APIs are being shut down. I currently use googleAPI.google.plus to sign people in with Google. Is this plugin going to add an update to support the new way of authorizing users with Google?
Environment details:
OS: Mac OS X
Node.js version: v10.8.0
npm version: v6.5.0
googleapis version: 33
const googleAPI = require('googleapis');
const plus = googleAPI.google.plus({
  version: 'v1',
  auth: configs.googleAPIKey // specify your API key here
});

// Check if Google tokens are valid
plus.people.get({
  userId: googleUserId,
  fields: 'displayName,emails,name,image',
  access_token: googleAccessToken
})
.then((user) => {
  logger.debug('From google: ' + util.inspect(user.data));
  logger.debug('First Name: ' + user.data.name.givenName);
  logger.debug('Last Name: ' + user.data.name.familyName);
});
You don't show how you're using that object to do sign-in, so it is a little difficult to answer.
However, the googleapis package already supports sign-ins with an OAuth2 client that you can create with something like
const {google} = require('googleapis');

const oauth2Client = new google.auth.OAuth2(
  YOUR_CLIENT_ID,
  YOUR_CLIENT_SECRET,
  YOUR_REDIRECT_URL
);
You can then get a URL to redirect them to so they can sign in, with something like
const url = oauth2Client.generateAuthUrl({
  // If you only need one scope you can pass it as a string
  scope: scopes
});
and then redirect them to url. Once they have signed in at that URL, they'll be redirected to the URL you have specified as YOUR_REDIRECT_URL, which will include a parameter called code. You'll need this code to exchange it for credentials, including the auth token:
const {tokens} = await oauth2Client.getToken(code)
oauth2Client.setCredentials(tokens);
If you just need to use an API Key (which is what your example hints at), then you should just need to include the key the same way you do now for the API calls that you need to make. But that isn't related to authorization.
Since it looks like you want to get profile information, you can use something like userinfo or the People API and choose which fields you want for the user.
Using userinfo might look something like
// the userinfo endpoint lives on google.oauth2, authorized with the client from above
const oauth2 = google.oauth2({version: 'v2', auth: oauth2Client});
oauth2.userinfo.get().then(profile => {
  // Handle profile info here
});
The people.get method gives you a little more control, and might look something like
const people = google.people({
  version: "v1",
  auth: oauth2Client // use the authorized OAuth2 client from above
});

const fields = [
  "names",
  "emailAddresses",
  "photos"
];

people.people.get({
  resourceName: "people/me",
  personFields: fields.join(',')
})
.then( user => {
  // Handle the user results here
});
