I am trying to create a new table in BigQuery. I have followed these instructions https://codelabs.developers.google.com/codelabs/cloud-bigquery-nodejs/index.html?index=..%2F..index#9 and have my user and roles defined properly.
I created a Node.js project, installed the Google Cloud dependencies, and have the following code:
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery({
  projectId: 'myproject-develop-3fcb6',
  private_key_id: "11111111111",
  client_email: "myuser-bigquery-sa@myproject-develop-3fcb6.iam.gserviceaccount.com",
  client_id: "212111112",
});
This is how I'm creating my dataset and table:
module.exports = {
  createTable: ({ datasetId, tableId, schema, partitionBy }) => {
    const options = { schema };
    if (partitionBy) {
      options.timePartitioning = {
        field: partitionBy
      };
    }
    return new Promise((resolve, reject) => {
      bigquery
        .dataset(datasetId)
        .createTable(tableId, options)
        .then(results => resolve(results[0]))
        .catch(err => {
          handleError(err);
          reject(err);
        });
    });
  },
};
When I run my createTable function and pass in the dataset name, table name, and schema, I get the following error immediately:
ERROR: Error: Could not load the default credentials. Browse to https://cloud.google.com/docs/authentication/getting-started for more information.
How do I pass my default credentials to BigQuery so I can perform CRUD operations in Node.js? Thanks.
In the tutorial that you mentioned, this gcloud command creates a key.json:
gcloud iam service-accounts keys create ~/key.json --iam-account my-bigquery-sa@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com
Then you can use the following code:
// Create a BigQuery client explicitly using service account credentials
// by specifying the private key file.
const {BigQuery} = require('@google-cloud/bigquery');

const options = {
  keyFilename: 'path/to/key.json',
  projectId: 'my_project',
};

const bigquery = new BigQuery(options);
Authenticating With a Service Account Key File
I do not know where you are running your code, but the tutorial includes a step where you set the environment variable, so you do not need to authenticate with the key.json file in your code:
export GOOGLE_APPLICATION_CREDENTIALS="/home/${USER}/key.json"
GCP client libraries use a strategy called Application Default Credentials (ADC) to find your application's credentials. When your code uses a client library, the strategy checks for your credentials in the following order:

First, ADC checks to see if the environment variable GOOGLE_APPLICATION_CREDENTIALS is set. If the variable is set, ADC uses the service account file that the variable points to. The next section describes how to set the environment variable.

If the environment variable isn't set, ADC uses the default service account that Compute Engine, Kubernetes Engine, Cloud Run, App Engine, and Cloud Functions provide, for applications that run on those services.

If ADC can't use either of the above credentials, an error occurs.
You can also pass credentials directly as parameters.
const {BigQuery} = require('@google-cloud/bigquery');

const bigQuery = new BigQuery({
  projectId: "your-project-id",
  credentials: {...}, // content of the JSON key file
});
Thanks to @MahendraPatel's comment.
Related
I've noticed that all the Node.js code samples for Google Analytics Admin and Google Analytics Data assume a service account and either a JSON file or a GOOGLE_APPLICATION_CREDENTIALS environment variable.
e.g.
const analyticsAdmin = require('@google-analytics/admin');

async function main() {
  // Instantiates a client using default credentials.
  // TODO(developer): uncomment and use the following line in order to
  // manually set the path to the service account JSON file instead of
  // using the value from the GOOGLE_APPLICATION_CREDENTIALS environment
  // variable.
  // const analyticsAdminClient = new analyticsAdmin.AnalyticsAdminServiceClient(
  //   {keyFilename: "your_key_json_file_path"});
  const analyticsAdminClient = new analyticsAdmin.AnalyticsAdminServiceClient();
  const [accounts] = await analyticsAdminClient.listAccounts();
  console.log('Accounts:');
  accounts.forEach(account => {
    console.log(account);
  });
}
I am building a service which allows users to use their own account to access their own data, so using a service account is not appropriate.
I initially thought I might be able to use google-api-nodejs-client: auth would be handled by building a URL, redirecting, and doing the OAuth dance...
Using google-api-nodejs-client:
const {google} = require('googleapis');

const oauth2Client = new google.auth.OAuth2(
  YOUR_CLIENT_ID,
  YOUR_CLIENT_SECRET,
  YOUR_REDIRECT_URL
);

// generate a url that asks permissions for Google Analytics scopes
const scopes = [
  "https://www.googleapis.com/auth/analytics",          // View and manage your Google Analytics data
  "https://www.googleapis.com/auth/analytics.readonly", // View your Google Analytics data
];

const url = oauth2Client.generateAuthUrl({
  access_type: 'offline',
  scope: scopes
});

// redirect to `url` in a popup for the oauth dance
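The URL that `generateAuthUrl` produces is a plain OAuth 2.0 authorization URL, so it can also be assembled by hand, which helps when debugging scope or redirect issues. A sketch using only the standard library (`buildAuthUrl` and all parameter values are placeholders, not library APIs):

```javascript
// Assemble a Google OAuth 2.0 authorization URL manually; equivalent in
// spirit to oauth2Client.generateAuthUrl above.
const buildAuthUrl = ({ clientId, redirectUri, scopes }) => {
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    response_type: 'code',
    access_type: 'offline',
    scope: scopes.join(' '), // scopes are space-delimited in OAuth 2.0
  });
  return `https://accounts.google.com/o/oauth2/v2/auth?${params.toString()}`;
};
```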
After auth, Google redirects to GET /oauthcallback?code={authorizationCode}, so we collect the code and get the token to perform subsequent OAuth2 enabled calls:
// This will provide an object with the access_token and refresh_token.
// Save these somewhere safe so they can be used at a later time.
const {tokens} = await oauth2Client.getToken(code);
oauth2Client.setCredentials(tokens);
// of course we need to handle the refresh token too
This all works fine, but is it possible to plug the OAuth2 client from the google-api-nodejs-client code into the @google-analytics/admin code?
👉 It looks like I need to somehow call analyticsAdmin.AnalyticsAdminServiceClient() with the access token I've already retrieved - but how?
The simple answer here is don't bother with the Node.js libraries for Google Analytics Admin & Google Analytics Data.
Cut out the middleman and build a very simple wrapper yourself which queries the REST APIs directly. Then you will have visibility on the whole of the process, and any errors made will be your own.
Provided you handle the refresh token correctly, this is likely all you need:
const getResponse = async (url, accessToken, options = {}) => {
  const response = await fetch(url, {
    ...options,
    headers: {
      ...options.headers, // merge rather than overwrite caller-supplied headers
      Authorization: `Bearer ${accessToken}`,
    },
  });
  return response;
};
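For instance, listing accounts becomes a single call to the Admin API's REST endpoint (the `v1beta/accounts` URL is taken from the public Google Analytics Admin API reference; `listAccounts` itself is a hypothetical wrapper for illustration):

```javascript
// List Google Analytics accounts directly via the Admin API REST
// endpoint, using a previously obtained OAuth 2.0 access token.
const listAccounts = async (accessToken) => {
  const response = await fetch('https://analyticsadmin.googleapis.com/v1beta/accounts', {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!response.ok) {
    throw new Error(`Admin API request failed: ${response.status}`);
  }
  return response.json(); // { accounts: [...] }
};
```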
I use Python but the method could be similar. You should create a Credentials object based on the obtained token:
from google.oauth2.credentials import Credentials

credentials = Credentials(token=YOUR_TOKEN)
Then use it to create the client:
from google.analytics.admin import AnalyticsAdminServiceClient
client = AnalyticsAdminServiceClient(credentials=credentials)
client.list_account_summaries()
I have a NodeJS application that uses Node-Config (https://www.npmjs.com/package/config) to load application configurations. What I'm trying to do is to load secrets from Azure Keyvault to the config during startup, and ensure these are available before required (e.g. connecting to databases etc).
I have no problem connecting to and retrieving values from the Key Vault, but I am struggling with the non-blocking nature of JS. The application startup process continues before the config values have finished loading (asynchronously) into the config.
One strategy could be to delay application launch to await the Key Vault secrets loading (see How to await in the main during start up in node?).
Another would be to not load them in config, but instead modify the code wherever secrets are used so they load asynchronously via promises.
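The first strategy can be sketched without any libraries; `loadSecret` here is a placeholder standing in for a real Key Vault call:

```javascript
// Placeholder standing in for a Key Vault lookup (hypothetical).
const loadSecret = async (name) => `secret-for-${name}`;

// Resolve every secret before anything else starts.
const loadConfig = async () => {
  const [username, password] = await Promise.all([
    loadSecret('DATABASE-USERNAME'),
    loadSecret('DATABASE-PASSWORD'),
  ]);
  return { database: { username, password } };
};

// Entry point: the server only starts once config is fully resolved.
loadConfig().then((config) => {
  // startServer(config);
});
```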
It seems like this will be a common problem, so I am hoping someone here can provide examples or a design pattern of the best way of ensuring remote keyvault secrets are loaded during startup.
Thanks in advance for suggestions.
Rod
I have now successfully resolved this question.
A key point to note is setting process.env['ALLOW_CONFIG_MUTATIONS'] = true;
Configs are immutable by default (they can't be changed after being set initially). Since the async values resolve later, it's critical that you adjust this setting; otherwise you will see the asynchronous configs obtaining correct values from the keystore, but when you check with config.get they will not have been set. This really should be added to the documentation at https://github.com/node-config/node-config/wiki/Asynchronous-Configurations
My solution: first, let's create a module for the Azure Key Vault client, azure-keyvault.mjs:
import { DefaultAzureCredential } from '@azure/identity';
import { SecretClient } from '@azure/keyvault-secrets';
// https://learn.microsoft.com/en-us/azure/developer/javascript/how-to/with-web-app/use-secret-environment-variables
if (
  !process.env.AZURE_TENANT_ID ||
  !process.env.AZURE_CLIENT_ID ||
  !process.env.AZURE_CLIENT_SECRET ||
  !process.env.KEY_VAULT_NAME
) {
  throw Error('azure-keyvault - required environment vars not configured');
}
const credential = new DefaultAzureCredential();
// Build the URL to reach your key vault
const url = `https://${process.env.KEY_VAULT_NAME}.vault.azure.net`;
// Create client to connect to service
const client = new SecretClient(url, credential);
export default client;
In the config (node-config) files:
process.env['ALLOW_CONFIG_MUTATIONS'] = true;
const asyncConfig = require('config/async').asyncConfig;
const defer = require('config/defer').deferConfig;
const debug = require('debug')('app:config:default');
// example usage debug(`\`CASSANDRA_HOSTS\` environment variable is ${databaseHosts}`);
async function getSecret(secretName) {
  const client = (await import('../azure/azure-keyvault.mjs')).default;
  const secret = await client.getSecret(secretName);
  // dev: debug(`Get Async config: ${secretName} : ${secret.value}`);
  return secret.value;
}
module.exports = {
  // note: defer just calculates this config at the end of config generation
  isProduction: defer(cfg => cfg.env === 'production'),
  database: {
    // use asyncConfig to obtain a promise for the secret
    username: asyncConfig(getSecret('DATABASE-USERNAME')),
    password: asyncConfig(getSecret('DATABASE-PASSWORD'))
  },
  ...
};
Finally, modify the application startup to resolve the async configs BEFORE config.get is called.
server.js
const { resolveAsyncConfigs } = require('config/async');
const config = require('config');
const P = require('bluebird');
...

function initServer() {
  return resolveAsyncConfigs(config).then(() => {
    // if you want to confirm the async configs have loaded,
    // try outputting one of them to the console at this point
    console.log('db username: ' + config.get("database.username"));
    // now proceed with any operations that require configs
    const client = require('./init/database.js');
    // continue with bootstrapping (whatever your code is)
    // in our case, let's proceed once the db is ready
    return client.promiseToBeReady().then(function () {
      return new P.Promise(_pBootstrap);
    });
  });
}
I hope this helps others wishing to use config/async with remote keystores such as Azure. Comments or improvements on above welcome.
~ Rod
I have this code in Next.js that is supposed to check if a token is valid, then sign in the user.
const firebaseAdmin = require("firebase-admin");
const serviceAccount = require('../secret.json');

export const verifyIdToken = async (token) => {
  if (!firebaseAdmin.apps.length) {
    firebaseAdmin.initializeApp({
      // credential: firebaseAdmin.credential.cert(serviceAccount),
      credential: firebaseAdmin.credential.applicationDefault(),
      databaseURL: "rtdb.firebaseio.com",
    });
  }
  return await firebaseAdmin
    .auth()
    .verifyIdToken(token)
    .catch((error) => {
      throw error;
    });
};
I have the Windows environment variables set as Firebase recommends, and I switched to using applicationDefault() since, as I understand it,
ADC can automatically find your credentials
The problem is that the application works only locally. When I deploy the website, the token is not verified and creates errors. I am serving the Next.js app through a Cloud Function. How can I solve this?
The error is
auth/invalid-credential
Must initialize app with a cert credential or set your Firebase project
ID as the GOOGLE_CLOUD_PROJECT environment variable to call verifyIdToken().
What the app is supposed to do is do a check server side to determine if a token is valid.
As below
export async function getServerSideProps(ctx) {
  try {
    const cookies = nookies.get(ctx);
    const token = await verifyIdToken(cookies.token);
    // the user is authenticated!
    const { uid, email } = token;
    return {
      props: {
        userData: {
          email: email,
          uid: uid,
        },
      },
    };
  } catch (err) {
    console.log(err.code);
    console.log(err.message);
    return { props: {} };
  }
}
The auth/invalid-credential error message means that the Admin SDK needs to be initialized, as we can see in the Official Documentation.
The credential used to authenticate the Admin SDKs cannot be used to
perform the desired action. Certain Authentication methods such as
createCustomToken() and verifyIdToken() require the SDK to be
initialized with a certificate credential as opposed to a refresh
token or Application Default credential.
And for the ID token verification, a project ID is required. The Firebase Admin SDK attempts to obtain a project ID via one of the following methods:
If the SDK was initialized with an explicit projectId app option, the SDK uses the value of that option.
If the SDK was initialized with service account credentials, the SDK uses the project_id field of the service account JSON object.
If the GOOGLE_CLOUD_PROJECT environment variable is set, the SDK uses its value as the project ID. This environment variable is available for code running on Google infrastructure such as App Engine and Compute Engine.
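The lookup order above can be sketched as a simple fallback chain (`resolveProjectId` is illustrative only, not an actual SDK function):

```javascript
// Mirror the Admin SDK's documented project ID lookup order:
// explicit app option -> service account JSON -> GOOGLE_CLOUD_PROJECT env var.
const resolveProjectId = (appOptions = {}, serviceAccount = {}) =>
  appOptions.projectId ||
  serviceAccount.project_id ||
  process.env.GOOGLE_CLOUD_PROJECT ||
  null;
```

If this chain ends in `null`, verifyIdToken() has no project ID to check tokens against, which is exactly the situation the error message describes.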
So, we can initialize the Admin SDK with a service account (fulfilling the second option); but the first thing to do is authenticate a service account and authorize it to access Firebase services, for which you must generate a private key file in JSON format.
To generate a private key file for your service account you can do the following:
In the Firebase console, open Settings > Service Accounts.
Click Generate New Private Key, then confirm by clicking Generate Key.
Securely store the JSON file containing the key.
Once you have your JSON file, you can set an environment variable to hold your private key.
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/service-account-file.json"
And then, use it in your code like this:
admin.initializeApp({
  credential: admin.credential.applicationDefault(),
  databaseURL: 'https://<DATABASE_NAME>.firebaseio.com'
});
In the end I downloaded the gcloud tool, and setting the GOOGLE_APPLICATION_CREDENTIALS environment variable through it worked. The function could then work with credential: firebaseAdmin.credential.applicationDefault(),
I have a Google Cloud Function that works well, and I want it, once executed and before the callback, to connect to Firestore and add a document to my Notifications collection.
const Firestore = require('@google-cloud/firestore');

const firestore = new Firestore({
  projectId: 'my-firebase-project',
  keyFilename: 'thekey.json',
});

var fsDocument = {
  'username': 'theuser',
  'organization': 'theorg',
  'attr': [
    {k: 'key1', v: 'val1'},
    {k: 'key2', v: 'val2'}
  ]
};

firestore.collection('Notifications').add(fsDocument).then(documentReference => {
  console.log('Added document with name ' + documentReference.id);
});
How can I include the key file in my Google Cloud Function? So far, I am creating the functions in console.cloud.google.com.
All files in your functions directory will be sent to Cloud Functions when you deploy. You could put your credentials in a file under functions, then refer to it with a relative path like this:
const firestore = new Firestore({
  projectId: 'my-firebase-project',
  keyFilename: './thekey.json',
});
You should only do this if your credentials are for a project different from the one running your Cloud Function. If you're trying to access Firestore in the same project as the one running your function, just use the default credentials using the Admin SDK. There are lots of examples of this in functions-samples.
For the security of my app I have been obliged to set up a small Node.js server for Firebase, but the problems are not over... the Firebase server SDK does not support the Storage APIs. I read that the gcloud Storage APIs are necessary for this, because Firebase uses the same underlying service.
On the server side it is important to get a file's custom metadata, because I must read and update it. I cannot find the appropriate functions to get a file's metadata. In the client SDK it is simple:
// Create a reference to the file whose metadata we want to retrieve
var forestRef = storageRef.child('images/forest.jpg');

// Get metadata properties
forestRef.getMetadata().then(function(metadata) {
  // Metadata now contains the metadata for 'images/forest.jpg'
}).catch(function(error) {
  // Uh-oh, an error occurred!
});
What function can I use in gcloud Storage? Thanks.
You'll want to use the getMetadata() method in gcloud:
var gcloud = require('gcloud');

// Initialize GCS
var gcs = gcloud.storage({
  projectId: 'my-project',
  keyFilename: '/path/to/keyfile.json'
});

// Reference an existing bucket
var bucket = gcs.bucket('foo.appspot.com');

// Reference to a file
var file = bucket.file('path/to/my/file');

// Get the file metadata
file.getMetadata(function(err, metadata, apiResponse) {
  if (err) {
    console.log(err);
  } else {
    console.log(metadata);
  }
});