@azure/cosmos in Azure Function ConnectionStringSetting Error - node.js

I'm attempting to run the @azure/cosmos samples inside a Node.js Azure Function. When it connects to the database, it throws this error:
Executed 'Functions.store' (Failed, Id=a6df6cfb-ae78-4a0b-ae83-5d51efa9fc18) [10/7/2018 9:04:18 PM]
System.Private.CoreLib: Exception while executing function: Functions.store. Microsoft.Azure.WebJobs.Host: Unable to resolve the value for property 'CosmosDBAttribute.ConnectionStringSetting'. Make sure the setting exists and has a valid value.
It fails at await client.databases.createIfNotExists.
Has anyone gotten @azure/cosmos to connect inside the index.js of an Azure Function?
Thanks,
Donnie
const client = new CosmosClient({
  endpoint: endpoint,
  auth: { masterKey: masterKey }
});

async function init() {
  const database = await client.databases.createIfNotExists({
    id: databaseId
  });
  const container = await database.containers.createIfNotExists({
    id: containerId
  });
  return container;
}
Edit: added connection info
const connection = {
  endpoint: "https://pdf-documents.documents.azure.com:443/",
  primaryKey: "Gub9FZeIMXwz6Lakn..."
};

const cosmos = require("@azure/cosmos");
const CosmosClient = cosmos.CosmosClient;

const endpoint = connection.endpoint;
const masterKey = connection.primaryKey;
const databaseId = "pdfDocuments";
const containerId = "pdfdocuments";

const client = new CosmosClient({
  endpoint: endpoint,
  auth: { masterKey: masterKey }
});

Thanks to @Ealsur on Twitter for solving this for me! Even though the error occurred in the debugger right at the moment it tried to connect to the database, the error was actually related to another connection in the output binding of my function!
@azure/cosmos is working well inside an Azure Function.
Thanks again @Ealsur!
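For anyone hitting the same thing: the 'ConnectionStringSetting' error comes from the binding configuration in function.json, not from the @azure/cosmos code in index.js. A minimal sketch of a Cosmos DB output binding (the names here are hypothetical), where connectionStringSetting must name an app setting that actually exists in local.settings.json or in the Function App's configuration:

```json
{
  "bindings": [
    {
      "type": "cosmosDB",
      "direction": "out",
      "name": "outputDocument",
      "databaseName": "pdfDocuments",
      "collectionName": "pdfdocuments",
      "connectionStringSetting": "CosmosDBConnection"
    }
  ]
}
```

If no app setting named CosmosDBConnection exists, the runtime fails with exactly this "Unable to resolve the value for property 'CosmosDBAttribute.ConnectionStringSetting'" error, even when the @azure/cosmos client itself is configured correctly.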

Related

Unable to retrieve cosmosDB data using azure JavaScript function and Key Vault secret

I am using Azure Functions (JavaScript/Node) to query and retrieve data from Cosmos DB. That works fine. However, I haven't been successful at implementing Key Vault secrets to store the primary key for Cosmos DB. I get the error:
Executed 'Functions.getProjects' (Failed, Id=f319f320-af1c-4283-a8f4-43cc6becb3ca,
Duration=1289ms)
[6/7/2021 4:37:44 AM] System.Private.CoreLib: Exception while executing function:
Functions.getProjects. System.Private.CoreLib: Result: Failure
Exception: Error: Required Header authorization is missing. Ensure a valid Authorization token
is passed.
I have followed multiple tutorials on what I need to do to run the code in Azure as well as what I need to do to run the code locally in VS code. To run in Azure, I created my key vault and added the secret. I enabled system assigned managed identity on my function so that it creates a service principal. I then created an access policy in key vault that allows my function/service principal GET, LIST capabilities. I get the same error when testing the function in Azure as I do when I test locally.
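The access-policy step described above can also be done from the command line; a sketch with the Azure CLI (the function app, resource group, and vault names are placeholders):

```
# Look up the function's system-assigned identity, then grant it GET/LIST on secrets
principalId=$(az functionapp identity show \
  --name <my-function-app> --resource-group <my-rg> --query principalId -o tsv)
az keyvault set-policy --name <my-key-vault> \
  --object-id "$principalId" --secret-permissions get list
```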
My code: config.js - endpoint and key obscured for security
const config = {
  endpoint: "https://<mysiteonazure>.documents.azure.com:443/",
  key: "myreallylongkeyhiddenforsecurity",
  databaseId: "projectsDB",
  containerId: "projects",
  partitionKey: { kind: "Hash", paths: ["/category"] },
};
module.exports = config;
My code: index.js
const config = require("../sharedCode/config");
const { CosmosClient } = require("@azure/cosmos");
const { DefaultAzureCredential } = require("@azure/identity");
const { SecretClient } = require("@azure/keyvault-secrets");

// this value is specified in the local.settings.json file for local testing
const keyVaultName = process.env["KEY_VAULT_NAME"];
const keyVaultUri = `https://${keyVaultName}.vault.azure.net`;

// checks to see if local.settings.json has a value first, indicating local;
// otherwise uses managed identity, indicating Azure, since local.settings.json is not uploaded
const credential = new DefaultAzureCredential();
const secretClient = new SecretClient(keyVaultUri, credential);

module.exports = async function (context, req) {
  const endpoint = config.endpoint;
  const key = await secretClient.getSecret("cosmosProjectKey");
  const keyx = key.value;
  const client = new CosmosClient({ endpoint, keyx });
  const database = client.database(config.databaseId);
  const container = database.container(config.containerId);
  const querySpec = {
    query: "SELECT * from c",
  };
  let myprojects = [];
  const { resources: items } = await container.items
    .query(querySpec)
    .fetchAll();
  items.forEach((item) => {
    myprojects.push(`${item.id} - ${item.project}`);
  });
  context.res = {
    // status: 200, /* Defaults to 200 */
    body: items,
  };
};
As I mentioned, my code works when I hard-code the key in the config file (not the best JS coding). I've removed all the comments that show that the value of the key is returned from Key Vault. I also left out that I created another service principal, which I believe is used when I try to access the key vault when running the function locally.
Any help greatly appreciated.
Please change the following lines of code:
const key = await secretClient.getSecret("cosmosProjectKey");
const keyx = key.value;
to
const secretKey = await secretClient.getSecret("cosmosProjectKey");
const key = secretKey.value;
and create your CosmosClient using the following:
const client = new CosmosClient({ endpoint, key });
The other option would be to create your CosmosClient like this:
const client = new CosmosClient({ endpoint, key: keyx });
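The root cause is ES2015 shorthand property names: { endpoint, keyx } produces a property literally named keyx, so the CosmosClient options never contain a key property. A minimal sketch of the behaviour (the values are placeholders):

```javascript
const endpoint = "https://example.documents.azure.com:443/"; // placeholder
const keyx = "my-secret-value"; // placeholder

// Shorthand uses the variable name as the property name:
const wrong = { endpoint, keyx };       // has "endpoint" and "keyx" – no "key"!
const right = { endpoint, key: keyx };  // has "endpoint" and "key"

console.log("key" in wrong); // false
console.log("key" in right); // true
```

That is why the SDK complained about a missing Authorization header: it was given an options object without any key at all.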

Could not load the default credentials? (Node.js Google Compute Engine)

I am trying to create a new VM using the Node.js client libraries for GCP. I followed the link below:
https://googleapis.dev/nodejs/compute/latest/VM.html#create
and below is my code:
const Compute = require('@google-cloud/compute');
const {auth} = require('google-auth-library');
const compute = new Compute();

var cred = "<<<credential json content as string>>>";
auth.scopes = ['https://www.googleapis.com/auth/cloud-platform', 'https://www.googleapis.com/auth/compute'];
auth.jsonContent = JSON.parse(cred);

const config = {
  machineType: 'n1-standard-1',
  disks: [{
    boot: true,
    initializeParams: { sourceImage: '<<<image url>>>' }
  }],
  networkInterfaces: [{ network: 'global/networks/default' }],
  tags: [{ items: ['debian-server', 'http-server'] }],
  auth: auth,
};
async function main() {
  // [START gce_create_vm]
  async function createVM() {
    const zone = compute.zone('us-central1-c');
    const vm = zone.vm('vm-name');
    await vm.create(config).then(function(data) {
      const vm = data[0];
      const operation = data[1];
      const apiResponse = data[2];
    });
    console.log(vm);
    console.log('Virtual machine created!');
  }
  createVM().catch(function(err) {
    console.log(err);
  });
  // [END gce_create_vm]
}
main();
When I run this, the error I get is:
Error: Could not load the default credentials. Browse to https://cloud.google.com/docs/authentication/getting-started for more information.
at GoogleAuth.getApplicationDefaultAsync (D:\Click to deploy\src\c2dNodeGCP\node_modules\google-auth-library\build\src\auth\googleauth.js:155:19)
at processTicksAndRejections (internal/process/task_queues.js:97:5)
at async GoogleAuth.getClient (D:\Click to deploy\src\c2dNodeGCP\node_modules\google-auth-library\build\src\auth\googleauth.js:487:17)
at async GoogleAuth.authorizeRequest (D:\Click to deploy\src\c2dNodeGCP\node_modules\google-auth-library\build\src\auth\googleauth.js:528:24)
My scenario is to take the service account credentials from a string variable rather than from an environment variable or some other source.
I can see that it is trying to use the default credentials, which are not there in my case.
I was able to achieve this in Java, but here I am not able to do it. Any help will be appreciated.
In order to execute your local application using your own user credentials for API access temporarily, you can run:
gcloud auth application-default login
You have to install the Cloud SDK on your computer; that will enable you to run the command.
Then log in with your associated Google account and you will be ready.
You can check the following documentation to get more information.
Another option is to set GOOGLE_APPLICATION_CREDENTIALS to provide authentication credentials to your application code. It should point to a file that defines the credentials.
To get this file, please follow these steps:
Navigate to the APIs & Services → Credentials panel in the Cloud Console.
Select Create credentials, then select Service account key from the dropdown menu (note that an API key is just a string, not a JSON file, and will not work for this).
Choose or create a service account and download the key in JSON format. You might want to keep this key secure, unless it is a testing key that you intend to delete later.
Put the *.json file you just downloaded in a directory of your choosing.
This directory must be private (you can't let anyone get access to this), but accessible to your web server code.
You can write your own code to pass the service account key to the client library, or set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the downloaded JSON file.
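As a sketch (the path below is a placeholder for wherever you put the key), setting the variable on Linux/macOS looks like:

```shell
# Point the Google client libraries at the downloaded service-account key
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/keyfile.json"
# Any process started from this shell now inherits the variable
echo "$GOOGLE_APPLICATION_CREDENTIALS"
```

Note the variable only lives for the current shell session; put the export in your shell profile or the service's environment configuration to make it permanent.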
I have found the following code that explains how you can authenticate to Google Cloud Platform APIs using the Google Cloud Client Libraries.
/**
 * Demonstrates how to authenticate to Google Cloud Platform APIs using the
 * Google Cloud Client Libraries.
 */
'use strict';

const authCloudImplicit = async () => {
  // [START auth_cloud_implicit]
  // Imports the Google Cloud client library.
  const {Storage} = require('@google-cloud/storage');

  // Instantiates a client. If you don't specify credentials when constructing
  // the client, the client library will look for credentials in the
  // environment.
  const storage = new Storage();

  // Makes an authenticated API request.
  async function listBuckets() {
    try {
      const results = await storage.getBuckets();
      const [buckets] = results;
      console.log('Buckets:');
      buckets.forEach((bucket) => {
        console.log(bucket.name);
      });
    } catch (err) {
      console.error('ERROR:', err);
    }
  }
  listBuckets();
  // [END auth_cloud_implicit]
};

const authCloudExplicit = async ({projectId, keyFilename}) => {
  // [START auth_cloud_explicit]
  // Imports the Google Cloud client library.
  const {Storage} = require('@google-cloud/storage');

  // Instantiates a client. Explicitly use service account credentials by
  // specifying the private key file. All clients in google-cloud-node have this
  // helper, see https://github.com/GoogleCloudPlatform/google-cloud-node/blob/master/docs/authentication.md
  // const projectId = 'project-id'
  // const keyFilename = '/path/to/keyfile.json'
  const storage = new Storage({projectId, keyFilename});

  // Makes an authenticated API request.
  async function listBuckets() {
    try {
      const [buckets] = await storage.getBuckets();
      console.log('Buckets:');
      buckets.forEach((bucket) => {
        console.log(bucket.name);
      });
    } catch (err) {
      console.error('ERROR:', err);
    }
  }
  listBuckets();
  // [END auth_cloud_explicit]
};

const cli = require(`yargs`)
  .demand(1)
  .command(
    `auth-cloud-implicit`,
    `Loads credentials implicitly.`,
    {},
    authCloudImplicit
  )
  .command(
    `auth-cloud-explicit`,
    `Loads credentials explicitly.`,
    {
      projectId: {
        alias: 'p',
        default: process.env.GOOGLE_CLOUD_PROJECT,
      },
      keyFilename: {
        alias: 'k',
        default: process.env.GOOGLE_APPLICATION_CREDENTIALS,
      },
    },
    authCloudExplicit
  )
  .example(`node $0 implicit`, `Loads credentials implicitly.`)
  .example(`node $0 explicit`, `Loads credentials explicitly.`)
  .wrap(120)
  .recommendCommands()
  .epilogue(
    `For more information, see https://cloud.google.com/docs/authentication`
  )
  .help()
  .strict();

if (module === require.main) {
  cli.parse(process.argv.slice(2));
}
You can obtain more information about this in this link; you can also take a look at this other guide on Getting started with authentication.
Edit 1
To load your credentials from a local file you can use something like:
const Compute = require('@google-cloud/compute');
const compute = new Compute({
  projectId: 'your-project-id',
  keyFilename: '/path/to/keyfile.json'
});
You can check this link for more examples and information.
This other link contains another example that could be useful.
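Since the question asks for credentials from a string variable rather than a file, note that google-cloud clients also accept a credentials object directly, so you can parse the string yourself. A sketch under that assumption (the key string below is a hypothetical stand-in for the real service-account key file contents):

```javascript
// Hypothetical: the full service-account key file held in a string variable
const credJson = JSON.stringify({
  project_id: "my-project-id",
  client_email: "my-sa@my-project-id.iam.gserviceaccount.com",
  private_key: "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
});

// Parse the string into an object
const cred = JSON.parse(credJson);

// Pass the parsed fields via the `credentials` option instead of keyFilename:
// const Compute = require('@google-cloud/compute');
// const compute = new Compute({
//   projectId: cred.project_id,
//   credentials: {
//     client_email: cred.client_email,
//     private_key: cred.private_key,
//   },
// });

console.log(cred.client_email);
```

This avoids the application-default-credentials lookup entirely, which is what was failing in the question.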

NodeJS Googleapis Service Account authentication

I'm trying to perform authentication on Google APIs using a service account. I have a service account set up, with its credentials located at credentials.json. I try to access a private sheet, to which I added the email address of the service account with editing rights.
Here is the code I am using:
const {
  google
} = require('googleapis');
const fs = require('fs');

let scopes = ['https://www.googleapis.com/auth/spreadsheets'];
let credentials = require("./credentials.json");

const authClient = new google.auth.JWT(
  credentials.client_email,
  null,
  credentials.private_key,
  scopes);

authClient.authorize(function(err, tokens) {
  if (err) {
    console.log(err);
    return;
  } else {
    authClient.setCredentials(tokens);
  }
});

const sheets = google.sheets({
  version: 'v4',
  authClient
});

let spreadsheetId = //...
let range = //...

const request = {
  spreadsheetId: spreadsheetId,
  range: range
};

sheets.spreadsheets.values.get(request, function(err, response) {
  if (err) {
    console.log('The API returned an error: ' + err);
  } else {
    console.log('Result: ' + response);
  }
});
I guess the API changed over time, since many guides showed different approaches, and in the end none worked for me.
The error is as follows:
The API returned an error: Error: The request is missing a valid API key.
To my understanding, a simple API key should only be necessary for unauthenticated access to public sheets, so I don't get why it is even requiring one. If I add such an API key, I get the error:
The API returned an error: Error: The caller does not have permission
Using
$ npm list googleapis
`-- googleapis@52.1.0
Any help would be greatly appreciated.
For anyone still facing googleapis problems within the Node.js runtime in 2022:
Firstly, go to Google IAM Admin → Service Accounts and pick the current working project.
Secondly, click into the service account, which has the format project@sub-name-id.iam.gserviceaccount.com.
Thirdly, among [Details, Permissions, Keys, Metrics, Logs], go to Keys, then Add Key -> Create new key -> Key type: JSON, and save the JSON file to your computer.
Within the Node.js runtime, I use the following semantic version:
googleapis@100.0.0
You can create a JWT client and inject it into Google's default auth via google.options({auth: client});, or provide the auth client to a specific service, as in google.chat({version: 'v1', auth: client});
However, in the following example I create a GoogleAuth instance and then make an auth client from it, which results in the same behaviour as the JWT method.
/** Import Node native dependencies !*/
import * as path from "path";
/** Import ES6 default dependencies !*/
import {google} from "googleapis";

const {client_email, private_key} = require('$/keys/credentials.json');

/**
 ** @description - Google [[Service Account]] Authenticator.
 **/
const auth = new google.auth.GoogleAuth({
  keyFile: path.resolve('keys/credentials.json'),
  /** Scopes can be specified either as an array or as a single, space-delimited string; ~!*/
  scopes: [
    "https://www.googleapis.com/auth/chat.bot",
  ],
});

const client = new google.auth.JWT({
  email: client_email,
  key: private_key,
  /** Scopes can be specified either as an array or as a single, space-delimited string; ~!*/
  scopes: [
    "https://www.googleapis.com/auth/chat.bot",
  ],
});

(async () => {
  /** @description - Either [[Get Client]] from [Google Auth] or use directly from [JWT Client] ~!*/
  const client = await auth.getClient();
  /** @description - Use this authorized client as the default authentication, to fall back on from [Non-Authenticated Services] ~!*/
  google.options({auth: client});

  const chat = google.chat({
    version: 'v1',
    /** @description - Provide [Authenticated Services] to the [Google Chat Service] instance ~!*/
    auth: client,
  });

  const response = await chat.spaces.members.get({
    // Required. Resource name of the attachment, in the form "spaces/x/messages/x/attachments/x".
    name: 'spaces',
  });

  console.log('response', response.data);
  return void 0;
})();

Dialogflow NodeJS Knowledge IAM permission

I have been following the GitHub samples for creating, listing, and adding documents to Dialogflow's Knowledge Base. But using these Node.js samples, I am getting errors that authentication is required. And when I try to add some sessions (based on the regular session client from Dialogflow), I get 'IAM permission denied'.
How can I programmatically test these samples from my local Node.js environment?
The following code asks me to authenticate:
async function listKnowledgeBases(projectId) {
  // [START dialogflow_list_knowledge_base]
  // Imports the Dialogflow client library
  const dialogflow = require('dialogflow').v2beta1;

  // Instantiate a DialogFlow KnowledgeBasesClient.
  const client = new dialogflow.KnowledgeBasesClient({
    projectPath: projectId,
  });
  const formattedParent = client.projectPath(projectId);

  const [resources] = await client.listKnowledgeBases({
    parent: formattedParent,
  });
  resources.forEach(r => {
    console.log(`displayName: ${r.displayName}`);
    console.log(`name: ${r.name}`);
  });
  // [END dialogflow_list_knowledge_base]
}
Error
Error: Could not load the default credentials. Browse to https://cloud.google.com/docs/authentication/getting-started for more information.
at GoogleAuth.getApplicationDefaultAsync (/Users/c024323/Documents/Workspace/JSWorkspace/HelpCenterPOC/node_modules/google-gax/node_modules/google-auth-library/build/src/auth/googleauth.js:160:19)
at processTicksAndRejections (internal/process/task_queues.js:93:5)
at async GoogleAuth.getClient (/Users/c024323/Documents/Workspace/JSWorkspace/HelpCenterPOC/node_modules/google-gax/node_modules/google-auth-library/build/src/auth/googleauth.js:502:17)
at async GrpcClient._getCredentials (/Users/c024323/Documents/Workspace/JSWorkspace/HelpCenterPOC/node_modules/google-gax/build/src/grpc.js:92:24)
at async GrpcClient.createStub (/Users/c024323/Documents/Workspace/JSWorkspace/HelpCenterPOC/node_modules/google-gax/build/src/grpc.js:213:23)
The following code gives an 'IAM permission denied' error:
const createKnowledgeBase = async (projectId, displayName) => {
  // [START dialogflow_create_knowledge_base]
  // Imports the Dialogflow client library
  const dialogflow = require('dialogflow').v2beta1;
  const sessionId = require('uuid/v1')();

  let config = {
    credentials: {
      private_key: service_key.private_key,
      client_email: service_key.client_email
    }
  };

  // // Create a new session
  // const sessionClient = new dialogflow.SessionsClient(config);
  // const sessionPath = sessionClient.sessionPath(projectId, sessionId);

  // Instantiate a DialogFlow client.
  const client = new dialogflow.KnowledgeBasesClient(config);
  const formattedParent = client.projectPath(projectId);
  const knowledgeBase = {
    displayName: displayName,
  };
  const request = {
    parent: formattedParent,
    knowledgeBase: knowledgeBase,
  };

  const [result] = await client.createKnowledgeBase(request);
  console.log(`Name: ${result.name}`);
  console.log(`displayName: ${result.displayName}`);
  return result;
  // [END dialogflow_create_knowledge_base]
};
Error
{"code":7,"details":"IAM permission 'dialogflow.knowledgeBases.create' on 'projects/XXXXX' denied.","metadata":{"internalRepr":{},"options":{}}}
The issue was with the credential key: it was for a client role, not for an admin. Once I created an admin-role key, I was able to create the KB.
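If you would rather grant the existing service account the missing permission instead of minting a new key, a sketch with the gcloud CLI (project and service-account names are placeholders; roles/dialogflow.admin includes dialogflow.knowledgeBases.create):

```
gcloud projects add-iam-policy-binding my-project-id \
  --member="serviceAccount:my-sa@my-project-id.iam.gserviceaccount.com" \
  --role="roles/dialogflow.admin"
```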

Timeout error awaiting promise in Lambda?

I am testing a Serverless Lambda function and get a timeout error, which I believe is due to an await on a promise.
module.exports.create = async (event) => {
  const provider = event.requestContext.identity.cognitoAuthenticationProvider
  ....//stuff here where I split auth token to get ids...
  const cognito = new AWS.CognitoIdentityServiceProvider({
    apiVersion: "2016-04-18"
  });
  const getUserParams = {
    UserPoolId: userPoolId,
    Username: userPoolUserId
  };
  const data = JSON.parse(event.body)
  const getUser = await cognito.adminGetUser(getUserParams).promise()
  const params = {
    Item: {
      userId: event.requestContext.identity.cognitoIdentityId,
      email: getUser, //!timeout issue probably here!
      content: data
    }
  };
  try {
    const { Listing } = await connectToDatabase()
    const listing = await Listing.create({userId: params.Item.userId, email: params.Item.email
In researching a solution, I have come across people splitting the Lambda into two functions so that they collectively stay under the timeout. I do not know how to invoke a Lambda from within a Lambda, nor am I sure this is the correct approach.
You can change the timeout for the Lambda function. The default timeout for a Lambda function is 3 seconds; you can override it in the console below the function code, under Basic settings.
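Since the question mentions the Serverless framework, the timeout can also be raised in serverless.yml rather than the console; a sketch (the function and handler names are hypothetical):

```yaml
functions:
  create:
    handler: listings.create
    timeout: 30   # seconds; raw Lambda's default is 3, Serverless' default is 6
```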
For anyone googling this: it turns out adminGetUser needs a NAT gateway configured (when the Lambda runs inside a VPC) in order for it to be able to reach Cognito. I was getting a timeout error because it was never executing, period. Read here: https://aws.amazon.com/premiumsupport/knowledge-center/internet-access-lambda-function/.
