Changing Wit.ai Default Max Steps - node.js

For some reason I'm unable to increase the default max steps for my chat bot.
This number now seems to be defined in lib/config.js rather than in lib/wit.js, where it used to be. No matter what I change the DEFAULT_MAX_STEPS constant to in my config file, my bot hits the same limit (5) and throws the 'Max steps reached, stopping' error in my log whenever I want it to send a few responses or execute a few actions in a row.
I've tried linking the config file to my index.js file the same way the example project links to the wit.js and log.js files in the module, via node-wit/lib, but I'm assuming I'm not referencing the config.js file properly...
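Something along these lines (a reconstruction; the paths and file contents are approximate):
// node_modules/node-wit/lib/config.js (assumed -- the real file has more entries)
module.exports = {
  DEFAULT_MAX_STEPS: 25, // bumped from the default of 5
};

// index.js -- linked the same way the example requires wit.js and log.js
const config = require('node-wit/lib/config');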

I'll write out example steps for using node-wit:
1) create an app folder, go into it, and run: npm init
2) run npm i --save node-wit
3) app.js:
const {Wit, log} = require('node-wit');
const client = new Wit({accessToken: 'MY_TOKEN'});
4) from the documentation:
runActions
A higher-level method to the Wit converse API. runActions resets the
last turn on new messages and errors.
Takes the following parameters:
sessionId - a unique identifier describing the user session
message - the text received from the user
context - the object representing the session state
maxSteps - (optional) the maximum number of actions to execute (defaults to 5)
so I'll add a MAX_STEPS constant to that example:
const MAX_STEPS = 25;
const sessionId = 'some-session-id';
const context0 = {};
client
  .runActions(sessionId, 'events nearby', context0, MAX_STEPS)
  .then((context1) => {
    return client.runActions(sessionId, 'how about in London?', context1, MAX_STEPS - 1);
  })
  .then((context2) => {
    console.log('The session state is now: ' + JSON.stringify(context2));
  })
  .catch((e) => {
    console.log('Oops! Got an error: ' + e);
  });

Related

Node.js cannot use import statement outside a module

I'm working on an authentication system and defaulted to jsrsasign. I started writing a Node server and everything was alright: it opened a localhost that connected to another site and worked just fine. Then I went to work on what mattered, the encoding, and the code below worked as plain JS:
import KJUR from "./jsrsasign";

// inside the request handler:
else if (pathName === '/signature') {
  res.end("we code here");
  function generateToken(sdkKey, sdkSecret, topic, password, Key, userIdentity, role = 1) {
    const iat = Math.round(new Date().getTime() / 1000);
    const exp = iat + 60 * 60 * 2;
    const oHeader = {alg: 'HS256', typ: 'JWT'};
    const oPayload = {
      app_key: sdkKey,
      iat: iat,
      exp: exp,
      version: 1,
      tpc: topic,
      pwd: password,
      user_identity: userIdentity,
      key: Key,
      role_type: role
    };
    const sHeader = JSON.stringify(oHeader);
    const sPayload = JSON.stringify(oPayload);
    const signature = KJUR.jws.JWS.sign('HS256', sHeader, sPayload, sdkSecret);
    return signature;
  }
  const Token = generateToken(sdkKey, sdkSecret, topic, password, sessionKey);
  console.log(Token);
}
(If I comment this out, the rest of the Node code works just fine.) So I copied it into the Node server and got: SyntaxError: Cannot use import statement outside a module
I searched online and found out I needed a package.json with {"type": "module"}, so I went to my terminal, ran npm install, got the packages, and added the little type entry. But then Node couldn't find the file I'm trying to run, called queue.js. I figured I had installed the packages in the wrong folder, so I deleted them and installed them one level up, but this time I only got one file, package-lock.json; the package.json went on vacation or something. I got it back from the recycling bin and moved it manually, and still no queue.js file could be found, so I'm kind of out of ideas.
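For reference, the minimal package.json for that fix is just the one entry, placed in the same folder as the script being run (assuming that's where queue.js lives):
{
  "type": "module"
}
and then run node queue.js from that folder.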

Persistently receiving 14 UNAVAILABLE: Stream refused by server. Create BigTable client per-request?

It's happening daily, and only on our save operation:
// The record is this small and `myValue` is a string < 32 characters in length.
const rowToInsert = {
  data: {
    myKey: { value: `${params.myValue}` },
  },
};
await table.row(rowId).save(rowToInsert, gaxOptions);
Which leads me to believe that something is clearly wrong, and it's probably not with BigTable. Using Node.js, with the client provided by @google-cloud/bigtable, the operations are invoked with the following GAX options:
import { status } from '@grpc/grpc-js';
import { CallOptions, createRetryOptions, createBackoffSettings } from 'google-gax';

// https://cloud.google.com/bigtable/docs/status-codes
const retries = 4;
const timeout = 1000;
const retryCodes = [
  status.CANCELLED,
  status.UNKNOWN,
  status.DEADLINE_EXCEEDED,
  status.FAILED_PRECONDITION,
  status.ABORTED,
  status.INTERNAL,
  status.UNAVAILABLE,
  status.DATA_LOSS,
];

const initialRetryDelayMillis = 100;
const retryDelayMultiplier = 1.3;
const maxRetryDelayMillis = 1000;
const initialRpcTimeoutMillis = null;
const rpcTimeoutMultiplier = null;
const maxRpcTimeoutMillis = null;
const totalTimeoutMillis = timeout;

const backoffSettings = createBackoffSettings(
  initialRetryDelayMillis,
  retryDelayMultiplier,
  maxRetryDelayMillis,
  initialRpcTimeoutMillis,
  rpcTimeoutMultiplier,
  maxRpcTimeoutMillis,
  totalTimeoutMillis,
);

const options: CallOptions = {
  // https://github.com/googleapis/gax-nodejs/blob/889730c1548a6dcc0b082a24c59a9278dd2296f6/src/gax.ts#L158-L159
  // ignored when using retry
  // timeout: ###
  maxRetries: retries,
  // https://github.com/googleapis/gax-nodejs/blob/889730c1548a6dcc0b082a24c59a9278dd2296f6/src/gax.ts#L349-L351
  retry: createRetryOptions(retryCodes, backoffSettings),
};
I would assume 4 retries should be plenty for any instability, especially with the backoff. Reviewing the monitoring of my BigTable instance, it is not remotely under any strenuous load.
This leads me to the idea that maybe it's because I am initializing my client as a singleton. Maybe I should initialize the client per request instead? Maybe the client is connecting and then something is timing out, so the next request fails?
So: should a BigTable client be created per request? Is anything else suspect in the GAX options above?
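For reference, this is roughly how the singleton client is set up (a sketch; the instance and table names are placeholders):
import { Bigtable } from '@google-cloud/bigtable';

// created once at module load and reused by every request handler
const bigtable = new Bigtable();
const table = bigtable.instance('my-instance').table('my-table');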
The examples:
https://github.com/googleapis/nodejs-bigtable/blob/master/samples/tableadmin.js
seem to show making a new client per action, but I don't know if that's just for example purposes. Same with the example on the npm page:
https://www.npmjs.com/package/@google-cloud/bigtable/v/0.16.0#using-the-client-library
I have found the same error on this GitHub issue. The reported issue is for the Firestore API, but if you continue reading, they say that the main issue is with the @grpc/grpc-js library.
The proposed workaround is to update the library and try again; just keep in mind that the issue is still open and the GCP engineers are still working on it.
Also, you can try opening a similar issue on the BigTable API GitHub issue page.
Just as a quick refresher, you can update it with the command
npm update -g @grpc/grpc-js
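Side note: npm update -g targets a globally installed copy; for a dependency inside your project you would presumably run the update without -g, and you can verify which version actually resolved:
npm update @grpc/grpc-js
npm ls @grpc/grpc-js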

How to set the API key in Google Translate Node.js code

I'm trying to create a Node.js script that uses the Google Translate API.
I got the code below from the Google docs (https://cloud.google.com/translate/docs/translating-text),
but when I run it, it says "Error: The request is missing a valid API key."
I have the key, but I don't know how and where to set it.
async function translate() {
  // Imports the Google Cloud client library
  const { Translate } = require('@google-cloud/translate');

  // Creates a client
  const translate = new Translate();

  /**
   * TODO(developer): Uncomment the following lines before running the sample.
   */
  const text = 'Hello, world!';
  const target = 'ru';

  // Translates the text into the target language. "text" can be a string for
  // translating a single piece of text, or an array of strings for translating
  // multiple texts.
  let [translations] = await translate.translate(text, target);
  translations = Array.isArray(translations) ? translations : [translations];
  console.log('Translations:');
  translations.forEach((translation, i) => {
    console.log(`${text[i]} => (${target}) ${translation}`);
  });
}
translate();
This page on setting up authentication explains that you need to download a credentials file from the Create service account key page. This can then be added to your path (.bashrc) as follows:
export GOOGLE_APPLICATION_CREDENTIALS="[PATH]"
Alternatively, you could add the line above to a .env file in your project root and source it when you run the application:
. ./.env
npm start
or
sh -ac '. ./.env; npm start'
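For example, the .env file could contain just that one line (the path here is a made-up placeholder):
export GOOGLE_APPLICATION_CREDENTIALS="/home/me/keys/service-account.json"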
Check out this Google authentication page to add the key:
1) In the GCP Console, go to the Create service account key page.
2) From the Service account list, select New service account.
3) In the Service account name field, enter a name.
4) From the Role list, select Project > Owner.
5) Click Create. A JSON file that contains your key downloads to your computer.
and then:
export GOOGLE_APPLICATION_CREDENTIALS="[PATH to key downloaded]"
Create an API key (see the documentation: create api key doc) and use it like:

import { v2 } from '@google-cloud/translate';

const translateClient = new v2.Translate({
  projectId: 'your-projectId-here',
  key: 'your-api-key-here',
});

I haven't checked it for v3, but I see the same interface:
new v3.TranslationServiceClient({
  key: "may be works",
  projectId: "may be works"
})
Try this... no environment variables. And please... add this file to your .gitignore:

const credentials = require('./credentials.json');
...
const { Translate } = require('@google-cloud/translate').v2;
const translationApi = new Translate({
  projectId: 'your-project-id',
  credentials: credentials
});

Posting Message to Slack via NodeJS Lambda Function - Data Variable Not Rendering

I've got an AWS Lambda function implemented in Node that posts a message to a Slack channel. I've created a Slack app with the incoming webhook feature enabled, and I'm sending the message to the hook via HTTPS. The Lambda calls the following function to format the message:
function slackConvertFromSNSMessage(sns_event) {
  let slack_message;
  let slack_message_text;
  const slack_message_user = 'foo';
  const slack_use_markdown = true;
  const sns_message_raw = sns_event.Records[0].Sns.Message;
  const sns_message_date_epoc = new Date(sns_message_raw.StateChangeTime).getTime();
  slack_message_text = `
*Alert:* One or more errors were reported by ${sns_message_raw.AlarmName}
*Date:* <!date^${sns_message_date_epoc}^{date_num} at {time_secs}^|${sns_message_raw.StateChangeTime}>
*Region:* ${sns_message_raw.Region}
*Queue:* ${sns_message_raw.Trigger.Dimensions[0].value}
`;
  // "*bold* `code` _italic_ ~strike~"
  slack_message = {
    text: slack_message_text,
    username: slack_message_user,
    mrkdwn: slack_use_markdown,
  };
  return JSON.stringify(slack_message);
}
In Slack, the date variable isn't rendering: I just see the literal <!date...> token I'm passing to the Slack API, where I expect to see the supplied date formatted to the user's local time zone.
UPDATE: There was a good comment noticing the caret before the pipe in the declaration. I removed it. I'm still getting the problem, but that line in the code now looks like the following:
*Date:* <!date^${sns_message_date_epoc}^{date_num} at {time_secs}|${sns_message_raw.StateChangeTime}>
If you do not specify an optional_link, you have to remove the last ^ delimiter, i.e. the ^ right before the |. Their documentation doesn't seem to specify this, but I can reproduce the problem in the Message Builder.
Edit: Also, Slack expects an epoch in seconds, while getTime() returns an epoch in milliseconds.
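Putting both fixes together, the date handling in the function above would look something like this:
// epoch in seconds, not milliseconds
const sns_message_date_epoc = Math.round(new Date(sns_message_raw.StateChangeTime).getTime() / 1000);
// no ^ before the | when there is no optional_link
`*Date:* <!date^${sns_message_date_epoc}^{date_num} at {time_secs}|${sns_message_raw.StateChangeTime}>`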

Copy object from one node to another in Cloud Functions for Firebase

I'm using Cloud Functions for Firebase, and I'm stuck on what seems to be a very basic operation.
When someone adds a post, it gets written to /posts/. I want a portion of that post to be saved under a different node, called public-posts or private-posts, using the same key as the initial post.
My code looks like this:
const functions = require('firebase-functions');

exports.copyPost = functions.database
  .ref('/posts/{pushId}')
  .onWrite(event => {
    const post = event.data.val();
    const smallPost = (({ name, descr }) => ({ name, descr }))(post);
    if (post.isPublic) {
      return functions.database.ref('/public-posts/' + event.params.pushId)
        .set(smallPost);
    } else {
      return functions.database.ref('/private-posts/' + event.params.pushId)
        .set(smallPost);
    }
  });
The error message I get is: functions.database.ref(...).set is not a function.
What am I doing wrong?
If you want to make changes to the database in a database trigger, you either have to use the Admin SDK or find a reference to the relevant node through the reference you're given in the event. (You can't use functions.database to find a reference; that's only used for registering triggers.)
The easiest thing is probably to use event.data.ref (doc) to find a reference to the location you want to write:
const root = event.data.ref.root;
const pubPost = root.child('public-posts');
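Applied to the function in the question, that would look something like this sketch:
exports.copyPost = functions.database
  .ref('/posts/{pushId}')
  .onWrite(event => {
    const post = event.data.val();
    const smallPost = (({ name, descr }) => ({ name, descr }))(post);
    // write through the event's own reference instead of functions.database
    const root = event.data.ref.root;
    const target = post.isPublic ? 'public-posts' : 'private-posts';
    return root.child(target).child(event.params.pushId).set(smallPost);
  });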
