How do I subscribe to events in PencilBlue?

I'm actually having trouble figuring out how to subscribe to events, or at least how to do so correctly. There doesn't seem to be much documentation on how to do so, so I took some cues from existing services.
Here's the code I'm working with:
module.exports = function(pb) {
    //pb dependencies
    var BaseObjectService = pb.BaseObjectService;
    var TYPE = 'page';

    function PageProxyService() {}

    PageProxyService.init = function(cb) {
        pb.log.debug('PageProxyService: Initialized');
        cb(null, true);
    };

    PageProxyService.handlePageSave = function(context, cb) {
        // I'm using console.log to make the message stand out more.
        // For production things, I use pb.log.debug :)
        console.log("===================================");
        console.log("I GOT A CALL");
        console.log("===================================");
        console.log(context);
        console.log("===================================");
        cb(null);
    };

    // Trying to subscribe to any of these seems to do nothing.
    BaseObjectService.on(TYPE + '.' + BaseObjectService.BEFORE_SAVE, PageProxyService.handlePageSave);
    BaseObjectService.on(TYPE + '.' + BaseObjectService.AFTER_SAVE, PageProxyService.handlePageSave);

    //exports
    return PageProxyService;
};
handlePageSave doesn't ever get called. What am I doing wrong?

The PageObjectService will fire events. However, as of 0.4.1, not all controllers have been converted over to leverage the service, as you discovered. A new controller, PageApiController, has been created to take the place of the existing controller. The UI will eventually (~Q1 2016) be converted over to use the new API end-points.

Related

Get all messages from AWS SQS in NodeJS

I have the following function that gets a message from AWS SQS. The problem is that I get one message at a time, and I wish to get all of them, because I need to check the ID of each message:
function getSQSMessages() {
    const params = {
        QueueUrl: 'some url',
    };
    sqs.receiveMessage(params, (err, data) => {
        if (err) {
            console.log(err, err.stack);
            return(err);
        }
        return data.Messages;
    });
};
function sendMessagesBack() {
    return new Promise((resolve, reject) => {
        if (Array.isArray(getSQSMessages())) {
            resolve(getSQSMessages());
        } else {
            reject(getSQSMessages());
        }
    });
};
The function sendMessagesBack() is used in another async/await function.
I am not sure how to get all of the messages. When I looked into it, people mentioned loops, but I could not figure out how to implement one in my case.
I assume I have to put sqs.receiveMessage() in a loop, but then I get confused about what I need to check and when to stop the loop, so that I can get the ID of each message.
If anyone has any tips, please share.
Thank you.
I suggest you use the Promise API, which gives you the possibility to use async/await syntax right away.
const { Messages } = await sqs.receiveMessage(params).promise();
// Messages will contain all your needed info
await sqs.sendMessage(params).promise();
In this way, you will not need to wrap the callback API with Promises.
SQS doesn't return more than 10 messages in the response. To get all the available messages, you need to call the getSQSMessages function recursively.
If you return a promise from getSQSMessages, you can do something like this.
getSQSMessages()
    .then(data => {
        if (!data.Messages || data.Messages.length === 0) {
            // no messages are available. return
        }
        // continue processing for each message, or push the messages into an
        // array and call the getSQSMessages function again.
    });
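A minimal sketch of that recursion (assuming getSQSMessages is changed to return sqs.receiveMessage(params).promise(), as suggested above):

// a sketch: accumulate batches until a receive comes back empty
function getAllMessages(collected = []) {
    return getSQSMessages().then(data => {
        if (!data.Messages || data.Messages.length === 0) {
            return collected; // nothing more to fetch right now
        }
        return getAllMessages(collected.concat(data.Messages));
    });
}

getAllMessages().then(messages => messages.forEach(m => console.log(m.MessageId)));

As the next answer explains, without deleting messages after processing them, this loop can keep receiving ones you have already seen, so it may never terminate on its own.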
You can never be guaranteed to get all the messages in a queue unless, after you get some of them, you delete them from the queue, thus ensuring that the next request returns a different selection of records.
Each request will return 'up to' 10 messages; if you don't delete them, there is a good chance that the next request for 'up to' 10 messages will return a mix of messages you have already seen and some new ones, so you will never really know when you have seen them all.
It may be that a queue is not the right tool for your use case, but since I don't know your use case, it's hard to say.
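To make that concrete, here is a minimal sketch of the receive, process, delete loop (assuming aws-sdk v2 and an existing sqs client; handle is a placeholder for your per-message processing):

// receive -> process -> delete, so each receive makes forward progress
async function drainQueue(sqs, queueUrl, handle) {
    while (true) {
        const { Messages } = await sqs.receiveMessage({
            QueueUrl: queueUrl,
            MaxNumberOfMessages: 10,
        }).promise();
        if (!Messages || Messages.length === 0) {
            return; // nothing visible in the queue right now
        }
        for (const message of Messages) {
            await handle(message); // your per-message processing
        }
        // deleting what we processed guarantees the next receive returns different messages
        await sqs.deleteMessageBatch({
            QueueUrl: queueUrl,
            Entries: Messages.map((m, i) => ({ Id: String(i), ReceiptHandle: m.ReceiptHandle })),
        }).promise();
    }
}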
I know this is a bit of a necro, but I landed here last night while trying to pull all messages from a dead-letter queue in SQS. While the accepted answer is absolutely correct that you cannot guarantee getting all messages from the queue, I wanted to drop an answer for anyone who lands here as well and needs to get around the 10-message-per-request limit from AWS.
Dependencies
In my case I have a few dependencies already in my project that I used to make life simpler.
lodash - This is something we use in our code to help make things functional. I don't think I used it below, but I'm including it since it's in the file.
cli-progress - This gives you a nice little progress bar on your CLI.
Disclaimer
The below was thrown together while troubleshooting some production errors integrating with another system. Our DLQ messages contain some identifiers that I need in order to formulate CloudWatch queries for troubleshooting. Given that these are two different GUIs in AWS, switching back and forth is cumbersome, especially since our AWS sessions are via a form of federation and last one hour max.
The script
#!/usr/bin/env node
const _ = require('lodash');
const awsSdk = require('aws-sdk');
const cliProgress = require('cli-progress');

const queueUrl = 'https://[put-your-url-here]';
const queueRegion = 'us-west-1';

const getMessages = async (sqs) => {
    const resp = await sqs.receiveMessage({
        QueueUrl: queueUrl,
        MaxNumberOfMessages: 10,
    }).promise();
    // Messages is absent when the queue returns nothing, so default to an empty array
    return resp.Messages || [];
};

const main = async () => {
    const sqs = new awsSdk.SQS({ region: queueRegion });

    // First thing we need to do is get the current number of messages in the DLQ.
    const attributes = await sqs.getQueueAttributes({
        QueueUrl: queueUrl,
        AttributeNames: ['All'], // Probably could thin this down but it's late
    }).promise();
    const numberOfMessages = Number(attributes.Attributes.ApproximateNumberOfMessages);

    // Next we create an in-memory cache for the messages
    const allMessages = {};
    let running = true;

    // Honesty here: the examples we have in existing code use the multi-bar. It was
    // about 10PM and I had 28 DLQ messages I was looking into. I didn't feel it was
    // worth converting the multi-bar to a single-bar. Look into the docs on the
    // github page if this is really a sticking point for you.
    const progress = new cliProgress.MultiBar({
        format: ' {bar} | {name} | {value}/{total}',
        hideCursor: true,
        clearOnComplete: true,
        stopOnComplete: true
    }, cliProgress.Presets.shades_grey);
    const progressBar = progress.create(numberOfMessages, 0, { name: 'Messages' });

    // TODO: put in a time limit to avoid an infinite loop.
    // NOTE: For 28 messages I managed to get them all with this approach in about
    // 15 seconds. When/if I clean up this script I plan to add the time-based
    // short-circuit at that point.
    while (running) {
        // Fetch all the messages we can from the queue. The number of messages is
        // not guaranteed per the AWS documentation.
        let messages = await getMessages(sqs);
        for (let i = 0; i < messages.length; i++) {
            // Loop through the messages and only cache ones we have not already seen.
            let message = messages[i];
            if (allMessages[message.MessageId] === undefined) {
                allMessages[message.MessageId] = message;
            }
        }
        // Update our progress bar with the current progress
        const discoveredMessageCount = Object.keys(allMessages).length;
        progressBar.update(discoveredMessageCount);
        // Give a quick pause just to make sure we don't get rate limited or something
        await new Promise((resolve) => setTimeout(resolve, 1000));
        running = discoveredMessageCount !== numberOfMessages;
    }

    // Now that we have all the messages, print them to the console so they can be
    // copy/pasted into LibreCalc (an Excel-like tool). I split rows on the semicolon
    // out of habit, since similar scripts sometimes deal with data that has commas in it.
    const keys = Object.keys(allMessages);
    console.log('Message ID;ID');
    for (let i = 0; i < keys.length; i++) {
        const message = allMessages[keys[i]];
        const decodedBody = JSON.parse(message.Body);
        console.log(`${message.MessageId};${decodedBody.id}`);
    }
};

main();

Best way to wait for asynchronous initialization from a module import?

TL;DR: is there a way to wait for a module import with async functionality to complete before continuing with execution in the calling module in order to keep module functionality contained?
I'm working on a personal node project that I've been structuring in a modular/OOP way as the codebase has continued to grow. One requirement has been to enable logging across modules / objects, where different logfiles can be logged to at different times. I thought that I had solved the problem in a pretty clean way by creating a Logger.js file with an init function that I could use at any time by simply importing the Logger.js file in any module that I needed. Here is the stripped down code to illustrate this:
Logger.js
module.exports.init = function(location) {
    var logFileBaseName = basePath + fullDatePathName;
    var studentLogFile = fs.createWriteStream(logFileBaseName + '-student.log', {flags : 'a'});
    var teacherLogFile = fs.createWriteStream(logFileBaseName + '-teacher.log', {flags : 'a'});

    this.studentLog = function () {
        arguments[0] = '[' + Utils.getFullDate() + '] ' + arguments[0].toString();
        studentLogFile.write(util.format.apply(null, arguments) + '\n');
    }

    this.teacherBookLog = function () {
        arguments[0] = '[' + Utils.getFullDate() + '] ' + arguments[0].toString();
        teacherLogFile.write(util.format.apply(null, arguments) + '\n');
    }
}
This seemed great, because in my main entrypoint I could simply do:
Main.js
const Logger = require('./utils/Logger');
Logger.init(path);
Logger.studentLog('test from Main');
// all my other code and more logging here
And in my other dozens of files I could do even less:
AnotherFile.js
const Logger = require('./utils/Logger');
Logger.studentLog('test from AnotherFile')
Then the requirement came to log the 'student logs' not only to a file, but to Discord (a chat client) as well. It seemed easy: I had this Logger file, and I could just initialize Discord and log to Discord alongside the 'student logs', something like this:
Logger.js
module.exports.init = function(location) {
    // code we've already seen above

    var client = new Discord.Client();
    client.login('my_login_string');
    channels = client.channels;

    this.studentLog = function () {
        arguments[0] = '[' + Utils.getFullDate() + '] ' + arguments[0].toString();
        var message = util.format.apply(null, arguments) + '\n';
        studentLogFile.write(message);
        channels.get('the_channel_to_log_to').send(message)
    }

    // more code we've already seen above
}
The problem is that if you were to run Main.js again, the studentLog would fail, because the .login() function is asynchronous: it returns a Promise. The login has not completed, and channels would be an empty Collection by the time we try to call Logger.studentLog('test from Main');
I've tried using a Promise in Logger.js, but of course execution of Main.js continues before the promise returns in Logger.js. I would love it if Main.js could simply wait until the Discord login was complete.
My question is, what is the best way to make this work while keeping with the pattern I've been using? I know that I could wrap my entire main function in a promise.then() that waits for Discord login to complete, but that seems a bit absurd to me. I'm trying to keep functionality contained into modules and would not like for this kind of Logger code / logic to spill out into my other modules. I want to keep it to a simple Logger import as I've been doing.
Any advice would be great!!
If the result of some async function is awaited and then used in the same calling function, the result is resolved first and then used. If the result is used in another function or module (e.g. the result is assigned to a global variable), it is not resolved first. In your case, client.login() populates client.channels asynchronously; that work is not awaited, so the channels = client.channels assignment runs too early and channels ends up empty.
To resolve this issue, you must use a callback or chain on the promise returned by client.login(), as stated in the comments.
You can refer to this article.
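A minimal sketch of that idea, assuming client.login() returns a Promise (as the question says it does), so init() can hand the caller something to wait on:

// Logger.js -- stripped down; init resolves only after the Discord login completes
var Discord = require('discord.js');
var channels;

module.exports.init = function () {
    var client = new Discord.Client();
    return client.login('my_login_string').then(function () {
        channels = client.channels; // safe now: login has finished
    });
};

// Main.js -- defer the first log until init resolves
const Logger = require('./utils/Logger');
Logger.init().then(function () {
    // channels is populated; logging to Discord will work from here on
});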
Let me offer my solution to the "asynchronously initialised logger" problem. Note that this only deals with logging and most likely cannot be generalised.
Basically, all messages are appended to a queue that is only sent to the remote location once a flag indicating that the connection is ready is set.
Example:
//Logger.js
module.exports = {
    _ready: false,
    _queue: [],
    init() {
        return connectToRemote().then(() => { this._ready = true; });
    },
    log(message) {
        console.log(message);
        this._queue.push(message);
        if (this._ready) {
            let messagesToSend = this._queue;
            this._queue = [];
            this._ready = false;
            sendToRemote(messagesToSend).then(() => this._ready = true);
        }
    }
}
You can require the logger in any file and use the log function right away. The logs will be sent only after the init function, which you can call at any time, has resolved.
This is a very bare-bones example. You may also want to limit the queue size and/or only send the logs in bulk at certain intervals, but you get the idea.
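Usage could then look like this (a sketch; connectToRemote and sendToRemote above are stand-ins for the real transport):

const logger = require('./Logger');

logger.log('starting up');         // queued immediately; nothing is sent yet
logger.init().then(() => {
    logger.log('remote ready');    // this call flushes the whole queue to the remote
});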

How to deal with events in nodejs/node-red

I work with node-red and am developing a custom node at the moment that uses websockets to connect to a device and request data from it.
function query(node, msg, callback) {
    var uri = 'ws://' + node.config.host + ':' + node.config.port;
    var protocol = 'Lux_WS';
    node.ws = new WebSocket(uri, protocol);

    var login = "LOGIN;" + node.config.password;

    node.ws.on('open', function open() {
        node.status({fill:"green", shape:"dot", text:"connected"});
        node.ws.send(login);
        node.ws.send("REFRESH");
    });

    node.ws.on('message', function (data, flags) {
        processResponse(data, node);
    });

    node.ws.on('close', function(code, reason) {
        node.status({fill:"grey", shape:"dot", text:"disconnected"});
    });

    node.ws.on('error', function(error) {
        node.status({fill:"red", shape:"dot", text:"Error " + error});
    });
}
In the processResponse function I need to process the first response. It gives me an XML document with several ids that I need in order to request further data.
I plan to set up a structure that holds all the data from the first request, and populate it further with the data that results from the id requests.
And that's where my problem starts: whenever I send a query from within the processResponse function, I trigger an event that results in the same function getting called again, and by then my structure is empty.
I know that this is due to the async nature of nodejs and the event system, but I simply don't see how to circumvent this behavior or do my code in the right way.
If anybody can recommend examples on how to deal with situations like this or even better could give an example, that would be great!
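One way to sketch it (not from the original thread; the helper names and per-id request format below are assumptions) is to hang the accumulated state off the node itself, so the same 'message' handler can tell the first navigation response apart from the follow-up id responses:

// a sketch: a tiny state machine stored on the node
function processResponse(data, node) {
    if (!node.query) {
        // first response: parse the XML and remember which ids still need data
        var ids = extractIds(data);                 // extractIds is a hypothetical parser
        node.query = { pending: ids.slice(), results: {} };
        ids.forEach(function (id) {
            node.ws.send("GET;" + id);              // request format is an assumption
        });
        return;
    }
    // follow-up responses: file the payload under its id
    var id = extractId(data);                       // also hypothetical
    node.query.results[id] = data;
    node.query.pending = node.query.pending.filter(function (p) { return p !== id; });
    if (node.query.pending.length === 0) {
        node.send({ payload: node.query.results }); // everything collected
        node.query = null;
    }
}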

How do you structure sequential AWS service calls within lambda given all the calls are asynchronous?

I'm coming from a java background so a bit of a newbie on Javascript conventions needed for Lambda.
I've got a lambda function which is meant to do several AWS tasks in a particular order, depending on the result of the previous task.
Given that each task reports its results asynchronously, I'm wondering what the right way is to make sure they all happen in the right sequence, with the results of one operation available to the invocation of the next function.
It seems like I have to invoke each function in the callback of the prior function, but it seems like that will create some kind of deep nesting, and I'm wondering if that is the proper way to do this.
For example, one of these functions requires a DynamoDB getItem, followed by a call to SNS to get an endpoint, followed by an SNS call to send a message, followed by a DynamoDB write.
What's the right way to do that in lambda javascript, accounting for all that asynchronicity?
I like the answer from @jonathanbaraldi, but I think it would be better if you manage control flow with Promises. The Q library has some convenience functions like nbind which help convert node-style callback APIs like the aws-sdk into promises.
So in this example I'll send an email, and then as soon as the email response comes back I'll send a second email. This is essentially what was asked: calling multiple services in sequence. I'm using the then method of promises to manage that in a vertically readable way, and catch to handle errors. I think it reads much better than simply nesting callback functions.
var Q = require('q');
var AWS = require('aws-sdk');

AWS.config.credentials = { "accessKeyId": "AAAA", "secretAccessKey": "BBBB" };
AWS.config.region = 'us-east-1';

// Use a promised version of sendEmail
var ses = new AWS.SES({apiVersion: '2010-12-01'});
var sendEmail = Q.nbind(ses.sendEmail, ses);

exports.handler = function(event, context) {
    console.log(event.nome);
    console.log(event.email);
    console.log(event.mensagem);

    var nome = event.nome;
    var email = event.email;
    var mensagem = event.mensagem;

    var to = ['email@company.com.br'];
    var from = 'site@company.com.br';

    // Send email
    mensagem = "" + nome + "||" + email + "||" + mensagem + "";
    console.log(mensagem);

    var params = {
        Source: from,
        Destination: { ToAddresses: to },
        Message: {
            Subject: {
                Data: 'Form contact our Site'
            },
            Body: {
                Text: {
                    Data: mensagem,
                }
            }
        }
    };

    // Here is the white-meat of the program right here.
    sendEmail(params)
        .then(sendAnotherEmail)
        .then(success)
        .catch(logErrors);

    function sendAnotherEmail(data) {
        console.log("FIRST EMAIL SENT=" + data);
        // send a second one.
        return sendEmail(params);
    }

    function logErrors(err) {
        console.log("ERROR=" + err, err.stack);
        context.done();
    }

    function success(data) {
        console.log("SECOND EMAIL SENT=" + data);
        context.done();
    }
};
Short answer:
Use async/await, and call the AWS service (SNS, for example) with the .promise() extension to tell aws-sdk to use the promisified version of that service function instead of the callback-based version.
Since you want to execute them in a specific order, you can use async/await, assuming that the parent function you are calling them from is itself async.
For example:
// note: don't also pass a callback when using .promise();
// handle errors with try/catch around the await instead
let snsResult;
try {
    snsResult = await sns.publish({
        Message: snsPayload,
        MessageStructure: 'json',
        TargetArn: endPointArn
    }).promise();
    console.log('SNS push succeeded: ' + snsResult);
} catch (err) {
    console.log("SNS Push Failed:");
    console.log(err.stack);
}
The important part is the .promise() on the end there. Full docs on using aws-sdk in an async / promise based manner can be found here: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/using-promises.html
In order to run another aws-sdk task you would similarly add await and the .promise() extension to that function (assuming that is available).
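As a hedged illustration of that sequencing (assuming a dynamodb = new AWS.DynamoDB() client; the table, key, and field names are invented), the question's getItem-then-publish steps might look like:

// step 1: read the record (DynamoDB getItem, promisified)
const record = await dynamodb.getItem({
    TableName: 'Devices',                  // hypothetical table
    Key: { DeviceId: { S: 'abc-123' } }    // hypothetical key
}).promise();

// step 2: only runs after step 1 resolves, and can use its result
await sns.publish({
    TargetArn: endPointArn,
    Message: JSON.stringify(record.Item)
}).promise();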
For anyone who runs into this thread and is actually looking to simply push promises to an array and wait for that WHOLE array to finish (without regard to which promise executes first), I ended up with something like this:
let snsPromises = [] // declare array to hold promises
// push the pending promise itself (no await here), so Promise.all
// can wait for the whole batch at once
snsPromises.push(
    sns.publish({
        Message: snsPayload,
        MessageStructure: 'json',
        TargetArn: endPointArn
    }).promise()
        .then(data => {
            console.log('Search push succeeded: ' + data);
            return data;
        })
        .catch(err => {
            console.log("Search Push Failed:");
            console.log(err.stack);
        })
)
await Promise.all(snsPromises)
Hope that helps someone that randomly stumbles on this via google like I did!
I don't know Lambda but you should look into the node async library as a way to sequence asynchronous functions.
async has made my life a lot easier and my code much more orderly without the deep nesting issue you mentioned in your question.
Typical async code might look like:
async.waterfall([
function doTheFirstThing(callback) {
db.somecollection.find({}).toArray(callback);
},
function useresult(dbFindResult, callback) {
do some other stuff (could be synch or async)
etc etc etc
callback(null);
],
function (err) {
//this last function runs anytime any callback has an error, or if no error
// then when the last function in the array above invokes callback.
if (err) { sendForTheCodeDoctor(); }
});
Have a look at the async doco at the link above. There are many useful functions for serial, parallel, waterfall, and many more. Async is actively maintained and seems very reliable.
good luck!
A very specific solution that comes to mind is cascading Lambda calls. For example, you could write:
A Lambda function gets something from DynamoDB, then invokes…
…a Lambda function that calls SNS to get an endpoint, then invokes…
…a Lambda function that sends a message through SNS, then invokes…
…a Lambda function that writes to DynamoDB
All of those functions take the output from the previous function as input. This is of course very fine-grained, and you might decide to group certain calls. Doing it this way avoids callback hell in your JS code at least.
(As a side note, I'm not sure how well DynamoDB integrates with Lambda. AWS might emit change events for records that can then be processed through Lambda.)
Just saw this old thread. Note that newer versions of JS improve on this: take a look at the ES2017 async/await syntax, which streamlines an async nested-callback mess into clean, synchronous-looking code.
There are now some polyfills that can provide this functionality based on ES2016 syntax.
As a last FYI - AWS Lambda now supports .Net Core which provides this clean async syntax out of the box.
I would like to offer the following solution, which simply creates a nested function structure.
// start with the last action
var next = function() { context.succeed(); };
// for every new function, pass it the old one
next = (function(param1, param2, next) {
    return function() { serviceCall(param1, param2, next); };
})("x", "y", next);
What this does is to copy all of the variables for the function call you want to make, then nests them inside the previous call. You'll want to schedule your events backwards. This is really just the same as making a pyramid of callbacks, but works when you don't know ahead of time the structure or quantity of function calls. You have to wrap the function in a closure so that the correct value is copied over.
In this way I am able to sequence AWS service calls such that they go 1-2-3 and end with closing the context. Presumably you could also structure it as a stack instead of this pseudo-recursion.
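To illustrate (dynamoGetItem, snsSend, and dynamoWrite below are hypothetical callback-style wrappers, each invoking its callback when done), scheduling three calls backwards looks like:

// built backwards: the last action is wrapped first
var next = function() { context.succeed(); };              // runs last
next = (function(next) {
    return function() { dynamoWrite('result', next); };    // step 3
})(next);
next = (function(next) {
    return function() { snsSend('endpoint-arn', next); };  // step 2
})(next);
next = (function(next) {
    return function() { dynamoGetItem('key', next); };     // step 1
})(next);
next(); // kicks off 1 -> 2 -> 3 -> context.succeed()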
I found this article, which seems to have the answer in native JavaScript:
Five patterns to help you tame asynchronous JavaScript.
By default JavaScript is asynchronous.
So you don't have to use those libraries; you can, but there are simpler ways to solve this. In this code, I sent the email with the data that comes from the event, but if you want, you just need to add more functions inside functions.
What is important is where your context.done() goes: it is going to end your Lambda function, so you need to put it at the end of the last callback.
var AWS = require('aws-sdk');

AWS.config.credentials = { "accessKeyId": "AAAA", "secretAccessKey": "BBBB" };
AWS.config.region = 'us-east-1';

var ses = new AWS.SES({apiVersion: '2010-12-01'});

exports.handler = function(event, context) {
    console.log(event.nome);
    console.log(event.email);
    console.log(event.mensagem);

    var nome = event.nome;
    var email = event.email;
    var mensagem = event.mensagem;

    var to = ['email@company.com.br'];
    var from = 'site@company.com.br';

    // Send email
    mensagem = "" + nome + "||" + email + "||" + mensagem + "";
    console.log(mensagem);

    ses.sendEmail({
        Source: from,
        Destination: { ToAddresses: to },
        Message: {
            Subject: {
                Data: 'Form contact our Site'
            },
            Body: {
                Text: {
                    Data: mensagem,
                }
            }
        }
    },
    function(err, data) {
        if (err) {
            console.log("ERROR=" + err, err.stack);
            context.done();
        } else {
            console.log("EMAIL SENT=" + data);
            context.done();
        }
    });
};

Restarting nodejs ntwitter twitter stream with different track keywords

var twitter = require('ntwitter');

// Configure twitter (credentials elided in the question)
var twit = new twitter({
    consumer_key: '...',
    consumer_secret: '...',
    access_token_key: '...',
    access_token_secret: '...'
});

var keywords = ['hello', 'world'];

twit.stream('statuses/filter', {'track': keywords.join(',')}, function(stream) {
    stream.on('data', function (data) {
        console.log(data);
    });
    stream.on('end', function (response) {
        console.log("\n====================================================");
        console.log("DESTROYING");
        console.log("====================================================\n");
    });
    setTimeout(function(){
        stream.destroy();
    }, 60000);
});
I'm new to nodejs. What is the best way to stop this stream and start it again with a different set of keywords?
I can destroy() the stream and then create a new one. But is there any way I can just change the track keywords without disconnecting?
I'm quite a noob, so maybe this isn't a good approach and it wastes resources, but I still don't know how to check that, so I will drop it here and hope that someone skilled can tell us if it's OK or wrong, and most importantly, why.
The way is to put the tw.stream call inside a function, and call that function with the array of words you want to track. It will start tracking the new words and stop tracking the removed ones:
// Array to store the tracked words
var TwitWords = [];

// Tracker function
function TrackWords(array){
    tw.stream('statuses/filter', {track: array}, function(stream){
        stream.on('data', function(data){
            console.log(data.text);
        });
    });
}

// Add word
function AddTwitWord(word){
    if (TwitWords.indexOf(word) == -1){
        TwitWords.push(word);
        TrackWords(TwitWords);
    }
}

// Remove word
function RemoveTwitWord(word){
    if (TwitWords.indexOf(word) != -1){
        TwitWords.splice(TwitWords.indexOf(word), 1);
        TrackWords(TwitWords);
    }
}
I hope it's ok, because it's the only way I found.
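One caveat worth flagging: as far as I know, the Twitter streaming API does not let you change the track predicates on a live connection, so a reconnect is unavoidable, and each TrackWords call above opens a new stream without closing the old one. A sketch of the same function, plus a reference to the current stream so it can be destroyed first:

var currentStream = null;

function TrackWords(array){
    if (currentStream) {
        currentStream.destroy(); // drop the old connection before opening a new one
        currentStream = null;
    }
    tw.stream('statuses/filter', {track: array}, function(stream){
        currentStream = stream;
        stream.on('data', function(data){
            console.log(data.text);
        });
    });
}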
