I'm experiencing a problem with node. I'm trying to use a language detection algorithm, but I'm having trouble with scopes.
After saving the response to "langVastus" and then extracting the language to "keel", I get the right result inside the Algorithmia function, but not on the outside.
The console logs print out
Inside: en
Outside:
And the code looks like this:
var langVastus = "";
var keel = "";
Algorithmia.client("simpIVxv0Ex5Xen1bVCLVXnxYpr1")
.algo("nlp/LanguageIdentification/1.0.0")
.pipe(input)
.then(function(response) {
langVastus = response.get();
keel = langVastus[0].language;
console.log("Inside: " + keel);
});
console.log("Outside: " + keel);
res.render("lang", {keel: keel});
What am I doing wrong?
The problem was that I initialized the variable inside a route. Taking it outside the route fixed my problem :)
It seems like the block that you're calling "Inside" runs inside a promise callback.
In that case the "Outside" log runs before the callback is called, so keel is still empty at that point.
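One way to fix it (a minimal sketch, assuming res is an Express response object and input is defined as in your route) is to use the result inside the .then() callback, where it actually exists:
Algorithmia.client("simpIVxv0Ex5Xen1bVCLVXnxYpr1")
  .algo("nlp/LanguageIdentification/1.0.0")
  .pipe(input)
  .then(function(response) {
    var langVastus = response.get();
    var keel = langVastus[0].language;
    console.log("Inside: " + keel);
    // anything that needs keel has to happen here (or in a function called from here)
    res.render("lang", {keel: keel});
  });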
I am new to nodejs. Using bluebird promises to get the response of an array of HTTP API calls, and storing derived results in an ElasticSearch.
Everything is working fine, except I am unable to access the variables within the 'then' function. Below is my code:
Promise.map(bucket_paths, function(path) {
this.path = path;
return getJson.getStoreJson(things,path.path);
}, {concurrency:1}).then(function(bucketStats){
bucketStats.map(function(bucketStat) {
var bucket_stats_json = {};
bucket_stats_json.timestamp = new Date();
bucket_stats_json.name = path.name ==> NOT WORKING
});
});
How can I access the path.name variable within the 'then' ? Error says 'path' is undefined.
The best way to do this is to package the data you need from one part of the promise chain into the resolved value that is sent on to the next part of the chain. In your case with Promise.map(), you're sending an array of data on to the .then() handler, so the cleanest way to pass each path down to the next stage is to make it part of each array entry that Promise.map() is resolving. It appears you can just add it to the bucketStat data structure with an extra .then(), as shown below. When you get the data that corresponds to a path, you add the path into that data structure, so that later on when you're walking through all the results you have the .path property for each object.
You don't show any actual result here so I don't know what you're ultimately trying to end up with, but hopefully you can get the general idea from this.
Also, I switched to Promise.mapSeries() since that's a shortcut when you want concurrency set to 1.
Promise.mapSeries(bucket_paths, function(path) {
return getJson.getStoreJson(things,path.path).then(bucketStat => {
// add the path into this item's data so we can get to it later
bucketStat.path = path;
return bucketStat;
});
}).then(function(bucketStats){
return bucketStats.map(function(bucketStat) {
var bucket_stats_json = {};
bucket_stats_json.timestamp = new Date();
bucket_stats_json.name = bucketStat.path.name;
return bucket_stats_json;
});
});
TL;DR: is there a way to wait for a module import with async functionality to complete before continuing with execution in the calling module in order to keep module functionality contained?
I'm working on a personal node project that I've been structuring in a modular/OOP way as the codebase has continued to grow. One requirement has been to enable logging across modules / objects, where different logfiles can be logged to at different times. I thought that I had solved the problem in a pretty clean way by creating a Logger.js file with an init function that I could use at any time by simply importing the Logger.js file in any module that I needed. Here is the stripped down code to illustrate this:
Logger.js
module.exports.init = function(location) {
var logFileBaseName = basePath + fullDatePathName;
var studentLogFile = fs.createWriteStream(logFileBaseName + '-student.log', {flags : 'a'});
var teacherLogFile = fs.createWriteStream(logFileBaseName + '-teacher.log', {flags : 'a'});
this.studentLog = function () {
arguments[0] = '[' + Utils.getFullDate() + '] ' + arguments[0].toString();
studentLogFile.write(util.format.apply(null, arguments) + '\n');
}
this.teacherBookLog = function () {
arguments[0] = '[' + Utils.getFullDate() + '] ' + arguments[0].toString();
teacherLogFile.write(util.format.apply(null, arguments) + '\n');
}
}
This seemed great, because in my main entrypoint I could simply do:
Main.js
const Logger = require('./utils/Logger');
Logger.init(path);
Logger.studentLog('test from Main');
// all my other code and more logging here
And in my other dozens of files I could do even less:
AnotherFile.js
const Logger = require('./utils/Logger');
Logger.studentLog('test from AnotherFile')
Then the requirement came to log not only to a file for the 'student logs', but to Discord (a chat client) as well. Seemed easy, I had this Logger file and I could just initialize Discord and log to Discord alongside the 'student logs', something like this:
Logger.js
module.exports.init = function(location) {
// code we've already seen above
var client = new Discord.Client();
client.login('my_login_string');
channels = client.channels;
this.studentLog = function () {
arguments[0] = '[' + Utils.getFullDate() + '] ' + arguments[0].toString();
var message = util.format.apply(null, arguments) + '\n';
studentLogFile.write(message);
channels.get('the_channel_to_log_to').send(message)
}
// more code we've already seen above
}
The problem is that if you were to run Main.js again now, studentLog would fail because the .login() function is asynchronous: it returns a Promise. The login has not completed, so channels would still be an empty Collection by the time we try to call Logger.studentLog('test from Main');
I've tried using a Promise in Logger.js, but of course execution of Main.js continues before the promise returns in Logger.js. I would love it if Main.js could simply wait until the Discord login was complete.
My question is, what is the best way to make this work while keeping with the pattern I've been using? I know that I could wrap my entire main function in a promise.then() that waits for Discord login to complete, but that seems a bit absurd to me. I'm trying to keep functionality contained into modules and would not like for this kind of Logger code / logic to spill out into my other modules. I want to keep it to a simple Logger import as I've been doing.
Any advice would be great!!
If the result of an async function is awaited and then used in the same calling function, it is resolved before it is used. But if the result is consumed somewhere else (e.g. assigned to a shared variable and read from another function or module) without waiting, there is no such guarantee. In your case, client.login() populates client.channels asynchronously; since nothing waits for it, the channels = client.channels assignment runs before login has finished, leaving channels effectively empty.
To resolve this, you need to wait on the promise that client.login() returns (or use a callback), as stated in the comments.
You can refer to this article.
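For example, a minimal sketch of that approach (assuming the channels are usable once the promise returned by client.login() resolves) would have init() hand that promise back to the caller:
// Logger.js
module.exports.init = function(location) {
    var logFileBaseName = basePath + fullDatePathName;   // as in the original stripped-down code
    var studentLogFile = fs.createWriteStream(logFileBaseName + '-student.log', {flags : 'a'});
    var client = new Discord.Client();
    var channels;
    this.studentLog = function () {
        arguments[0] = '[' + Utils.getFullDate() + '] ' + arguments[0].toString();
        var message = util.format.apply(null, arguments) + '\n';
        studentLogFile.write(message);
        channels.get('the_channel_to_log_to').send(message);
    };
    // hand the login promise back to the caller so it can wait for Discord to be ready
    return client.login('my_login_string').then(function() {
        channels = client.channels;
    });
};
// Main.js
const Logger = require('./utils/Logger');
Logger.init(path).then(function() {
    Logger.studentLog('test from Main');
    // the rest of the startup code goes here
});
This does mean Main.js has to wait on init(), which is exactly what the question was hoping to avoid; the queue-based answer below keeps that waiting inside the Logger instead.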
Let me offer my solution to the "asynchronously initialised logger" problem. Note that this only deals with logging and most likely cannot be generalised.
Basically, all messages are appended to a queue that is only sent to the remote location once a flag indicating that the connection is ready has been set.
Example:
//Logger.js
module.exports = {
_ready: false,
_queue: [],
  init() {
    return connectToRemote().then(() => { this._ready = true; });
  },
  log(message) {
    console.log(message);
    this._queue.push(message);
if (this._ready) {
let messagesToSend = this._queue;
this._queue = [];
this._ready = false;
sendToRemote(messagesToSend).then(()=>this._ready = true);
}
}
}
You can require the logger in any file and use the log function right away. The logs will only be sent to the remote once the promise returned by init(), which you can call at any time, has resolved.
This is a very bare-bones example; you may also want to limit the queue size and/or only send the logs in bulk at certain time intervals, but you get the idea.
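For instance, usage from any module might look like this (a sketch; connectToRemote and sendToRemote above are placeholders for whatever remote transport you use):
// Main.js
const Logger = require('./Logger');
Logger.log('starting up');     // queued and printed to the console immediately
Logger.init();                 // kick off the remote connection whenever convenient
Logger.log('another message'); // queued; everything queued so far is flushed the first time log() runs after init() has resolved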
Ok, say I have two listeners with callbacks, and the code in one callback depends on a variable (UIDfromOnEndFunction) from the other callback.
For example:
//using andris9/mailparser on github
var mailparser = new MailParser({
streamAttachments: true
});
// OnEnd Function
mailparser.on("end", function(objMail){
**UIDfromOnEndFuntion** = objMail.UID;
saveToDB("mail" + "1234", objMail);
});
mailparser.on("attachment", function(attachment){
var output = fs.createWriteStream("attachments/"
+ **UIDfromOnEndFuntion** + "/" + attachment.generatedFileName);
// need UIDfromOnEndFunction here
attachment.stream.pipe(output);
});
How do I cause the callback in mailparser.on("attachment", ...) to get the variable UIDfromOnEndFunction?
Does this involve promises? How do you do this?
You can do this via a closure: just access a variable from outside.
//using andris9/mailparser on github
var UIDfromOnEndFunction;
var mailparser = new MailParser({
streamAttachments: true
});
// OnEnd Function
mailparser.on("end", function(objMail){
UIDfromOnEndFunction = objMail.UID;
saveToDB("mail" + "1234", objMail);
});
mailparser.on("attachment", function(attachment){
var output = fs.createWriteStream("attachments/" + UIDfromOnEndFunction + "/" + attachment.generatedFileName);
attachment.stream.pipe(output);
});
Please note my comment about ensuring that end fires before attachment. If the events do not fire in that order, this approach is fundamentally impossible.
OK, I came up with a solution. It's based on closures, following Brenden Ashworth's suggestion. It's untested, but I'm fairly certain it would work; I wanted to post it before moving on, since it turned out I didn't need to do what the original question describes to get my project working.
However, I still think it is useful to have a solution to this type of problem as the need could arise, and I don't know a better solution.
Here's my solution:
//using andris9/mailparser on github
var mailparser = new MailParser({
streamAttachments: true
});
var UIDfromOnEndFunction;
var myAttachment;
var intNumberOfEmitsToEndAndAttachment = 0;
var funcBothEndAndAttachmentEmitted = function () {
var output = fs.createWriteStream("attachments/"
+ UIDfromOnEndFunction + "/" + myAttachment.generatedFileName);
//UIDfromOnEndFunction should be guaranteed to be
//populated by .once("end",...)
myAttachment.stream.pipe(output);
//myAttachment should be guaranteed to be populated
//by .once("attachment",...)
}
mailparser.once("end", function(objMail){
UIDfromOnEndFunction = objMail.UID;
saveToDB("mail" + "1234", objMail);
intNumberOfEmitsToEndAndAttachment++;
if (intNumberOfEmitsToEndAndAttachment == 2) {
funcBothEndAndAttachmentEmitted();
}
});
mailparser.once("attachment", function(attachment){
myAttachment = attachment;
intNumberOfEmitsToEndAndAttachment++;
if (intNumberOfEmitsToEndAndAttachment == 2) {
funcBothEndAndAttachmentEmitted();
}
});
Now this would only work for a single emitted "end" and a single emitted "attachment".
You could get more creative with how the tracking is done in order to handle multiple attachments. For example, instead of using an integer to count the total number of calls, you could keep an array of records like [{event: "attachment", args: attachment_args1}, {event: "attachment", args: attachment_args2}, {event: "end", args: end_args}] (meaning "attachment" has fired twice and "end" once so far) and trigger a function based on that knowledge, just as I do by calling funcBothEndAndAttachmentEmitted(); a rough sketch of this follows.
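A sketch of that direction, simplified so that attachments are buffered until "end" has supplied the UID (helper names are hypothetical and, like the rest of this answer, untested):
var objMailFromEnd = null;
var pendingAttachments = [];
function flushAttachments() {
    // can only write the files once "end" has supplied the UID
    if (!objMailFromEnd) return;
    pendingAttachments.forEach(function(attachment) {
        var output = fs.createWriteStream("attachments/"
            + objMailFromEnd.UID + "/" + attachment.generatedFileName);
        attachment.stream.pipe(output);
    });
    pendingAttachments = [];
}
mailparser.on("end", function(objMail) {
    objMailFromEnd = objMail;
    saveToDB("mail" + "1234", objMail);
    flushAttachments();
});
mailparser.on("attachment", function(attachment) {
    pendingAttachments.push(attachment);
    flushAttachments();
});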
I think this needs to be cleaned up and made into a library, unless there is a better way to do it that's not apparent. (Please comment if you know a better solution or I might go ahead and write a library for this solution.)
Another solution I thought of is putting mailparser.once("attachment", ...) inside the callback for mailparser.once("end", ...), but I suspect that wouldn't work if "attachment" is emitted first, and it seems a bit kludgey compared to a library-based solution if you're working with many different emitted events or different objects emitting different events.
I'm using node.js modules installed via npm.
I'm wondering what is the best way to modify a node module functionality.
Let's say I have a module called Handler and there is a method called foo which takes a request object and returns a response object.
1) What if I want to do something to the response before it gets returned?
Do I just modify the module's code itself?
Are there any articles on this ?
UPDATE --
Also, the original function modifies a few objects that are not being returned, but I want to modify them too. How would I handle that?
What I would do here is create a wrapper around the function and then change the response in there. If that was unclear, here's some code:
var myModule = require('myModule');
var myModuleFunc = myModule.myFunc;
myModule.myFunc = function() {
var res = myModuleFunc.apply(this, arguments); // call the function, and pass along context and arguments
res = transform(res); // whatever you do to the response
return res;
};
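For the update (objects that the original function modifies but does not return), the same wrapper idea works as long as you can reach those objects from the wrapper, for example when they are passed in as arguments. A sketch, with a hypothetical sideEffectObj argument:
var myModule = require('myModule');
var myModuleFunc = myModule.myFunc;
myModule.myFunc = function(sideEffectObj) {
    var res = myModuleFunc.apply(this, arguments); // the original function also mutates sideEffectObj
    sideEffectObj.extraField = 'added by wrapper'; // adjust the side-effected object afterwards
    return transform(res);
};
If the objects are internal to the module and never exposed, there is no clean way to get at them from a wrapper; in that case you're down to forking or patching the module's own code.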
I am trying to get all the variables that have been defined. I tried using the global object,
but it seems to be missing the ones defined as var token='44'; and only includes the ones defined as token='44';. What I am looking for, ideally, is something like PHP's get_defined_vars() function. I need to access the variables because I need to stop the node process and then restart at the same point without having to recalculate all the variables, so I want to dump them somewhere and access them later.
It's impossible within the language itself.
However:
1. If you have access to the entire source code, you can use a library to get a list of global variables, like this:
var ast = require('uglify-js').parse(source)
ast.figure_out_scope()
console.log(ast.globals.map(function (node, name) {
    return name
}))
2. If you can connect to the node.js/v8 debugger, you can get a list of local variables as well; see the _debugger.js source code in the node.js project.
As you stated
I want to dump them somewhere and access them later.
It seems like you should work towards a database (as Jonathan mentioned in the comments), but if this is a one-off thing you can use a JSON file to store the values. You can then require the JSON file back into your script and Node will handle the rest.
I wouldn't recommend this, but basically: create a variable that holds all the data/variables that you define. Some might call this a God Object. Just make sure that before you exit the script, you write the values out to a JSON file. If you're worried about your application crashing, write to that file more frequently.
Here is a demo you can play around with:
var fs = require('fs');
var globalData = loadData();
function loadData() {
// reload previously dumped globals if the JSON file exists
try { return require('./globals.json'); } catch(e) {}
return {};
}
function dumpGlobalData(callback) {
// persist the current globals so the next run can pick them up
fs.writeFile(
__dirname + '/globals.json', JSON.stringify(globalData), callback);
}
function randomToken() {
globalData.token = parseInt(Math.random() * 1000, 10);
}
console.log('token was', globalData.token)
randomToken();
console.log('token is now', globalData.token)
dumpGlobalData(function(error) {
process.exit(error ? 1 : 0);
});