Express / Jade / Pug: Calling a javascript object's functions - node.js

While I can pass an object's data, I don't know how to pass/call an object's functions.
route.js:
router.get('/', function(req, res, next) {
  let test = {};              // passing an object called test
  test.hello = function() {   // the test object has a method I want to
    console.log('hello');     //   call in the browser
  };
  res.render('home.jade', { test: test });
});
On the .jade page:
//- let test = !{test}; //renders as [object Object]
let test = !{JSON.stringify(test, null, 4)}; //renders empty obj
test.hello();
console.log('test', test);
Console message:
Uncaught TypeError: test.hello is not a function
Rendered source file:
//- let test = [object Object];
let test = {};
test.hello();
console.log('test', test);
An example of what works in my .jade file (what I don't want):
let test = {};
test.hello = #{test.hello};
test.hello();
This will console out 'hello'. However, I imagine that there is a way to pass and call an object's function without this workaround.
Thanks for any help.

JSON.stringify will strip away functions, since the JSON format does not support functions/methods. From MDN:
Functions are not a valid JSON data type so they will not work.
However, they can be displayed if first converted to a string (e.g. in
the replacer), via the function's toString method. Also, some objects
like Date will be a string after JSON.parse().
Technically you can use eval to evaluate the resulting string to a function, though this is not recommended.
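For completeness, here is a minimal sketch of what that could look like (the variable names are mine, not from the question): serialize the function's source with toString() on the server, then rebuild it on the client. Reviving code from strings with eval or the Function constructor is generally discouraged for anything but trusted input.
// Server side (sketch): send the function's source text, since JSON.stringify would drop it.
let test = {};
test.hello = function() { console.log('hello'); };
const payload = JSON.stringify({ helloSrc: test.hello.toString() });

// Client side (sketch): rebuild the function from the string.
const parsed = JSON.parse(payload);
const revived = {};
revived.hello = new Function('return ' + parsed.helloSrc)();
revived.hello(); // logs 'hello'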

Related

How to read, update and pass variables between functions that are placed in different .js files?

So, I'm working on a discord bot that has a few functions. I'm using node.js and discord.js.
I broke down the code into a few files because it was getting too long, so now I need something to pass global variables between functions and update them each time.
The first approach I tried was passing them as parameters, but then changes made inside one function were not visible to the other functions.
1.
async function prepare(message, parameters)
{
// Code here
}
For the second approach I tried using a JSON object. To read and update the values I used readFile and writeFile.
The problem was that when writing the JSON object, some data was lost: for some reason the values were simplified, and that created errors afterwards. In particular, the value that got ruined came from a ReactionCollector object.
2.
// Reads the external JSON object.
const fs = require('fs');
let rawdata = fs.readFileSync('config.json');
let obj = JSON.parse(rawdata);
// do something with obj.
// Writes the JSON object back.
let data = JSON.stringify(obj);
fs.writeFileSync('config.json', data);
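(Side note: the data loss above is expected. JSON can only represent plain data, so a class instance such as a ReactionCollector is reduced to its own enumerable properties and its methods disappear. A quick illustration, with a made-up Collector class standing in for the real thing:)
class Collector {
    constructor() { this.channelId = '123'; }
    stop() { /* ... */ }
}

const original = { collector: new Collector(), limit: 5 };
const roundTripped = JSON.parse(JSON.stringify(original));

console.log(roundTripped.collector);             // { channelId: '123' } - now a plain object
console.log(typeof roundTripped.collector.stop); // 'undefined' - the method is gone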
My last attempt was using a different type of writeFile function that preserved the data, but it created problems when reading the JSON object multiple times.
3.
// Reads the external JSON object.
const fs = require('fs');
const { promisify } = require('util');
const readFile = promisify(fs.readFile);
var data = await readFile('../config.json', { encoding: 'utf8' });
let obj = JSON.parse(data);
// Do something.
// Updates the JSON object (writing the stringified object here;
// the original snippet referenced an undefined `packageJson` variable).
fs.writeFile('../config.json', JSON.stringify(obj), { encoding: 'utf8' }, err => {
    if (err) throw err;
    console.log("Wrote json.");
});
Anyone that could make this code work?
I found that the best and simplest way is to use getter/setter functions for each variable.
This is an example:
// variables.js
var binary_tree = [];

function setBinary_tree(bt)
{
    binary_tree = bt;
}

function getBinary_tree()
{
    return binary_tree;
}

module.exports.setBinary_tree = setBinary_tree;
module.exports.getBinary_tree = getBinary_tree;
And here is how the variables are used from the external file:
const { getBinary_tree, setBinary_tree } = require('./path/variables.js');
var binary_tree = getBinary_tree();
// Do something with the variable.
// At the end, updates the variables.
setBinary_tree(binary_tree);
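For what it's worth, the reason this works is that require() caches modules: every file that requires './path/variables.js' gets the same module instance, so the getter and setter operate on one shared binary_tree. A small illustration (the file names a.js and b.js are hypothetical):
// a.js (hypothetical consumer)
const { setBinary_tree } = require('./path/variables.js');
setBinary_tree([1, 2, 3]);

// b.js (another hypothetical consumer) - sees the update, because require()
// hands the same cached variables.js instance to both files
const { getBinary_tree } = require('./path/variables.js');
console.log(getBinary_tree()); // [1, 2, 3], once a.js has run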

nodejs http listener argument must be a function

Attempting to pass a responseHandler from a require rather than defining it in the same file, but getting the error "listener" argument must be a function. console.log on the require's return value shows a function, so I don't see the issue?
var responseHandler = require("./downloader.js");
log(responseHandler); // Logs { responseHandler: [Function: responseHandler] }
request = https.get(fileUrl, responseHandler); // Error "listener" argument must be a function (according to the log line above, it is!?)
If I swap out line 1 for the contents of downloader.js all works fine...
Content of downloader.js is just
var responseHandler = function(response) {
    // some code to process response.statusCode
    response.on('data', function(chunk) { /* stuff */ });
    response.on('error', function(e) { /* stuff */ });
    response.on('end', function(e) { /* stuff */ });
};
exports.responseHandler = responseHandler;
I would like to keep the main file clean and small and have this working as a require, ideas?
If you want to only export the function you can do it with:
module.exports = responseHandler;
Then the imported value will be the function rather than an object with a function value:
var responseHandler = require("./downloader.js");
You will need to try doing:
request = https.get(fileUrl, responseHandler.responseHandler);
You're exporting an object that has a function called responseHandler on it, so you need to reference that property directly.
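If you prefer not to repeat the module name at the call site, a destructuring import does the same thing (a small sketch, assuming Node with ES2015+ support):
const { responseHandler } = require("./downloader.js");
request = https.get(fileUrl, responseHandler);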
Alternatively, you can export just the function, without a name:
module.exports = (response) => {
    // some code to process response.statusCode
    response.on('data', function(chunk) { /* stuff */ });
    response.on('error', function(e) { /* stuff */ });
    response.on('end', function(e) { /* stuff */ });
};
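With that style of export, the consuming side would then look something like this (sketch):
// require() now returns the function itself rather than an object
var responseHandler = require("./downloader.js");
var request = https.get(fileUrl, responseHandler);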

Subclassing, extending or wrapping Node.js 'request' module to enhance response

I'm trying to extend request in order to hijack and enhance its response and other 'body' params. In the end, I want to add some convenience methods for my API:
var myRequest = require('./myRequest');
myRequest.get(function(err, hijackedResponse, rows) {
console.log(hijackedResponse.metadata)
console.log(rows)
console.log(rows.first)
});
According to the Node docs on inherits, I thought I could make it work (and using the EventEmitter example in the docs works OK). I tried getting it to work using @Trott's suggestion, but realized that for my use case it's probably not going to work:
// myRequest.js
var inherits = require('util').inherits;
var Request = require("request").Request;

function MyRequest(options) {
    Request.call(this, options);
}

inherits(MyRequest, Request);

MyRequest.prototype.pet = function() {
    console.log('purr');
};

module.exports = MyRequest;
I've been toying with extend as well, hoping that I could find a way to intercept request's onRequestResponse prototype method, but I'm drawing blanks:
var extend = require('extend'),
request = require("request")
function myResponse() {}
extend(myResponse, request)
// maybe some magic happens here?
module.exports = myResponse
Ended up with:
var extend = require('extend'),
    Ok = require('objectkit').Ok;

function MyResponse(response) {
    var rows = Ok(response.body).getIfExists('rows');

    extend(response, {
        metadata: extend({}, response.body),
        rows: rows
    });

    response.first = (function() {
        return rows[0];
    })();

    response.last = (function() {
        return rows[rows.length - 1] || rows[0];
    })();

    delete response.metadata.rows;

    return response;
}

module.exports = MyResponse;
Keep in mind that in this example I cheated and wrote it all inside the .get() method. In my final wrapper module, I'm actually taking the method as a parameter.
UPDATED to answer the edited question:
Here's a rough template for the contents of your myResponse.js. It only implements get(). But as a bare bones, this-is-how-this-sort-of-thing-can-be-done demo, I hope it gets you going.
var request = require('request');

var myRequest = {};

myRequest.get = function (callback) {
    // hardcoding url for demo purposes only
    // could easily get it as a function argument, config option, whatever...
    request.get('http://www.google.com/', function (error, response, body) {
        var rows = [];
        // only checking error here but you might want to check the response code as well
        if (!error) {
            // mess with response here to add metadata. For example...
            response.metadata = 'I am awesome';
            // convert body to rows however you process that. I'm just hardcoding.
            // maybe you'll use JSON.parse() or something.
            rows = ['a', 'b', 'c'];
            // You can add properties to the array if you want.
            rows.first = 'I am first! a a a a';
        }
        // now fire the callback that the user sent you...
        callback(error, response, rows);
    });
};

module.exports = myRequest;
ORIGINAL answer:
Looking at the source code for the Request constructor, it requires an options object that in turn requires a uri property.
So you need to specify such an object as the second parameter in your call():
Request.call(this, {uri: 'http://localhost/'});
You likely don't want to hard code uri like that inside the constructor. You probably want the code to look something more like this:
function MyRequest(options) {
Request.call(this, options);
}
...
var myRequest = new MyRequest({uri: 'http://localhost/'});
For your code to work, you will also need to make sure util.inherits() is called above the declaration of MyRequest.prototype.pet(). It appears that util.inherits() clobbers any existing prototype methods on the first argument.
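Putting both points together, a rough sketch of the corrected constructor usage might look like this (the uri value is just a placeholder):
var inherits = require('util').inherits;
var Request = require('request').Request;

function MyRequest(options) {
    Request.call(this, options);
}

// Per the note above, call inherits() before attaching prototype methods
// so they don't get clobbered.
inherits(MyRequest, Request);

MyRequest.prototype.pet = function() {
    console.log('purr');
};

var myRequest = new MyRequest({ uri: 'http://localhost/' });
myRequest.pet(); // 'purr'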

How to use promise bluebird in nested for loop?

I need to use bluebird in my code and I have no idea how to use it. My code contains nested loops. When the user logs in, my code runs: it looks for any files under the user, and if there are files it loops through them to get each file's name, since the name is stored in a dictionary. Once it gets a name, it stores it in an array. Once all the names are stored, the array is passed along in res.render().
Here is my code:
router.post('/login', function(req, res) {
    var username = req.body.username;
    var password = req.body.password;
    Parse.User.logIn(username, password, {
        success: function(user) {
            var Files = Parse.Object.extend("File");
            var object = [];
            var query = new Parse.Query(Files);
            query.equalTo("user", Parse.User.current());
            var temp;
            query.find({
                success: function(results) {
                    for (var i = 0; i < results.length; i++) {
                        var file = results[i].toJSON();
                        for (var k in file) {
                            if (k === "javaFile") {
                                for (var t in file[k]) {
                                    if (t === "name") {
                                        temp = file[k][t];
                                        var getname = temp.split("-").pop();
                                        object[i] = getname;
                                    }
                                }
                            }
                        }
                    }
                }
            });
            console.log(object);
            res.render('filename', { title: 'File Name', FIles: object });
            console.log(object);
        },
        error: function(user, error) {
            console.log("Invalid username/password");
            res.render('logins');
        }
    });
});
EDIT: The code doesn't work; at both the first and second console.log(object) I get an empty array. I am supposed to get one item in that array, because I have one file saved.
JavaScript code is all parsed from top to bottom, but it doesn't necessarily execute in that order with asynchronous code. The problem is that you have the log statements inside the success callback of your login function, but they are NOT inside the query's success callback.
You have a few options:
Move the console.log statements inside of the inner success callback so that while they may be parsed at load time, they do not execute until both callbacks have been invoked.
Promisify functions that traditionally rely on and invoke callback functions, and hang then handlers off of the returned value to chain the promises together.
The first option is not using promises at all, but relying solely on callbacks. To flatten your code you will want to promisify the functions and then chain them.
I'm not familiar with the syntax you're using there with the success and error callbacks, nor am I familiar with Parse. Typically you would do something like:
query.find(someArgsHere, function(success, err) {
});
But then you would have to nest another callback inside of that, and another callback inside of that. To "flatten" the pyramid, we make the function return a promise instead, and then we can chain the promises. Assuming that Parse.User.logIn is a callback-style function (as is Parse.Query.find), you might do something like:
var Promise = require('bluebird');
var login = Promise.promisify(Parse.User.logIn);
var find = Promise.promisify(Parse.Query.find);

var outerOutput = [];

return login(yourArgsHere)
    .then(function(user) {
        return find(user.someValue);
    })
    .then(function(results) {
        var innerOutput = [];
        // do something with innerOutput or outerOutput and render it
    });
This should look familiar to synchronous code that you might be used to, except instead of saving the returned value into a variable and then passing that variable to your next function call, you use "then" handlers to chain the promises together. You could either create the entire output variable inside of the second then handler, or you can declare the variable output prior to even starting this promise chain, and then it will be in scope for all of those functions. I have shown you both options above, but obviously you don't need to define both of those variables and assign them values. Just pick the option that suits your needs.
You can also use Bluebird's promisifyAll() function to wrap an entire library with equivalent promise-returning functions. They will all have the same name of the functions in the library suffixed with Async. So assuming the Parse library contains callback-style functions named someFunctionName() and someOtherFunc() you could do this:
var Parse = Promise.promisifyAll(require("Parse"));

var promiseyFunction = function() {
    return Parse.someFunctionNameAsync()
        .then(function(result) {
            return Parse.someOtherFuncAsync(result.someProperty);
        })
        .then(function(otherFuncResult) {
            var something;
            // do stuff to assign a value to something
            return something;
        });
};
I have a few pointers. ... Btw tho, are you trying to use Parse's Promises?
You can get rid of those inner nested loops and make a few other changes:
Use some syntax like this to be more elegant:
// Example to filter/retrieve only the valid file objects (with dashes in the name)
var matchedFiles = results.filter(function _hasJavaFile(item) {
    return item && item.javaFile && item.javaFile.name // NOT NULL
        && item.javaFile.name.indexOf('-') > -1;       // and has a dash
});

// You could then use a map function like this to get the files into an array of just their names
var fileNames = matchedFiles.map(function _getJavaFile(item) {
    return item && item.javaFile && item.javaFile.name // NOT NULL
        && item.javaFile.name.split('-')[0];           // RETURN first part of name
});
And here is an example of using Parse's native promises (add the code above inside the then() callback, i.e. at line 4/5 of the snippet below; note that the then() function is effectively now your 'callback' handler):
var GameScore = Parse.Object.extend("GameScore");
var query = new Parse.Query(GameScore);
query.select("score", "playerName");
query.find().then(function(results) {
// each of results will only have the selected fields available.
});
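Putting those two pointers together, a rough, untested sketch of the original route using Parse's own promises might look like this (field and template names are taken from the question):
Parse.User.logIn(username, password).then(function(user) {
    var Files = Parse.Object.extend("File");
    var query = new Parse.Query(Files);
    query.equalTo("user", user);
    return query.find();
}).then(function(results) {
    // Keep only results that have a javaFile with a dash in its name,
    // then extract the last part of each name
    var fileNames = results
        .map(function(r) { return r.toJSON(); })
        .filter(function(f) { return f.javaFile && f.javaFile.name && f.javaFile.name.indexOf('-') > -1; })
        .map(function(f) { return f.javaFile.name.split('-').pop(); });
    res.render('filename', { title: 'File Name', FIles: fileNames });
}, function(error) {
    console.log("Invalid username/password");
    res.render('logins');
});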

node js overriding toString

I am trying to override the default toString method for my objects. Here is the code and the problem:
function test() {
    this.code = 0; // later on I will set these
    this.name = "";
}

test.prototype.toString = function() {
    // anotherFunction() is assumed to be defined elsewhere in the real code
    return this.name + "\t" + this.code + " \t " + this.anotherFunction();
};

var Lion = new test(); // the instance being logged below (its name gets set to 'jack')

console.log(Lion.toString()); // works correctly, i.e. calls my function
console.log(Lion);            // doesn't call my function. Prints { code: 0, name: 'jack' }
doesn't toString get called by default?
Came across this on Google before finding an answer I liked; here is what I ended up doing.
You can use inspect, and V8 (Chrome/Node.js) will use that for console.log() calls:
function Foo() {}

Foo.prototype.inspect = function() {
    return "[object Foo]";
};

console.log(new Foo());
Not always. Browsers like Chrome allow you to inspect the object (for debugging purposes) via console.log().
Try this:
console.log (''+Lion);
I was interested in how to do this too, once I saw how nicely Immutable.js prints out objects:
var Immutable = require('immutable');
var map = Immutable.Map({ a: 1, b: 2 });
console.log(map); // Map { "a": 1, "b": 2 }
After some source code scanning, I discovered they pull it off by adding both toString and inspect methods to an object's prototype. Here's the basic idea, more or less:
function Map(obj) {
    this.obj = obj;
}

Map.prototype.toString = function () {
    return 'Map ' + JSON.stringify(this.obj);
};

Map.prototype.inspect = function () {
    return this.toString();
};
Having both toString and inspect methods means that the object will get logged out correctly in node (using inspect), and will be correctly formatted as a string if necessary (using toString).
EDIT: This only applies to Node; browsers will still log out the object. If you don't want this, first convert it to a string, either by calling toString or by concatenating it with another string: console.log('' + map).
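In newer versions of Node the bare inspect method is deprecated in favour of a well-known symbol, so the same idea would look something like this (Map2 is used here just to avoid clashing with the built-in Map):
const util = require('util');

function Map2(obj) {
    this.obj = obj;
}

Map2.prototype.toString = function () {
    return 'Map ' + JSON.stringify(this.obj);
};

// Node's console.log() / util.inspect() look up this symbol when formatting the object
Map2.prototype[util.inspect.custom] = function () {
    return this.toString();
};

console.log(new Map2({ a: 1, b: 2 })); // Map {"a":1,"b":2}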
No. Adding something to prototype makes it available, but doesn't mean it will be called simply because you create an instance of the object to which it belongs.
For example:
function foo() {
    this.bar = 123; // note: a property, not a local variable, so show_bar can see it
}

foo.prototype.show_bar = function() {
    console.log(this.bar);
};

var x = new foo();  // does not log anything
x.show_bar();       // logs 123
I think your confusion is thinking that console.log() automatically tries to convert its parameter to a string. It doesn't; it can output arrays, objects, functions, etc.
