Exported object is empty - node.js

I've been struggling for two days with something that shouldn't be blocking me.
Basically, I'm building a Node.js app that uses Express.
In my main file (located in my root folder), I'm exporting some variables/consts; for the purpose of the example I've replaced them like this:
// ./index.js
const test = 'test'
module.exports = { test }
... some express initialization/routers
I then have another file that I want to use the "test" variable in, so I require my main file:
// ./aaa/bbb/ccc/test.js
const { test } = require('../../../index');
const myRouter = require('express').Router();

myRouter.get('/', function (req, res) {
    console.log(test); // undefined
});
I don't really know why it would be undefined, as I correctly exported it and "imported" it through my require statement.
I also tried logging the whole object that I should receive, and it's empty: {}
EDIT: the "main" script that I'm executing is indeed index.js, but I highly doubt that's the cause of the problem.
I can't really figure out what the problem could be, and I need to export these variables to access them elsewhere in my project.
Thanks!

I think you did it right. The problem may be that ES6 features are not supported by your Node version. Try it like: module.exports = { test: test }.
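For reference, a minimal sketch of how that index.js could look with the explicit syntax; this is an assumed layout, not the asker's actual file, and it also puts the export before the Express/router setup, which avoids the circular-require pitfall covered in the answers below (it assumes ./aaa/bbb/ccc/test.js does module.exports = myRouter):

// ./index.js -- sketch only, assumed layout
const test = 'test';

// Export first: if a router file requires index.js back,
// it already sees a populated object instead of an empty {}.
module.exports = { test: test };

// Express initialization and routers come after the export.
const express = require('express');
const app = express();
app.use('/', require('./aaa/bbb/ccc/test')); // assumes test.js exports its router
app.listen(3000);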

Related

Understand the behavior of require in NodeJS

I had two JavaScript files in a Node project.
One was a router which had routes and their corresponding functions in it.
The other was a utility class where commonly used functions are written on the prototype, say utilService.getSomething().
The first file calls utilService.getSomething() from the second file.
I imported the first file in the second file (which is unnecessary and unused). Then I called an API in the first file.
I got a 500 error stating utilService.getSomething() is not a function.
I spent many hours thinking something went wrong with my use of Promise, tried async and await, and landed on the same error.
At the very end, I removed the import and found the API call working fine.
I was under the impression that require is just for importing methods from another script, but there's something beyond that. There are resources online that explain the purpose of require, but I'd like to understand this behavior.
Sample:
File1.js
const utilService = require('../utils/utilService');
const router = require('express').Router(); // assumed Express router setup

router.get('/something', function (req, res) {
    utilService.getSomething().then((data) => {
        // do something
    });
});
File2.js
const file = require('../file1'); // unnecessary and unused import
function util() {}
util.prototype.getSomething = function () {
    return "hello";
};
module.exports = new util();
I hit the /something API and got "utilService.getSomething is not a function".
While require is usually used to import other modules and scripts, it also executes any code in those modules. This allows a module to overwrite properties of other modules and prototypes of objects. A module doesn't need a valid module.exports in order to be required.
Ex:
File1.js
module.exports = {
    foo: () => {
        console.log("hello");
    }
};
File2.js
const mod = require("./File1.js");
delete mod.foo;
index.js
const mod = require("./File1.js");
mod.foo(); // hello
require("./File2.js"); // undefined
mod.foo(); // mod.foo is not a function
Congrats, you have landed in the world of circular dependencies. Instead of just telling you what that is, I'll give you an example.
Say you had two files, file1.js and file2.js.
file1.js:
module.exports = {
    doSomething: function() {
        console.log("did something");
    }
};
file2.js:
const util = require("./file1");
util.doSomething();
file1.js exported a function called doSomething, and file2.js called it. So currently, the dependency tree looks like this:
file2.js -> requires -> file1.js
Now problems would start happening when you modify file1.js to this:
const file2 = require("./file2");
module.exports = {
    doSomething: function() {
        console.log("did something");
    }
};
Why? Well, let's look at the dependency tree now.
file2.js -> requires -> file1.js -> requires -> file2.js -> requires -> file1.js
See the problem? file2.js requires file1.js, which requires file2.js, which in turn requires file1.js (and so on). Instead of getting stuck in an infinite loop, Node.js hands back the current state of module.exports from file1.js, which is still an empty object because file1.js hasn't reached its module.exports assignment yet when it gets stuck on require("./file2"), and that's how you got the TypeError.
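If the cycle can't be removed entirely, one common workaround is to assign module.exports before the require that closes the loop. A minimal sketch of file1.js with that ordering, reusing the example code above:

// file1.js -- export before closing the cycle
module.exports = {
    doSomething: function() {
        console.log("did something");
    }
};

// By the time this runs, module.exports is already populated, so even
// though the circular require still happens, file2.js receives a
// finished object and util.doSomething is a function.
const file2 = require("./file2");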
In Node.js, require lets you import modules from other files.
It helps you split your application and load only the files needed to run it, instead of loading bunches of unused code for nothing.
It actually exists in many other languages under different names: require (PHP), include (PHP), import (TypeScript, Python, etc.).
One other thing: to be able to use require properly, you must "export" modules, definitions, functions, classes, etc. in order to use them in another script, using "module.exports" or "exports". So basically it has to do with "context", or if you prefer, "scope".
This is why there's a folder named "node_modules" when you do an "npm install" using a package.json.
Thanks to require, you can import only what is necessary, and not the whole project's files every time you do something in your app.

Code in required function doesn't execute correctly?

I'm having trouble getting an exported function to run.
My secondary file looks like this:
require("dotenv").config()
// ...
function requestConsent(req, res) {
    res.redirect(308, process.env.MY_URL + dictToURI(myHTTPparams));
}
// ...
function dictToURI(d) {
    doSomeStuff(d);
}
// ...
exports.requestConsent = requestConsent
My main file looks like this:
const api = require("./api.js")
const express = require("express")
const app = express()
// ...
app.get("/login", api.requestConsent);
// ...
When I execute this, dictToURI works perfectly fine, but process.env.MY_URL always evaluates to undefined, even when I replace it with a string literal. It worked without issues before I moved the functions to a separate file.
How can I fix this? (Aside from moving it back)
As the function is in a required file, any breakpoints I set there are skipped and any console output is not visible (I'm using WebStorm 2018.3).
The problem wasn't in my code at all: my browser had cached a broken forwarding, therefore the code in question was never called in the first place.
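If it is the permanent 308 redirect that got cached, one hedged tweak (not part of the original fix) is to use a temporary redirect status while developing, so the browser doesn't hold on to a broken forwarding:

// Sketch, assuming the same handler as above: 307 is a temporary
// redirect, so the browser won't cache the forwarding permanently.
function requestConsent(req, res) {
    res.redirect(307, process.env.MY_URL + dictToURI(myHTTPparams));
}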

return value to app.js file in node js

I have two files, one called filename.js and the second called app.js; both files are on the server side. From the filename.js file I am returning a string value from the ensureAuthentication method to the app.js file, so I export the function:
function ensureAuthentication(){
    return 'tesstestest';
}
exports.ensureAuthentication = ensureAuthentication;
In the app.js file I do the following:
var appjs = require('filename');
console.log(appjs.ensureAuthentication);
The result is always undefined in the console. Why is that, any idea?
You should try this in your app.js -
var login = require('filename');
console.log(login());
or you can use this :
var login = require('filename')();
console.log(login);
Explanation: whenever you export a function, you still need to call it to get its return value.
Try this:
var appjs = require('filename');
console.log(appjs.ensureAuthentication());
Note the () after the function name. This will execute your function, and the console.log() call will then print the returned value.
Try this, and make sure both files are in the same directory. You have a few errors in your code: missing brackets, and not importing correctly in app.js.
filename.js
function ensureAuthentication(){ // You are missing the brackets here.
    return 'tesstestest';
}
exports.ensureAuthentication = ensureAuthentication;
app.js
var appjs = require('./filename'); // You are missing the ./ here.
console.log(appjs.ensureAuthentication()); // Logs 'tesstestest'
Two problems with your code:
You need to require with a relative path (notice the ./):
var appjs = require('./filename');
To get the string value you need to invoke ensureAuthentication as a function:
console.log(appjs.ensureAuthentication());
UPDATE
This update addresses the screenshot posted in the comments, where you have the following line:
module.exports = router
That assigns a different exports object to the module. So your local reference to exports is no longer the same object.
Change that line to
module.exports = exports = router
which will preserve the reference to exports that you use afterwards.
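A short sketch of why that matters; the file layout is assumed from the screenshot description, so treat it as illustrative only:

// filename.js -- assumed layout
var router = require('express').Router(); // assumption: router comes from Express

module.exports = router;               // exports still points at the old, now-orphaned {}
// module.exports = exports = router; // fix: keep both references on the same object

function ensureAuthentication() {
    return 'tesstestest';
}
exports.ensureAuthentication = ensureAuthentication;
// With the first line above, this property lands on the orphaned object,
// so require('./filename').ensureAuthentication is undefined.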
Here's the working code:
filename.js
function ensureAuthentication(){
    return 'tesstestest';
}
module.exports = {
    ensureAuthentication : ensureAuthentication
}
app.js
var appjs = require('./filename');
console.log(appjs.ensureAuthentication());

can not find required file

I have a Node file used in Express which cannot see a required file within a function, but when setting breakpoints, I can see it defined right after the declaration. The other variable, auth, can be seen fine in both places.
var auth = require('../utilities/auth');
var index = require('../utilities/index');
// here, I set a breakpoint and index is defined
module.exports = {
    create : function (req, res) {
        // in here, I set a breakpoint, and index is not defined
And I'm pretty sure I have the paths correct. The snippet above is from user.js
A more complete snippet is here: https://gist.github.com/foobar8675/eb5ec78461dff59a80d1
Any suggestions are appreciated!
I would be wary of using the name index.js in this context as it has special significance when modules are resolved. index.js would normally be called when require is passed a folder name, i.e.
var utilities = require('../utilities');
Can't be sure, but try changing the name of the file to something else like indexhelper.js and see what happens.
Update
I just ran a test in response to your screencast and I think I can now see your problem. Your invisible require vars are not referenced inside the module.exports functions and are thus not being captured in their closures. I ran the following snippet and saw the exact same phenomenon inside the debugger.
var mod1 = require("./mod1");
var mod2 = require("./mod2");
// both mod1 and mod2 are visible here
module.exports = {
    init : function() {
        // mod1 is not referenced, so only mod2
        // is available as a local scope variable in the debugger
        mod2.init();
        console.log("module 3 initialised")
    }
};
So in summary, I don't really think you have a problem here. Just reference the variable inside module.exports and it will be captured.
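For completeness, a sketch of the same snippet with mod1 actually referenced inside init, which is enough for the debugger to capture both variables (the mod1.init() call is hypothetical and assumes mod1 also exposes init()):

var mod1 = require("./mod1");
var mod2 = require("./mod2");

module.exports = {
    init : function() {
        // Both requires are referenced here, so both are captured in the
        // closure and show up as local scope variables in the debugger.
        mod1.init(); // hypothetical call
        mod2.init();
        console.log("module 3 initialised");
    }
};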
See also: In what scope are module variables stored in node.js?

Can I load multiple files with one require statement?

Maybe this question is a little silly, but is it possible to load multiple .js files with one require statement? Like this:
var mylib = require('./lib/mylibfiles');
and use:
mylib.foo(); //return "hello from one"
mylib.bar(); //return "hello from two"
And in the folder mylibfiles will have two files:
One.js
exports.foo= function(){return "hello from one";}
Two.js
exports.bar= function(){return "hello from two";}
I was thinking of putting a package.json in the folder that says to load all the files, but I don't know how. Another approach I was thinking of is to have an index.js that exports everything again, but then I'd be duplicating work.
Thanks!!
P.S.: I'm working with Node.js v0.611 on a Windows 7 machine.
First of all, using require does not duplicate anything. It loads the module and caches it, so calling require again will get it from memory (thus you can modify a module on the fly without touching its source code; this is sometimes desirable, for example when you want to store a db connection inside a module).
Also package.json does not load anything and does not interact with your app at all. It is only used for npm.
Now, you cannot require multiple modules at once. For example, what would happen if both One.js and Two.js defined a function with the same name? There are more problems.
But what you can do is write an additional file, say modules.js, with the following content:
module.exports = {
    one : require('./one.js'),
    two : require('./two.js'),
    /* some other modules you want */
}
and then you can simply use
var modules = require('./modules.js');
modules.one.foo();
modules.two.bar();
I have a snippet of code that requires more than one module, but it doesn't clump them together as your post suggests. However, that can be overcome with a trick that I found.
function requireMany () {
    return Array.prototype.slice.call(arguments).map(function (value) {
        try {
            return require(value)
        }
        catch (event) {
            return console.log(event)
        }
    })
}
And you use it as such
requireMany("fs", "socket.io", "path")
Which will return
[ fs {}, socketio {}, path {} ]
If a module is not found, an error will be logged to the console, but it won't break the programme; the failed module simply shows up in the array as undefined, so the array is never shorter just because one of the modules failed to load.
Then you can bind those each of those array elements to a variable name, like so:
var [fs, socketio, path] = requireMany("fs", "socket.io", "path")
It essentially works like destructuring an object: each element of the returned array is assigned to the corresponding variable name. So, in your case, you could do:
var [foo, bar] = requireMany("./foo.js", "./bar.js")
foo() //return "hello from one"
bar() //return "hello from two"
And if you do want it to break the programme on error, just use this modified version, which is smaller
function requireMany () {
    return Array.prototype.slice.call(arguments).map(require)
}
Yes, you may require a folder as a module, according to the node docs. Let's say you want to require() a folder called ./mypack/.
Inside ./mypack/, create a package.json file with the name of the folder and a main javascript file with the same name, inside a ./lib/ directory.
{
"name" : "mypack",
"main" : "./lib/mypack.js"
}
Now you can use require('./mypack') and node will load ./mypack/lib/mypack.js.
However if you do not include this package.json file, it may still work. Without the file, node will attempt to load ./mypack/index.js, or if that's not there, ./mypack/index.node.
My understanding is that this could be beneficial if you have split your program into many javascript files but do not want to concatenate them for deployment.
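The index.js approach the asker mentions doesn't have to duplicate any implementations; it only needs to re-export. A minimal sketch, assuming the folder layout from the question:

// ./lib/mylibfiles/index.js -- sketch: re-export One.js and Two.js
module.exports = {
    foo: require('./One.js').foo,
    bar: require('./Two.js').bar
};

// usage, as in the question:
// var mylib = require('./lib/mylibfiles');
// mylib.foo(); // "hello from one"
// mylib.bar(); // "hello from two"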
You can use destructuring assignment to map an array of exported modules from require statements in one line:
const requires = (...modules) => modules.map(module => require(module));
const [fs, path] = requires('fs', 'path');
I was doing something similar to what #freakish suggests in his answer, with a project where I have a list of test scripts that are pulled into a Puppeteer + Jest testing setup. My test files follow the naming convention testname1.js - testnameN.js, and I was able to use a generator function to require N files from a particular directory with the approach below:
const fs = require('fs');
const path = require('path');
module.exports = class FilesInDirectory {
    constructor(directory) {
        this.fid = fs.readdirSync(path.resolve(directory));
        this.requiredFiles = (this.fid.map((fileId) => {
            let resolvedPath = path.resolve(directory, fileId);
            return require(resolvedPath);
        })).filter(file => !!file);
    }

    printRetrievedFiles() {
        console.log(this.requiredFiles);
    }

    nextFileGenerator() {
        const parent = this;
        const fidLength = parent.requiredFiles.length;
        function* iterate(index) {
            while (index < fidLength) {
                yield parent.requiredFiles[index++];
            }
        }
        return iterate(0);
    }
}
Then use like so:
//Use in test
const FilesInDirectory = require('./utilities/getfilesindirectory');
const StepsCollection = new FilesInDirectory('./test-steps');
const StepsGenerator = StepsCollection.nextFileGenerator();
//Assuming we're in an async function
await StepsGenerator.next().value.FUNCTION_REQUIRED_FROM_FILE(someArg);
