fs.writeFileSync function doesn't write to file when included as a module - node.js

Consider the following:
conversations.json: []
db.js:
let fs = require('fs');
let conversations = require('./conversations.json');

function addConversation(conversation) {
    console.log(conversations);
    conversations.push(conversation);
    try {
        fs.writeFileSync('conversations.json', JSON.stringify(conversations));
    }
    catch (err) {
        console.error('Parse/WriteFile Error', err);
    }
}

module.exports = {
    addConversation
};
app.js:
let database = require('./db.js');
database.addConversation({
    key1: '1233',
    key2: '433',
    key3: '33211'
});
Running:
node app.js
No error is raised and everything runs as expected. The problem is that conversations.json isn't updated once the addConversation function is called from app.js.
What's interesting is that when addConversation is called from within db.js itself, everything works and conversations.json is updated.
What am I missing?

What am I missing?
Probably, when db.js is loaded as a module, you're writing the file to the wrong directory.
When you do this:
fs.writeFileSync('conversations.json', JSON.stringify(conversations));
That writes conversations.json to the current working directory, which may or may not be your module's directory. If you want it written to your module's directory, which is where this:
let conversations = require('./conversations.json');
will read it from, then you need to use __dirname to build the appropriate path (and require the path module):
const path = require('path');
fs.writeFileSync(path.join(__dirname, 'conversations.json'), JSON.stringify(conversations));
require() automatically looks in the current module's directory when you use ./filename, but fs.writeFileSync() uses the current working directory, not your module's directory.
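Put together, a minimal sketch of db.js with the write anchored to the module's own directory (the same code as in the question, plus the path require; the conversationsPath constant is just an illustrative name):
let fs = require('fs');
let path = require('path');
let conversations = require('./conversations.json');

// Resolve the data file relative to this module, not the process cwd,
// so the write targets the same file that require() loaded above.
const conversationsPath = path.join(__dirname, 'conversations.json');

function addConversation(conversation) {
    conversations.push(conversation);
    try {
        fs.writeFileSync(conversationsPath, JSON.stringify(conversations));
    }
    catch (err) {
        console.error('Parse/WriteFile Error', err);
    }
}

module.exports = {
    addConversation
};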

Related

Exported object is empty

I've been struggling for two days with something that shouldn't block me.
Basically, I'm building a Node.js app that uses Express.
In my main file (located in my root folder), I'm exporting some variables/consts. For the purpose of the example, I've replaced them like this:
// ./index.js
const test = 'test'
module.exports = { test }
... some express initialization/routers
I then have another file that I want to use the "test" variable in, so I require my main file:
// ./aaa/bbb/ccc/test.js
const { test } = require('../../../index');
const myRouter = require('express').Router();
myRouter.get('/', function (req, res) {
    console.log(test); // undefined
});
I don't really know why it would be undefined, as I correctly exported it and "imported" it through my require statement.
I also tried logging the whole object that I should receive, and it's empty: {}
EDIT: my "main" script that I'm executing is indeed index.js, but I highly doubt that's the reason for the problem.
I can't really figure out what the problem could be, and I need to export some variables to access them across my project.
Thanks!
I think you did it right. The problem may be that ES6 features are not supported by your Node version. Try the explicit form: module.exports = { test: test }.
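For reference, a minimal sketch of that explicit form together with the matching require (the variable name indexExports is just illustrative):
// ./index.js
const test = 'test';
module.exports = { test: test };

// ./aaa/bbb/ccc/test.js
const indexExports = require('../../../index');
console.log(indexExports.test); // 'test' if the export arrived intact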

Requiring a folder and using exports.{function}

Currently I have this setup:
// index.js
var example = require('./folder');
and:
// folder/index.js
require('./more');
// folder/test.js
exports.thing = function () {
    console.log('test');
    return true;
};
But when I try in index.js to call example.thing I get:
example.thing is not a function
Is there any way to make it work? Cheers.
Requiring a whole directory's files is not supported by Node. When you require a directory, it loads only index.js, if it is present in the directory.
To export thing from the folder, do the following:
// folder/index.js
exports.thing = require('./test.js').thing;
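With that re-export in place, a minimal sketch of how the pieces fit together (folder/more.js omitted, as in the question):
// folder/test.js
exports.thing = function () {
    console.log('test');
    return true;
};

// folder/index.js
exports.thing = require('./test.js').thing;

// index.js
var example = require('./folder');
example.thing(); // logs 'test' and returns true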
Are you calling it right? Because I tried it and it's working.
You should call it like example.thing() and not example.thing.

pre-load / pre-require directories of .js route files

Using Express with Node.js, we might do something like this:
app.use('api/:controller/:action/:id', function (req, res, next) {
    var controller = req.params.controller;
    var action = req.params.action;
    var route = require('./routes/' + controller + '/' + action);
    route(req, res, next);
});
Now this is all fine and well, except for at least one problem: the route file is loaded dynamically at runtime if it has not been required yet, which makes that first request at least a little slower.
Does someone have a script that recurses through a directory and pre-loads/pre-requires all the .js files when a server first starts up?
I have a similar problem on the front-end with RequireJS. The solution there seems to be a bash script that writes all the .js file paths in a directory and its subdirectories to a text file; when the server starts up, it reads that text file and requires every file listed. Is that the best way to do it?
If you can use io.js, it can preload modules using command-line -r or --require:
iojs -r <module_name> server.js
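If you'd rather do the preloading inside the app itself, here is a minimal sketch of a recursive walk that requires every .js file under a directory once, so later require() calls are served from Node's module cache (the function name and the ./routes folder are illustrative, matching the question's layout):
var fs = require('fs');
var path = require('path');

// Recursively require every .js file under dir so that later
// require() calls hit the module cache instead of the disk.
function preloadDir(dir) {
    fs.readdirSync(dir).forEach(function (entry) {
        var full = path.join(dir, entry);
        if (fs.statSync(full).isDirectory()) {
            preloadDir(full);
        } else if (path.extname(full) === '.js') {
            require(full);
        }
    });
}

preloadDir(path.join(__dirname, 'routes'));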
I created an NPM module that does this for the front-end; doing it for Node.js / CommonJS is another story.
https://www.npmjs.com/package/requirejs-metagen
You can use it like so:
var grm = require('requirejs-metagen'); // you can use it with Gulp
var controllersOpts = {
    inputFolder: './public/static/app/js/controllers/all',
    appendThisToDependencies: 'app/js/controllers/',
    appendThisToReturnedItems: '',
    eliminateSharedFolder: true,
    output: './public/static/app/js/meta/allControllers.js'
};
grm(controllersOpts, function (err) {
    // handle errors your own way
});
It generates a corresponding AMD/RequireJS module like so:
define([
    "app/js/controllers/all/jobs",
    "app/js/controllers/all/users"
],
function () {
    return {
        "jobs": arguments[0],
        "users": arguments[1]
    };
});
You can also require subdirectories and so on, like this:
var allViewsOpts = {
    inputFolder: './public/static/app/js/jsx',
    appendThisToDependencies: 'app/js/',
    appendThisToReturnedItems: '',
    eliminateSharedFolder: true,
    output: './public/static/app/js/meta/allViews.js'
};
grm(allViewsOpts);
Which generates output like so (note that each key maps to the same position in the dependency list, so "reactComponents/Picture" is arguments[8], "reactComponents/todoList" is arguments[13], and so on):
define([
    "app/js/jsx/BaseView",
    "app/js/jsx/reactComponents/FluxCart",
    "app/js/jsx/reactComponents/FluxCartApp",
    "app/js/jsx/reactComponents/FluxProduct",
    "app/js/jsx/reactComponents/Item",
    "app/js/jsx/reactComponents/Job",
    "app/js/jsx/reactComponents/JobsList",
    "app/js/jsx/reactComponents/listView",
    "app/js/jsx/reactComponents/Picture",
    "app/js/jsx/reactComponents/PictureList",
    "app/js/jsx/reactComponents/RealTimeSearchView",
    "app/js/jsx/reactComponents/Service",
    "app/js/jsx/reactComponents/ServiceChooser",
    "app/js/jsx/reactComponents/todoList",
    "app/js/jsx/relViews/getAll/getAll",
    "app/js/jsx/relViews/jobs/jobsView",
    "app/js/jsx/standardViews/dashboardView",
    "app/js/jsx/standardViews/overviewView",
    "app/js/jsx/standardViews/pictureView",
    "app/js/jsx/standardViews/portalView",
    "app/js/jsx/standardViews/registeredUsersView",
    "app/js/jsx/standardViews/userProfileView"
],
function () {
    return {
        "BaseView": arguments[0],
        "reactComponents/FluxCart": arguments[1],
        "reactComponents/FluxCartApp": arguments[2],
        "reactComponents/FluxProduct": arguments[3],
        "reactComponents/Item": arguments[4],
        "reactComponents/Job": arguments[5],
        "reactComponents/JobsList": arguments[6],
        "reactComponents/listView": arguments[7],
        "reactComponents/Picture": arguments[8],
        "reactComponents/PictureList": arguments[9],
        "reactComponents/RealTimeSearchView": arguments[10],
        "reactComponents/Service": arguments[11],
        "reactComponents/ServiceChooser": arguments[12],
        "reactComponents/todoList": arguments[13],
        "relViews/getAll/getAll": arguments[14],
        "relViews/jobs/jobsView": arguments[15],
        "standardViews/dashboardView": arguments[16],
        "standardViews/overviewView": arguments[17],
        "standardViews/pictureView": arguments[18],
        "standardViews/portalView": arguments[19],
        "standardViews/registeredUsersView": arguments[20],
        "standardViews/userProfileView": arguments[21]
    };
});
I need to update the library so it returns the stream, so you can handle when it completes; otherwise it works great.

grunt + mochaTest: Change working directory?

I'm trying to implement testing for my Node.js project with grunt-mocha-test and have issues with different/incorrect paths.
As I've seen it done elsewhere, I want to pull in all dependencies by just requiring my server.js.
gruntfile.js
mochaTest: {
    test: {
        options: {
            reporter: 'spec',
            require: 'app/server.js'
        },
        src: ['app/test/**/*.js']
    }
}
My current project structure looks like this:
gruntfile.js
app/server.js
app/models/..
app/controllers/..
app/tests/..
users.controller.test.js
var userCtl = require('../controllers/users.controller');

describe("return5", function () {
    it("should return 5", function () {
        var result = userCtl.return5(null, null);
        expect(result).toBe(5);
    });
});
users.controller.js
var mongoose = require('mongoose');
var User = mongoose.model('User'); // <- Mocha crash: Schema hasn't been registered for model "User".
..
In my server.js I use:
..
// config.js: https://github.com/meanjs/mean/blob/master/config/config.js
config.getGlobbedFiles('./models/**/*.js').forEach(function (path) {
    require(path); // never called with mochaTest
});
..
console.log(process.cwd()); // "C:\path\project" (missing /app)
..
So the cwd is different from what it should be.
Can someone please help me get around this issue?
I will clarify the title as soon as I know what I'm doing wrong.
Thank you.
The confusion is due to the difference between module paths and filesystem paths.
When you do require("./blah"), the . is interpreted to mean "start with the path of the current module". Since this is relative to the module you are currently in, it will resolve to different values depending on where the module is located.
When you run process.cwd() this is returning the current working directory of the process. This does not change from module to module. It changes when your code calls process.chdir(). Also, when you perform filesystem operations that use ., this is interpreted relative to process.cwd().
So it's not surprising that you get C:\path\project from process.cwd(), since that's where you'd typically run Grunt (i.e. at the top level of your project). What you can do if you want paths relative to a module is use __dirname. For instance, this code reads files from a foo subdirectory in the same location where the module that contains this code is located:
var path = require("path");
var fs = require("fs");
var subdir = path.join(__dirname, "foo");
var foofiles = fs.readdirSync(subdir);
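Applied to the question, the glob in server.js can be anchored the same way. A sketch, assuming getGlobbedFiles accepts an absolute pattern (it is the asker's own helper, so check its actual signature):
// In app/server.js: build the pattern from __dirname instead of '.',
// so it resolves the same way no matter where Grunt was started.
config.getGlobbedFiles(__dirname + '/models/**/*.js').forEach(function (p) {
    require(p);
});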

Can I load multiple files with one require statement?

Maybe this question is a little silly, but is it possible to load multiple .js files with one require statement? Like this:
var mylib = require('./lib/mylibfiles');
and use:
mylib.foo(); //return "hello from one"
mylib.bar(); //return "hello from two"
And in the folder mylibfiles will have two files:
One.js
exports.foo= function(){return "hello from one";}
Two.js
exports.bar= function(){return "hello from two";}
I was thinking of putting a package.json in the folder that tells Node to load all the files, but I don't know how. Another approach I was thinking of is to have an index.js that exports everything again, but then I'd be duplicating work.
Thanks!!
P.S.: I'm working with Node.js v0.6.11 on a Windows 7 machine.
First of all, using require does not duplicate anything. It loads the module and caches it, so calling require again will get it from memory (thus you can modify the module on the fly without touching its source code; this is sometimes desirable, for example when you want to store a db connection inside the module).
Also, package.json does not load anything and does not interact with your app at all. It is only used by npm.
Now, you cannot require multiple modules at once. For example, what would happen if both One.js and Two.js defined a function with the same name? There are more problems.
But what you can do is write an additional file, say modules.js, with the following content:
module.exports = {
    one: require('./one.js'),
    two: require('./two.js'),
    /* some other modules you want */
};
and then you can simply use
var modules = require('./modules.js');
modules.one.foo();
modules.two.bar();
I have a snippet of code that requires more than one module, but it doesn't clump them together as your post suggests. However, that can be overcome with a trick that I found.
function requireMany() {
    return Array.prototype.slice.call(arguments).map(function (value) {
        try {
            return require(value);
        }
        catch (event) {
            return console.log(event);
        }
    });
}
And you use it as such
requireMany("fs", "socket.io", "path")
Which will return
[ fs {}, socketio {}, path {} ]
If a module is not found, the error is logged to the console but it won't break the program. The failed module shows up in the array as undefined; the array will not be shorter because one of the modules failed to load.
Then you can bind each of those array elements to a variable name, like so:
var [fs, socketio, path] = requireMany("fs", "socket.io", "path")
Array destructuring assigns each returned module to its own local variable. So, in your case, you could do:
var [foo, bar] = requireMany("./foo.js", "./bar.js")
foo() //return "hello from one"
bar() //return "hello from two"
And if you do want it to break the program on error, just use this modified version, which is smaller:
function requireMany() {
    return Array.prototype.slice.call(arguments).map(require)
}
Yes, you may require a folder as a module, according to the node docs. Let's say you want to require() a folder called ./mypack/.
Inside ./mypack/, create a package.json file with the name of the folder and a "main" entry pointing at the main JavaScript file, here placed inside a ./lib/ directory:
{
    "name": "mypack",
    "main": "./lib/mypack.js"
}
Now you can use require('./mypack') and node will load ./mypack/lib/mypack.js.
However, if you do not include the package.json file, it may still work. Without the file, node will attempt to load ./mypack/index.js, or if that's not there, ./mypack/index.node.
My understanding is that this could be beneficial if you have split your program into many javascript files but do not want to concatenate them for deployment.
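Applied back to the original question, the package.json-free fallback would look like this sketch (mylibfiles/index.js is the only new file; One.js and Two.js stay unchanged):
// mylibfiles/index.js -- loaded automatically by require('./lib/mylibfiles')
exports.foo = require('./One.js').foo;
exports.bar = require('./Two.js').bar;

// usage elsewhere:
var mylib = require('./lib/mylibfiles');
mylib.foo(); // "hello from one"
mylib.bar(); // "hello from two"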
You can use destructuring assignment to map an array of exported modules from require statements in one line:
const requires = (...modules) => modules.map(module => require(module));
const [fs, path] = requires('fs', 'path');
I was doing something similar to what @freakish suggests in his answer, in a project where I have a list of test scripts that are pulled into a Puppeteer + Jest testing setup. My test files follow the naming convention testname1.js - testnameN.js, and I was able to use a generator function to require N files from the particular directory with the approach below:
const fs = require('fs');
const path = require('path');

module.exports = class FilesInDirectory {
    constructor(directory) {
        this.fid = fs.readdirSync(path.resolve(directory));
        this.requiredFiles = this.fid.map((fileId) => {
            let resolvedPath = path.resolve(directory, fileId);
            return require(resolvedPath);
        }).filter(file => !!file);
    }

    printRetrievedFiles() {
        console.log(this.requiredFiles);
    }

    nextFileGenerator() {
        const parent = this;
        const fidLength = parent.requiredFiles.length;
        function* iterate(index) {
            while (index < fidLength) {
                yield parent.requiredFiles[index++];
            }
        }
        return iterate(0);
    }
};
Then use like so:
//Use in test
const FilesInDirectory = require('./utilities/getfilesindirectory');
const StepsCollection = new FilesInDirectory('./test-steps');
const StepsGenerator = StepsCollection.nextFileGenerator();
//Assuming we're in an async function
await StepsGenerator.next().value.FUNCTION_REQUIRED_FROM_FILE(someArg);
