I want to split the code into different files. I currently write all get and post methods in the same file, but I want greater readability and manageability.
I've tried putting the code in different files, but when running the main app, the GET and POST methods defined in the other files cannot be called.
I include this:
var Db = require('/filename.js');
// ...but I can't call those methods.
I want to split my single file code for readability. How do I achieve this?
Just have a look at the module documentation:
A path starting with / is an absolute path, e.g.:
require('/home/user/module.js');
A path starting with ./ is resolved relative to the directory of the calling file:
require('./lib/module.js');
__dirname has the same effect as ./:
require( __dirname + '/module.js');
Try:
var Db = require('./filename.js');
or, if the file lives inside a node_modules folder (a bare name like this is resolved from node_modules, not relative to the calling file):
var Db = require('filename.js');
Related
I am writing Node.js at the moment and I was wondering which is better for requiring configuration:
In my main file I require conf.js only once and then pass it to the other files: require('./jwt')(config)
In every file where I need something from the config I require it
Which one is better? I think it's the first one, but I have some files that are used by the controllers (e.g. jwt.js - verify and create token). Is it best practice to require this module in the main file (where I don't need it) and pass the config, or to use the second way?
If your main file already requires all the other files, then the first option is better; there is no need to add
var LatLonModule = require('./conf.js');
in every file.
Otherwise, you can choose the second option.
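For illustration, here is a minimal sketch of the first approach; the contents of conf.js and jwt.js below are assumptions, not code from the question:
conf.js
// Loaded exactly once, by the main file.
module.exports = { secret: 'my-secret' };
jwt.js
// Export a factory that receives the config a single time.
module.exports = function (config) {
  return {
    createToken: function (payload) { /* sign using config.secret */ },
    verifyToken: function (token) { /* verify using config.secret */ }
  };
};
main.js
var config = require('./conf.js');
var jwt = require('./jwt.js')(config);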
I have a single directory with a few million json files in it. I ultimately want to iterate over each file in the directory, read it, do something with the information and then write something into a database.
My script works perfectly when I use a test directory with a few hundred files. However, it stalls when I use the real directory. I strongly believe that I have pinpointed the problem to the use of:
fs.readdirSync('my dir path')
Converting this to the async version (fs.readdir) would not help, since I need the file names before anything else can happen anyway. However, my belief is that this operation hangs because it simply "takes too long" to read the entire directory.
For reference here is a broader portion of the function:
var fs = require('fs');

function traverseFS() {
    var path = 'my dir name and path';
    var files = fs.readdirSync(path);
    for (var i = 0; i < files.length; i++) {
        var currentFile = path + '/' + files[i];
        var fileText = fs.readFileSync(currentFile, 'utf8');
        var json = JSON.parse(fileText);
        if (json) {
            // do something
        }
    }
}
My question is either:
Is there something I can do to get this to work using readdirSync?
Is there another operation I should be using?
You would need to either use a child process (easiest) that creates a directory listing and parse that, or write your own streamable binding to scandir() (on *nix) and/or whatever the equivalent is on Windows, and use that. For the latter, you may want to use the libuv code (*nix, Windows) as a guide.
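For illustration, here is a minimal sketch of the child-process approach on *nix, spawning ls and reading its output line by line; the directory path is a placeholder:
var spawn = require('child_process').spawn;
var readline = require('readline');

// `ls -1` prints one entry per line, which we can consume as a stream.
var ls = spawn('ls', ['-1', '/my/dir/path']);
var rl = readline.createInterface({ input: ls.stdout });

rl.on('line', function (fileName) {
  // File names arrive one at a time, so memory use stays flat
  // even with millions of entries; process each file here.
});

ls.on('close', function (code) {
  console.log('ls exited with code ' + code);
});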
I am using Node.js v6.3.1 and ncp v2.0.0
I can only get ncp to copy the contents of a directory, but not a single file within that directory.
Here is the code copying the contents of a directory recursively that works:
var ncp = require("ncp").ncp;
ncp("source/directory/", "destination/directory/", callback);
...and here is the same code but with a file as the source:
var ncp = require("ncp").ncp;
ncp("source/directory/file.txt", "destination/directory/", callback);
From this, all I can think is that maybe ncp was specifically designed to copy directories recursively, not single files?
I had thought about using something like the fs module's read/write stream functions, but really, for consistency, I was hoping to stick with ncp.
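For reference, here is a minimal sketch of that stream-based alternative using the core fs module (paths are placeholders); this is the kind of event handling I was hoping to avoid:
var fs = require("fs");

var source = fs.createReadStream("source/directory/file.txt");
var dest = fs.createWriteStream("destination/directory/file.txt");

source.on("error", function (err) { /* handle read error */ });
dest.on("error", function (err) { /* handle write error */ });
dest.on("finish", function () { /* copy complete */ });

// Pipe the read stream into the write stream to perform the copy.
source.pipe(dest);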
Update:
I have found another package called node-fs-extra which does what I want without the need for me to add event handlers to the operations, like I would have to do with the fileSystem read/write solution.
Here is the code that is working:
var fsExtra = require("fs-extra");
fsExtra.copy("source/directory/file.txt", "destination/directory/file.txt", callback);
Obviously this is still inconsistent, but at least it is a little less verbose.
OK, I have figured out what I was doing wrong.
I was trying to copy a file into a directory, whereas I needed to copy and name the file inside a directory.
So here is my original code that does not work:
var ncp = require("ncp").ncp;
ncp("source/directory/file.txt", "destination/directory/", callback);
...and here is the fixed code working, notice the inclusion of a file name in the destination directory:
var ncp = require("ncp").ncp;
ncp("source/directory/file.txt", "destination/directory/file.txt", callback);
So it looks like ncp won't just take the file as-is; it needs you to specify the file name at the destination to copy successfully. I had assumed that it would just copy the file, with the same name, into the destination directory.
Let say I need to do
require('./config/globals.js');
in many files, what is the best way to do it? Writing this line in every file or there is some more elegant way?
Searching "node.js how to require in many files" returns answers to "node.js require all files in a folder" :(
If you have some global variables for your application that consists of multiple files and you want them to be accessible via all of the files in your application, define them as global variables in your main .js file:
server.js
global.myName = 'Carl';
require('./app1.js');
require('./app2.js');
In that scenario, both app1.js and app2.js will be able to read and write the variable myName.
Change the variables defined in your globals.js to follow the structure above, and it should achieve your goals - assuming I understood the question correctly.
EDIT: As seanhodges suggested, you could also keep the globals.js and edit accordingly:
server.js
require('./globals.js');
require('./app1.js');
require('./app2.js');
globals.js
global.myName = 'Carl';
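For example, app1.js could then look like this (its contents here are an assumption for illustration):
app1.js
// Reads the global defined in globals.js, which server.js required first.
console.log('Hello, ' + global.myName); // "Hello, Carl"
global.myName = 'Carla'; // the change is visible to app2.js as well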
I am new here and pretty new to Node.js. I got Express working fine, connecting to MySQL (database) is going fine, and socket io is working fine.
But I decided to split many of these features up into separate files, to keep my main JS file nice and clean. I made it possible to get variables from other js files back into my main.js script, either using exports or global. I find global easier to work with since most of them are functions. It's all working fine up to this point.
But now the issue that I am having: I'm loading 3 js files in my main.js file. I require the first js file, call the function that is in that js file, and store the result in a variable. That's going fine. But now the second js file is supposed to use or grab this variable, and that isn't working.
My question is, how do I make that work?
It is a matter of your design.
You should use module.exports to return a variable from a file.
Example:
file1.js
function someFunction() {return true;}
module.exports = someFunction();
main.js
console.log(require('./file1.js')); // true
So, if the second file depends on the variable from the first file, you may also require file1.js in the second one.
Or, export a function that accepts one parameter from the second file; the main file can then call that function with the variable obtained from the first file.
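A minimal sketch of that second approach, with illustrative file names:
file2.js
// Export a function that accepts the value as a parameter.
module.exports = function (valueFromFile1) {
  console.log('Received:', valueFromFile1);
};
main.js
var value = require('./file1.js'); // true, from the example above
require('./file2.js')(value);      // hand the value to the second file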