I am new here and pretty new to Node.js. I have Express working, the MySQL database connection is working, and Socket.IO is working.
I decided to split many of these features into separate files to keep my main JS file nice and clean. I can already get variables from other JS files back into my main.js script, either using exports or global. I find global easier since most of them are functions. It's all working fine up to this point.
Now the issue I'm having: I'm loading three JS files in my main.js file. I require the first JS file, call the function in it, and store the result in a variable. That works. But the second JS file is supposed to use or grab this variable, and that isn't working.
My question is, how do I make that work?
It is a matter of your design.
You should use module.exports to return a variable from a file.
Example:
file1.js
function someFunction() {return true;}
module.exports = someFunction();
main.js
console.log(require('./file1.js')); // true
So, if the second file depends on the variable from the first file, you can also require file1.js in the second one.
Alternatively, export a function that accepts one parameter from the second file; the main file can then call that function with the variable it got from the first file, as sketched below.
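A minimal sketch of that second approach, with hypothetical file names:
file2.js
// Export a function that receives the value produced by file1.js.
module.exports = function (valueFromFile1) {
  console.log('file2 received:', valueFromFile1);
};
main.js
var result = require('./file1.js');   // true, from the example above
var useResult = require('./file2.js');
useResult(result);                    // pass the variable explicitly
This way the second file never has to know where the variable comes from; main.js stays in charge of wiring the pieces together.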
I am writing Node.js at the moment and I was wondering which approach is better for requiring configuration:
In my main file I require conf.js only once and then pass it to the other files: require('./jwt')(config)
In every file where I need something from the config, I require it there
Which one is better? I think it's the first one, but I have some files that are used by the controllers (e.g. jwt.js, which verifies and creates tokens). Is it best practice to require this module in the main file (where I don't need it) and pass the config in, or to use the second way?
If your other files are already wired up through the main file, then the first option is better: there is no need to add
var LatLonModule = require('conf.js');
in every file.
Otherwise, you can go with the second option.
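A minimal sketch of the first approach, with hypothetical module names (the jwt.js body is just a placeholder):
conf.js
module.exports = { secret: 'change-me' };
jwt.js
// Export a factory that receives the config once and returns the helpers.
module.exports = function (config) {
  return {
    createToken: function (payload) {
      // sign the payload using config.secret here
    },
    verifyToken: function (token) {
      // verify the token using config.secret here
    }
  };
};
main.js
var config = require('./conf');
var jwt = require('./jwt')(config);   // pass the config in exactly once
The trade-off is explicitness: with the factory style the dependency on the config is visible at the call site, while requiring conf.js everywhere is less typing but hides who actually uses it.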
I used the mem-fs-editor library (https://github.com/sboudrias/mem-fs-editor) in a Yeoman generator a few weeks ago. It worked nicely, but now I'm trying to use it again in a different context and I can't get it to do anything. Note: I used it because it is the library Yeoman provides to handle the file system.
In Yeoman generators we can copy files from a template folder to a different folder, passing values to inject into the code. That's precisely what I need, but I can't use Yeoman this time.
I tried the same code I used in my Yeoman generator, but it doesn't work, so I'm not sure how mem-fs works. No errors are thrown, and even the code provided by the author of the project doesn't work for me.
I tried this (and some other things with copyTpl) with no success:
var memFs = require('mem-fs');
var editor = require('mem-fs-editor');
var store = memFs.create();
var fs = editor.create(store);
console.log(fs.write('./somefile.js', 'var a = 1;'));
Does anyone know how it works or what else I can do to make this happen?
mem-fs-editor author here.
mem-fs stands for memory file system. All the files you create are stored in memory and won't be written to disk until you call:
editor.commit(callback);
Yeoman does that automatically for you. It works this way so Yeoman can gather every file change together and prompt for file conflicts only once (rather than every time a single file is written).
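So the snippet from the question only needs a commit at the end. A minimal sketch reusing the same setup:
var memFs = require('mem-fs');
var editor = require('mem-fs-editor');

var store = memFs.create();
var fs = editor.create(store);

// This stages the file in the in-memory store only.
fs.write('./somefile.js', 'var a = 1;');

// Flush everything in the store to disk.
fs.commit(function () {
  console.log('somefile.js written to disk');
});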
I'm writing a desktop app using Electron and React. I want to store some information in a JSON file. I've tried both web-fs and browserify-fs to accomplish this, and neither is working as expected. My setup is as follows:
project/app/(react files)
project/index.html
project/js/bundle.js
project/main.js
I'm using watchify to compile all the changes in the React files into the bundle.js file (which is read by index.html).
The following is run from app.js in project/app/ (which is also where the JSON file is stored):
import * as fs from 'browserify-fs';
...
fs.writeFile('./fileData.json', data, function (err) {
  if (err) console.log(err);
  else console.log("success");
});
'success' is always logged to the console; however, the contents of the file are not updated, regardless of how I specify the path.
I've tried:
'./fileData.json'
'/fileData.json'
__dirname + '/fileData.json' (which tells me that __dirname couldn't be found)
(the absolute path to fileData.json) (which tells me that /Users could not be found)
After doing the above, if I change the writeFile to readFile and log the contents to the console, the updated file is printed. Even if I delete the fileData.json file, the file is successfully read.
This makes me believe that fs.writeFile() is writing to a different directory, not the one the process is being run from. Despite this, I cannot find any other fileData.json files anywhere on my computer. There are a couple of other weird behaviors:
When logging __filename (which should log the entire filepath), the only thing printed is "/app.js" with no leading file path.
Calling "process.cwd()" just gives me "/"
When calling fs.writeFile() with the full file path "/Users/...." I get a folder not found error
Anyone know what could be causing this behavior and how to fix it?
Edit - I also tried getting the absolute path by adding
var path = require('path')
var appDir = path.resolve('./app');
which again only gives me /app when it should be returning an absolute path
Can you confirm the same behavior when not using browserify-fs? Just use plain old fs. (Note you can do this straight from the Chrome dev tools console).
Looking at browserify-fs's page, it looks like it implements a kind of virtual file system using a dependency called level-filesystem (which uses LevelDB). So the files you're expecting to get created aren't; they're being created inside a LevelDB database. You could probably find a LevelDB file somewhere that contains the information you were trying to write directly to the file system.
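To confirm, here's a minimal sketch of the same write using Node's built-in fs module (this assumes the Electron renderer has Node integration enabled):
// Plain Node fs: writes to the real file system, not a virtual one.
const fs = require('fs');
const path = require('path');

const filePath = path.join(__dirname, 'fileData.json');
const data = JSON.stringify({ example: true }, null, 2);

fs.writeFile(filePath, data, function (err) {
  if (err) return console.error(err);
  console.log('wrote', filePath);
});
If that version works, it also tells you the path handling was fine and the virtual file system was the culprit.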
For simple writing/reading of a JSON file, I'd recommend https://github.com/sindresorhus/electron-config.
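If I remember its README correctly, usage is roughly like this (treat it as a sketch and check the repo for the exact API):
// Hypothetical sketch based on the electron-config README.
const Config = require('electron-config');
const config = new Config();

config.set('fileData', { example: true });
console.log(config.get('fileData'));
It stores the data as JSON in the app's userData directory, so you don't have to deal with paths at all.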
This only occurs in our application, which is a ton of code and not something I can post, nor something anyone would be willing to slog through. So all I can do is describe the error.
We are building our TypeScript with AMD modules (--module amd) and have a ton of recursive references between files. It all works fine until I put this one call in a function:
var di = new moduleDocIterator.DocIterator(null);
With this, the initial load of the main .ts (well, the generated .js) file gives me a null-or-undefined reference exception while setting up __extends at the top of the generated .js file.
Both the file this call is in and the file moduleDocIterator references have nothing static in them except a couple of strings. The generated .js creates the object for each class, but nothing that looks like it actually executes at load time.
And none of this code is called at the time this happens; this is just the initial load of everything requested via RequireJS.
How do I figure out why it's unhappy?
I want to split my code into different files. I currently write all GET and POST handlers in the same file, but I want greater readability and manageability.
I've tried putting the code in different files, but when running the main app, the GET and POST handlers in the other files cannot be called.
I include this:
var Db = require('/filename.js');
// ...but I can't call those methods.
I want to split my single file code for readability. How do I achieve this?
Just have a look at the module documentation:
A path starting with / is an absolute path, e.g.:
require('/home/user/module.js');
A path starting with ./ is resolved relative to the directory of the calling file:
require('./lib/module.js');
__dirname has the same effect as ./:
require( __dirname + '/module.js');
Try:
var Db = require('./filename.js');
Note that require('filename.js') without a leading ./ would make Node look for a package named filename.js in node_modules rather than for your local file.
Also, have a look at this blog post.
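For the original goal of splitting GET and POST handlers into separate files, here's a minimal sketch using an Express router (assuming you're on Express; the file names are hypothetical):
routes/users.js
var express = require('express');
var router = express.Router();

router.get('/', function (req, res) {
  res.send('list users');
});

router.post('/', function (req, res) {
  res.send('create user');
});

module.exports = router;
app.js
var express = require('express');
var app = express();

// Mount the routes defined in the separate file.
app.use('/users', require('./routes/users'));

app.listen(3000);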