I'm trying to create a little application to store snippets of code using Node.js and MongoDB.
I'm using CoffeeScript to write the app.
The problem is, I want to separate the code into modules,
so I created this folder structure:
/app
  /lib
    /models
    /routes
  core.coffee
The core.coffee file is the "server" app using Express,
so in this file I have:
express = module.exports.express = require 'express'
mongoose = module.exports.mongoose = require 'mongoose'
app = module.exports.app = express.createServer()
Snippet = module.exports.Snippet = require __dirname+'/lib/models/Snippet'
#App configurations
routes = require(__dirname+'/lib/routes/general')
In lib/models/Snippet
mongoose = module.parent.exports.mongoose

Snippet = new mongoose.Schema
  title:
    type: String
    default: 'Title'

mongoose.model 'Snippet', Snippet
exports.Snippet = mongoose.model 'Snippet'
In /lib/routes/general.coffee
app = module.parent.exports.app
mongoose = module.parent.exports.mongoose
Snippet = module.parent.exports.Snippet

app.get '/test', (req, res) ->
  snip = new Snippet()
  res.send snip
But this doesn't work; I get the following error message:
TypeError: object is not a function
at Object.CALL_NON_FUNCTION_AS_CONSTRUCTOR (native)
How can I accomplish that?
I see a noteworthy typo:
Snippet = module.exports.Snippt = require __dirname+'/lib/models/Snippet'
Change module.exports.Snippt to module.exports.Snippet.
Let's start by looking at how you're using require. It looks like you're trying to load all the project's requirements in core.coffee and then re-export them elsewhere. That's an odd way of doing it; most people just require those libraries in each module that needs them (for now at least, see the end of my answer).
For example, you need mongoose in lib/models/Snippet, so just require it there:
lib/models/Snippet:
mongoose = require 'mongoose'
Next, there's no need to use __dirname to require a relative path, require copes fine with a path starting with ./:
require './lib/models/Snippet'
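Putting those two points together, a self-contained version of the model and route modules might look like the sketch below. It's written in plain JavaScript for clarity and the file names and export shape are assumptions, not your actual code; the CoffeeScript equivalent is a direct translation.

// lib/models/Snippet.js - requires mongoose itself and exports the compiled model
var mongoose = require('mongoose');

var snippetSchema = new mongoose.Schema({
  title: { type: String, default: 'Title' }
});

module.exports = mongoose.model('Snippet', snippetSchema);

// lib/routes/general.js - requires the model directly instead of reaching into module.parent,
// and exports a function that receives the app object from core
var Snippet = require('../models/Snippet');

module.exports = function (app) {
  app.get('/test', function (req, res) {
    res.send(new Snippet());
  });
};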
I still wasn't able to get the code to work cleanly (I'm guessing we're not seeing the full code), but the changes above might be enough to set you on the right path.
Finally, if you want to go down the route of exporting everything on the main module, can I suggest taking a look at dave-elkan's layers project. The plain version doesn't support CoffeeScript, but I've created a fork that does.
It's very lightweight, and makes almost no assumptions about your project structure. The basic idea is that you give layers() your express app object and a directory. Layers will scan that directory and set up any subdirectories as layers on your app object.
In your case you'd pass in a rootPath: __dirname + '/lib' and your app object would get app.models.Snippet and app.routes.general added onto it. That's still not quite how I'd structure it, but you might be able to come up with something that matches your style from there.
Related
I have two JavaScript files: config.js and app.js. In app.js I want to use a function defined in config.js, so I use require().
config.js
module.exports = {
  somefunc: somefunc
}
app.js
var config = require('./config')
But I don't want to type the './' every time, so I added a myRequire.js file.
myRequire.js
global.myRequire = function (p) {
  return require('./' + p)
}
In that case I could use myRequire('config') next time instead of require('./config'), which might look more concise.
app.js
require("./myRequire")
var config = myRequire('config')
config.somefunc()
But I ran into a problem: I cannot use F12 (Go to Definition) in VS Code to find the somefunc function. Could someone tell me what I should do to make it work in this case?
You are introducing a lot more problems than what you are trying to solve. What if you are requiring a file from a different path, like '../../here' or './over/there'?
If you really want to require something without a path, you can create your own npm module so you can require it globally without a path, OR you can create a folder with an index.js in it and require all the things you need from there.
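As a rough sketch of that second suggestion (the folder and file names here are assumptions, not from the question): requiring a directory resolves to its index.js, so one index.js can gather everything in one place.

// lib/index.js - re-export everything the app needs from one place
module.exports = {
  config: require('./config'),
  db: require('./db')
};

// app.js
var lib = require('./lib');   // './lib' resolves to lib/index.js
lib.config.somefunc();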
Hi, I'm new to Node.js and am currently developing a REST API with it. I'm planning to build it with a good folder structure so I can scale it up easily. I'm using several route files according to the business logic,
e.g. authRoutes, profileRoutes, orderRoutes, ...
Currently, in every single route file I have to include the following code:
var express = require('express');
var router = express.Router();
var jwt = require('jsonwebtoken');
var passport = require('passport');
My problem is: is it totally fine to use the above code in all the route files (I'm concerned about code optimisation, coding standards and execution speed), or is there a better way to do this?
It would also help if you could explain the functionality of the require() function.
Thanks
In my experience, this is very standard to do. I suggest reading this question and answer to learn more about the way it affects the speed of your programs.
TL;DR of require()
When a line of code that uses a variable created by a require() is run, for instance:
var https = require('https');
https.get(url, function(response) {...});
Node reads it, loads the https module, and looks up the .get function on the object that module exports.
However, if you are trying to require() a certain JavaScript file, such as analysis.js, you must navigate to that file from the file you are currently in. For instance, if the file you want is on the same level as the file you are in, you can access it like this:
var analysis = require('./analysis.js');
//Let analysis have a function called analyzeWeather
analysis.analyzeWeather(weather_data);
These lines of code are a little different from above. In this require() statement, we are saying grab the .js file with this name, analysis. Once you require it, you can access any public function inside of that analysis.js file.
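To tie this back to the speed concern with a minimal sketch (file names are made up for illustration): Node caches modules, so a file is executed once and every later require() of the same path returns the same cached exports object. Repeating the require lines in each route file therefore costs essentially nothing at runtime.

// counter.js
console.log('counter.js is being evaluated');   // printed only once per process
module.exports = { count: 0 };

// app.js
var a = require('./counter');   // runs counter.js and caches its exports
var b = require('./counter');   // served straight from the require cache
console.log(a === b);           // true - both names point to the same object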
Edits
Added require() example for .js file.
I'm working on a MEAN (Mongo, Express.js, Angular, Node.js) CRUD application. I have it working, but everything is in one .js file, and that single source code file is quite large. I want to refactor the code so the CRUD functionality is in different source code files. Reading through other posts, I've got a working model, but I'm not sure it is the right way to do this in Node with Mongo.
Here's the code so far:
var express = require('express');
var bodyParser = require('body-parser');
var app = express();
var path = require('path');
var db;
var connect = 'mongodb://<<mongodb connect string>>';
const MongoClient = require('mongodb').MongoClient;
var ObjectID = require("mongodb").ObjectID;
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
app.use(express.static(__dirname + '/'));
// viewed at http://localhost:<<port referenced in app.listen>>
app.get('/', (req, res) => {
  res.sendFile(path.join(__dirname + '/index.html'));
});
MongoClient.connect(connect, (err, database) => {
  if (err) return console.log(err)
  db = database
  app.listen(3000, () => {
    console.log('listening on 3000' + Date() );
    // Here's the require for the search function in another source code file.
    var searchroute = require('./serverSearch')(app, db);
  })
})
//Handlers
The rest of the CRUD application's functions use app.post and app.get. These are the other functions I want to move into different source code files, like serverSearch.js.
The code I have separated out so far is the search functionality, which sits inside the MongoClient.connect callback. That callback has to execute successfully to make sure the variable 'db' is valid before passing both variables 'app' and 'db' to the search function built out in the source code file serverSearch.js.
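Roughly speaking, a module consumed this way just exports a function that receives app and db; the sketch below shows the shape (the route path and collection name are placeholders, not my actual search code).

// serverSearch.js - exports a function that receives the app and db handles
module.exports = function (app, db) {
  app.get('/search/:term', (req, res) => {
    db.collection('snippets').find({ title: req.params.term }).toArray((err, results) => {
      if (err) return res.status(500).send(err);
      res.send(results);
    });
  });
};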
I could now build out my other CRUD functions in separate files and put them in the same area as var searchroute = require('./serverSearch')(app, db);
Is this the best way to separate code in a MEAN application where the main app and db vars need to be instantiated then passed to functions in other source code files?
What you are basically describing is modular coding, heading towards "services", perhaps even micro-services. There are a few factors to keep in mind for your system. (I have no doubt that there are many other approaches to this, btw.) Basically, in most Node.js systems I have worked on (not all) I try to apply the following architecture in development and then bring over as much as possible into production.
Create a directory under the main one. I usually use some type of name that points to the term "functions". In this directory I maintain function and/or class files divided into categories. Function wrappers for the DB would be held in a DB functions file; this file would only contain functions for the DB. Security functions go in another file, helper functions in another, time manipulation in another. I am sure you get the idea. These are all wrapped in module exports.
Now in any file in my project where say I would need DB and helpers I will start it by:
let nhelpers = require("helpfuncs");
let ndb = require("dbfuncs");
Obviously names are different.
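For illustration, a sketch of what one of those category files might contain (the file name and functions are assumptions, just to show the module.exports wrapping):

// helpfuncs.js - general-purpose helpers only, each one hung off module.exports
module.exports.titleCase = function (str) {
  return str.replace(/\w\S*/g, function (w) {
    return w[0].toUpperCase() + w.slice(1).toLowerCase();
  });
};

module.exports.sleep = function (ms) {
  return new Promise(function (resolve) { setTimeout(resolve, ms); });
};

Requiring it as shown above then gives you nhelpers.titleCase(...) and so on, with the IDE able to chain straight into the file.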
And btw I divide all the NPM packages in the same way under an environment directory.
Maintaining that kind of structure allows you to maintain sane order over the code, logical chaining in any decent IDE, and having relevant methods show up in your IDE without having to remember every function name and all the methods within.
It also allows you to write an orderly system of micro-services, making sure each part does exactly what you want, and allows for sane debugging.
It took me awhile to settle on this method and refine it.
It paid off for me. Hope this helps.
Edit to clarify for the OP:
When it comes to the process.env variables I became a great fan of dotenv https://www.npmjs.com/package/dotenv
This little package has saved me an incredible amount of headaches. Of course you will have to decide whether to include it in production or not. I have seen arguments for both, but I think in a well-set-up AWS, Google, or Azure environment (or in Docker, of course) it can safely be used.
A couple of caveats.
Do not leave your dotenv file in the root. Move it somewhere else in your directory structure (see the sketch after these caveats). It is simple, and I actually put it in the same directory as all my environment files and helper files.
Remember it is simply a text file. So an IDE will not pick up your specific env variables in chaining. (Unless someone knows of a trick which I would love to hear about)
If you put env variables like access info to your DB system or other sensitive stuff, HASH THEM FIRST, put the hash in your env and have a function in your code which specifically just does the hash to the string. Do not under any conditions leave sensitive information in your environment file without hashing it first.
The final gotcha. These are not PHP magic globals which cannot be overwritten. If you lose track and overwrite one of those process.env variables in your code, it will take the new value until you restart your Node app and it reads from the dotenv file again. (But then again, that is the rule with all environment variables, not only user-defined ones.)
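On the first caveat, a minimal sketch of loading the .env file from somewhere other than the root (the directory name is an assumption); dotenv's config() accepts a path option:

// Load environment variables from a non-root location.
require('dotenv').config({ path: __dirname + '/envfiles/.env' });

console.log(process.env.DB_HOST);   // whatever the .env file defines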
Any typos above excuse me. Done from my cell in the train.
I am having trouble with require executing my code twice. Working on a standard Express app, I build Mongoose schemas, each in its own file, and export them.
//user.js
const User = mongoose.model('User', userSchema)
module.exports = User
//In other files
const User = require('../models/User')
Now I use this in two places in my application and get an error saying that
Cannot overwrite `User` model once compiled.
So the code above is getting called twice as it is the only code right now creating a model. However I would expect Node to only execute it once since it is required in my code.
The really strange part is that checking out an earlier version from Git I get the same error and people working with me on this get the same error. So I have no more ideas where to look for solutions.
Found the solution now.
Turns out I required the module once as models/user and once as models/User, which in require's cache creates two separate modules.
There have been many discussions about this:
one issue
another issue
old PR
It seems that this is because Windows resolves paths case-insensitively while other systems resolve them case-sensitively, and Node therefore treats the paths as case-sensitive.
And the "new" module of course gets executed. Simply spelling the require in lowercase both times solved the issue.
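For illustration, the fix is just spelling the path identically everywhere; a defensive export (not part of my original fix, but a known Mongoose pattern using mongoose.models) also avoids the error if the model file somehow gets evaluated twice:

// Everywhere in the app, use the same casing for the path:
const User = require('../models/user');

// models/user.js - optional guard: reuse an already-compiled model instead of redefining it
module.exports = mongoose.models.User || mongoose.model('User', userSchema);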
I think the problem is in the "const" that you use to declare the variable "User".
Try to use "var" instead.
//user.js
var User = mongoose.model('User', userSchema)
module.exports = User
//In other files
var User = require('../models/User')
P.S.: This is a link that clarifies more about "const" and "var":
Const in javascript? When to use it and is it necessary
Hope it's helpful for you!
I wrote a simple npm module to precompile my Handlebars templates when using django compressor to do post-processing for some client-side components, and found that I need to ship the npm module with a few js files.
Currently I just assume no one is installing this with the global flag, because I've "hard coded" the path to these dependencies in the npm module itself.
Example layout of my npm module:
/
* /bin
* /lib/main.js
* /vendor/ember.js
Now, inside main.js, I want to use the ember.js file. Currently my hard-coded approach looks like this:
var emberjs = fs.readFileSync('node_modules/django-ember-precompile/vendor/ember.js', 'utf8');
Again, this only works because I assume you install it locally, but I'd like to think Node.js has a more legit way to get locally embedded files.
Does anyone know how I can improve this to be more "global" friendly?
What you can do is get the directory of the current file and make your file paths relative to that.
var path = require('path')
, fs = require('fs');
var vendor = path.join(path.dirname(fs.realpathSync(__filename)), '../vendor');
var emberjs = fs.readFileSync(vendor + '/ember.js', 'utf8');
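As a side note, a small sketch: __dirname usually gives the same directory (the realpathSync version additionally resolves symlinks), so this is typically equivalent:

var path = require('path');
var fs = require('fs');

// __dirname is already the directory of the current file
var emberjs = fs.readFileSync(path.join(__dirname, '../vendor/ember.js'), 'utf8');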
Hope that helps!
One of the great strengths of Node.js is how quickly you can get up and running. The downside to this approach is that you are forced to fit the design patterns it was built around.
This is an example where your approach differs too much from Node's approach.
Node expects everything in a module to be exposed from the module's exports, including templates.
Move the readFileSync into the django-ember-precompile module, then expose the returned value via a module export in lib/main.js.
Example:
package.json
{
"name": "django-ember-precompile",
"main": "lib/main.js"
}
lib/main.js
var fs = require('fs'), path = require('path');
module.exports.ember = fs.readFileSync(path.join(__dirname, '../vendor/ember.js'), 'utf8');
vendor/ember.js
You obtain your template via
var template = require('django-ember-precompile').ember
This example can be refined, but the core idea is the same.
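One possible refinement, as a sketch (an assumption on my part, not from the original module): read the file lazily and cache it, so requiring the package doesn't hit the disk until the template is actually needed.

// lib/main.js - lazy variant: expose a function instead of eagerly reading at require time
var fs = require('fs');
var path = require('path');
var cached;

module.exports.ember = function () {
  if (!cached) {
    cached = fs.readFileSync(path.join(__dirname, '../vendor/ember.js'), 'utf8');
  }
  return cached;
};

// usage: var template = require('django-ember-precompile').ember();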