Using mysql with express/Node: Is using globals ok? - node.js

I can easily enough attach a new connection within my express config to the database using:
global.db = new DB.adapter({});
Thereby I can access it throughout any models I may wish to create. (I am using an active-record pattern and MySQL with Express.)
However, is this in any way a) insecure, b) bad practice?

I don't think that this is insecure, but using globals in general is bad practice:
It exposes the DB global to modules that don't need it. That doesn't matter too much, but it is a bit unclean.
It tightly couples a specific DB instance to the modules that do need it.
It makes testing more difficult.
It makes it harder to switch particular modules to a different DB.
Instead, it's much easier to just use CommonJS modules, which are built into Node and very easy to use:
// db.js
module.exports = new DB.adapter({});
// index.js
var db = require("./db");
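To the testing point above: passing the connection in (plain dependency injection) keeps models decoupled from any particular instance. A minimal sketch; the model, its query() shape, and the stub are illustrative, not from the question:

```javascript
// A model factory takes the connection as an argument instead of
// reaching for a global or a hard-coded require.
function makeUserModel(db) {
  return {
    find(id, cb) {
      db.query("SELECT * FROM users WHERE id = ?", [id], cb);
    }
  };
}

// In tests, a stub object stands in for the real connection:
const stubDb = {
  query: (sql, params, cb) => cb(null, [{ id: params[0], name: "test" }])
};
const users = makeUserModel(stubDb);
users.find(42, (err, rows) => console.log(rows[0].id)); // logs 42
```

The same factory can be handed the real connection from db.js in production.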

Both.
Insecure: any module can change the global context. As with any other package repository, malicious programmers may exist on npm.
Bad practice: some test frameworks even treat this as an error (Mocha, for example, though that's configurable). You're in a CommonJS environment, so use modules! That way your connection object cannot be lost when require()ing the module, as far as I know.


How to use multiple db connection strings depending on the environment in Node.js - Express - Mongo?

During development, using Monk, in my app.js I define the db variable and make it accessible to the router as follows:
var db = monk('localhost:27017/dbname');
[...]
app.use(function (req, res, next) {
  req.db = db;
  next();
});
Now since things seem to be working well, I would like to deploy my app to Heroku. For this purpose, I created an account on mLab to have a db to use in production.
I have set the environment variable MONGODB_URI as
heroku config:set MONGODB_URI=mongodb://user:password@server:port/dbname
and I think I could now use it in this way:
var db = mongo.db(process.env.MONGODB_URI);
Now, I would like to keep my dev environment as it is and when I push to Heroku have instead the production DB. How can I do this?
Maybe it's not complicated but I don't have much experience with databases and production in general so I would appreciate some help.
P.S. I also see that mLab states that sandbox dbs (the free plan) are not suitable for production. However, given that my app doesn't make heavy use of the database (mostly just storing some data and maybe displaying something out of it in the future), do you think it is OK if I use a sandbox db anyway? What problems could I encounter doing so?
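One common pattern (no answer appears above, so this is a sketch): read the connection string from the environment and fall back to the local dev database when the variable is unset. MONGODB_URI and dbname follow the question; the helper name is hypothetical:

```javascript
// dbUrl() is a hypothetical helper: on Heroku, MONGODB_URI is set via
// `heroku config:set`; locally it is undefined, so dev falls back to localhost.
function dbUrl(env) {
  return env.MONGODB_URI || 'localhost:27017/dbname';
}

// In app.js (with Monk, as in the question):
// var db = monk(dbUrl(process.env));
```

Because the fallback only applies when the variable is missing, the same code runs unchanged in both environments.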

MongoDB database cleaner for Nightwatch.js

Is there any way of wiping a mongo database in between Nightwatch e2e tests?
I'm coming from Ruby, where you can configure RSpec to use a package called database cleaner and wipe your db after each test, and I'm wondering if a similar setup exists in the javascript ecosystem.
I did some research and found a promising-looking package called node-database-cleaner, but at present it is throwing an error.
Code:
require('mongodb');
const DatabaseCleaner = require('database-cleaner');
const databaseCleaner = new DatabaseCleaner('mongodb');
// ...test assertions
databaseCleaner.clean('mongodb://localhost:27017/my-db');
Error:
TypeError: db.collections is not a function
I'm not bound to using node-database-cleaner—I'd be interested in any solution no matter what library it uses.
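Absent a working cleaner package, one option is to empty the collections directly with the official mongodb driver. A sketch; the function name is mine, and exact call signatures vary by driver version:

```javascript
// Hypothetical helper to run between Nightwatch tests: empties every
// collection in the test database. Assumes the official `mongodb` driver.
async function wipeDb(url, dbName) {
  const { MongoClient } = require('mongodb'); // required lazily
  const client = await MongoClient.connect(url);
  try {
    const collections = await client.db(dbName).collections();
    for (const c of collections) {
      await c.deleteMany({}); // remove all documents, keep indexes
    }
  } finally {
    await client.close();
  }
}

module.exports = wipeDb;
```

Called from a Nightwatch before/afterEach hook, this plays the same role database_cleaner does in RSpec.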

Console using postgres+sequelize with Node.js?

I am fairly new to Node, but I am loving the tool. My only problem is when I want direct access to the database. I have good experience with Ruby on Rails + Postgres, and using the Rails console was very helpful when I was developing in Rails.
Is there some kind of equivalent I can use to have direct access to my database? I have deployed my app to Heroku, so I would like something I can run on Heroku as well.
(I prefer not to use raw SQL; I am wondering if there is a Sequelize console?)
Here is the way to do it:
node --experimental-repl-await
> models = require('./models');
> User = models.User; // however you load the model in your actual app; this may vary
> await User.findAll(); // use await to avoid unresolved promises
This gives you access to all of the models you have created, and you can use Sequelize ORM commands like findAll, create, etc., just as you would with Rails Active Record.
Sequelize uses promises, so to run these properly in the REPL you will want to use the --experimental-repl-await flag.

Sails/Bookshelf Running a script that uses the Sails/Bookshelf environment

I would like to create a script to run in the background on my server as a cron task.
I would like the script to have access to the Sails environment (i.e. loading all the modules, especially Bookshelf and Knex, and the database connection),
so that I could create a file myscript.js that looks something like:
var environment = require("sails_environment")
// code that uses bookshelf etc exactly as if it were written
// inside a controller action
I actually only need the Bookshelf module and the DB connection for this script, so it may be that Bookshelf has a way to do this on its own, but I imagine it is something built into Sails.
There are lots of ways to do what you want. Here are a few.
https://github.com/balderdashy/sails/issues/2092#issuecomment-56043637
https://www.npmjs.com/package/sails-hook-schedule
http://www.worldnucleus.com/2014/12/run-cron-job-in-sailsjs.html
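The common thread in those links is loading Sails programmatically, so the script gets the same models and connections as a controller action. A sketch, assuming Sails' documented load()/lower() API; the wrapper name and task body are hypothetical:

```javascript
// cron-task.js — hypothetical standalone script run outside `sails lift`.
function runWithSails(task) {
  const sails = require('sails'); // required lazily so this file loads anywhere
  // load() boots the app (hooks, models, connections) without an HTTP server;
  // disabling the grunt hook speeds up scripts that don't need assets.
  sails.load({ hooks: { grunt: false } }, function (err) {
    if (err) { throw err; }
    task(sails, function () {
      sails.lower(); // shut down connections cleanly when the task is done
    });
  });
}

// Usage: runWithSails(function (sails, done) { /* use models here */ done(); });
```

A crontab entry can then invoke the script with `node cron-task.js` from the app directory.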

Is the CRUD API of NeDB compatible with MongoDB?

I am looking for a MongoDB API-compatible DB engine that does not require a full-blown mongod process to run (a kind of SQLite for Node).
From multiple candidates that persistently store data on a local disk with a similar API, I ended up with two:
NeDB https://github.com/louischatriot/nedb
tingodb http://www.tingodb.com/
Problem
I have worked with neither of them.
I am also very new to the MongoDB API, so it is difficult for me to judge their compatibility.
Requirements
I need your help/advice on picking the one library that satisfies the following:
It is stable enough.
It is fast enough to handle ~1 MB JSON documents on disk, or bigger.
I want to be able to switch to MongoDB as the data backend in the future, or on demand, by changing a config file. I don't want to duplicate code.
The DB initialization APIs are different
Right now only tingodb claims API compatibility. Even initialization looks fairly similar.
tingodb
var Db = require('tingodb')().Db, assert = require('assert');
vs
mongodb
var Db = require('mongodb').Db,
Server = require('mongodb').Server,
assert = require('assert');
In case of NeDB it looks a bit different because it uses the datastore abstraction:
// Type 1: In-memory only datastore (no need to load the database)
var Datastore = require('nedb')
, db = new Datastore();
QUESTION
Obviously, initialization is not compatible. But what about CRUD? How difficult would it be to adapt?
Since most of the code I do not want to duplicate consists of CRUD operations, I need to know how similar they are, i.e. how agnostic my code can be about which backend it is talking to.
// If doc is a JSON object to be stored, then
db.insert(doc); // a NeDB method which is compatible
// How about *WriteResult*? It does not look like it...
db.insert(doc, function (err, newDoc) { // callback is optional
  // newDoc is the newly inserted document, including its _id
  // newDoc has no key called notToBeSaved since its value was undefined
});
I will appreciate your insight in this choice!
Also see:
Lightweight Javascript DB for use in Node.js
Has anyone used Tungus ? Is it mature?
NeDB CRUD operations are upwards compatible with MongoDB's, but initialization is indeed not. NeDB implements part of MongoDB's API, and the part that is implemented is upwards compatible.
It's definitely fast enough for your requirements, and we've made it very stable over the past few months (no more bug reports).
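Given that, one way to stay backend-agnostic is to isolate construction behind a small factory and share the CRUD code. A sketch; module names follow the question, and the tingodb branch is indicative only:

```javascript
// Only construction differs per backend; insert/find/update/remove can
// then be called the same way on whatever collection-like object this returns.
function makeDb(backend, options = {}) {
  if (backend === 'nedb') {
    const Datastore = require('nedb');
    return new Datastore(options); // pass { filename: '...' } to persist to disk
  }
  if (backend === 'tingodb') {
    const Db = require('tingodb')().Db;
    return new Db(options.path, {}).collection(options.collection);
  }
  throw new Error('unsupported backend: ' + backend);
}
```

Switching backends then means changing the `backend` value in a config file, while the CRUD call sites stay untouched.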
