Is there any way of wiping a MongoDB database between Nightwatch e2e tests?
I'm coming from Ruby, where you can configure RSpec to use a package called DatabaseCleaner and wipe your DB after each test, and I'm wondering whether a similar setup exists in the JavaScript ecosystem.
I did some research and found a promising-looking package called node-database-cleaner, but at present it throws an error.
Code:
require('mongodb');
const DatabaseCleaner = require('database-cleaner');
const databaseCleaner = new DatabaseCleaner('mongodb');

// ...test assertions

databaseCleaner.clean('mongodb://localhost:27017/my-db')
Error:
TypeError: db.collections is not a function
I'm not bound to using node-database-cleaner; I'd be interested in any solution, no matter what library it uses.
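For what it's worth, database-cleaner's clean method expects a connected db object rather than a connection string, which would explain the TypeError above (a string has no collections method). Here is a minimal sketch of how the call is meant to look, assuming the 3.x mongodb driver; the database name is a placeholder:

const { MongoClient } = require('mongodb');
const DatabaseCleaner = require('database-cleaner');
const databaseCleaner = new DatabaseCleaner('mongodb');

MongoClient.connect('mongodb://localhost:27017', function (err, client) {
  if (err) throw err;
  // clean() takes the connected Db object, not a URL string
  databaseCleaner.clean(client.db('my-db'), function () {
    client.close();
  });
});

Wired into an afterEach hook, this gives roughly the DatabaseCleaner-style reset you'd get in RSpec.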
Related
I'm attempting to write an Azure Function, in Node, that connects to a MongoDB instance (Cosmos DB in this case).
However, as soon as I run require("mongodb"), my function crashes without throwing an error or logging anything, and the HTTP response returns a 502 code.
My setup:
Create a function app using all defaults through the Azure portal.
Create a package.json with mongodb version 3.x.
Run npm install through the Kudu shell.
Include the require statement in my code.
Make a request to the function.
The code itself doesn't throw an error, and I see logging from before the require statement but not after it (which makes this pretty difficult to debug).
I've also tried working through this guide about running a Mongo query from a function, and it fails in exactly the same way for me.
After putting some hooks into Node's module module, my attempts to debug this led me to a line in one of mongo's dependencies (saslprep) that fails in a similar way when run in isolation, which seems to stem from running out of stack space.
However, this feels like a pretty mainstream use for an Azure Function, and I haven't seen any similar issues, so I'm inclined to suspect an issue with my setup rather than the mongodb library. But I haven't been able to find a misconfiguration, as I haven't changed any defaults. Right now, I'm stumped!
I don't have a full code example right now, as I'm away from my work computer, but the code is something like:

// This require alone is enough to trigger the crash
const mongo = require('mongodb');

module.exports = function (context) {
  context.res = {
    body: 'Hello world'
  };
  context.done();
};
Without the require statement, the code runs fine, returning the response to the browser.
It turns out that this problem was caused by running out of stack space. After pushing a patch to the saslprep library (v1.0.1), this has now been fixed.
I'm pretty sure that if you change your require statement to match what Microsoft's Cosmos DB guides for Mongo use, it should work:
var mongodb = require('mongodb').MongoClient;
You have it as:
const mongodb = require('mongodb');
I'm curious to know whether that makes a difference. Looking through Microsoft's own docs, nearly all of them declare it that way.
Here is the tutorial I found: MongoDB app using Node.js
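For reference, a minimal sketch of how the connection could look inside the function, assuming the 3.x driver; the app setting name, database, and collection are placeholders:

const MongoClient = require('mongodb').MongoClient;

module.exports = function (context, req) {
  // COSMOSDB_CONNECTION_STRING is a placeholder app setting
  MongoClient.connect(process.env.COSMOSDB_CONNECTION_STRING, function (err, client) {
    if (err) {
      context.res = { status: 500, body: err.message };
      return context.done();
    }
    client.db('my-db').collection('items').find({}).toArray(function (err, docs) {
      client.close();
      context.res = { body: docs };
      context.done();
    });
  });
};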
Situation
Hi, I'm quite new to Angular. I've been doing some projects following tutorials, which led me to start my own project to practice my Postgres and newly acquired Angular "skills".
I am trying to build a web app that connects to a Postgres DB using the node pg module.
(I know Sequelize is a thing, and it seems to work better than pg, but AFAIK Sequelize doesn't let you run pure Postgres commands through it. Please correct me if I am wrong about this.)
The problem
This is where I get stuck: I am trying to follow the instructions from the docs, but it doesn't seem to work correctly.
I have tried:
const { Client } = require('pg');
import { Client } from 'pg';
I also tried adding it to the scripts array in .angular-cli.json.
All of these fail with errors similar to these:
ERROR in ./node_modules/pg/lib/connection-parameters.js Module not found: Error: Can't resolve 'dns' in '[...]\node_modules\pg\lib'
ERROR in ./node_modules/pg/lib/native/client.js Module not found: Error: Can't resolve 'pg-native' in '[...]\node_modules\pg\lib\native'
But nothing seems to work properly. Am I doing this completely wrong?
Also, a pretty dumb question: I believe Angular does everything on the client side, which would make direct DB access in production a huge security risk. If that is true, is there a way to write server-side .ts services, or are services server-side?
You could write your server-side code in Node using compiled TypeScript, but not with Angular itself. Angular code (services included) runs in the browser, which is also why webpack can't resolve Node-only modules like dns for you; the usual approach is a separate Node API that your Angular app calls over HTTP, as in the sketch below.
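For illustration, a minimal sketch of such a separate Node API, assuming Express and the pg Pool API; the route, table name, and DATABASE_URL env var are made up for the example. The Angular app would then call this endpoint with HttpClient instead of touching Postgres directly:

const express = require('express');
const { Pool } = require('pg');

const app = express();
// The pool reads connection details from a placeholder env var here
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

app.get('/api/users', function (req, res) {
  // Plain SQL, as pg allows; no ORM required
  pool.query('SELECT * FROM users', function (err, result) {
    if (err) return res.status(500).json({ error: err.message });
    res.json(result.rows);
  });
});

app.listen(3000);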
I am fairly new to Node, but I am loving the tool. My only problem is when I want direct access to the database. I have good experience with Ruby on Rails + Postgres, and using the Rails console was very helpful when I was developing in Rails.
Is there some kind of equivalent I can use to get direct access to my database? I have deployed my app to Heroku, so I would like something I can run on Heroku as well.
(I prefer not to use SQL; I am wondering if there is a sequelize console?)
Here is the way to do it:
node --experimental-repl-await
> models = require('./models');
> User = models.User; // however you load the model in your actual app; this may vary
> await User.findAll(); // use await to avoid promise errors
TLDR
This gives you access to all of the models you have created, and you can use Sequelize ORM commands like findAll, create, etc., just as you would with Rails Active Record.
Sequelize uses promises, so to run these properly in the REPL you will want to use the --experimental-repl-await flag.
I am looking for a MongoDB API-compatible DB engine that does not require a full-blown mongod process to run (a kind of SQLite for Node).
From multiple candidates with a similar API that persistently store data on a local disk, I ended up with two:
NeDB https://github.com/louischatriot/nedb
tingodb http://www.tingodb.com/
Problem
I have worked with neither of them.
I am also very new to the API of MongoDB, so it is difficult for me to judge compatibility.
Requirements
I need your help/advice on picking the one library that satisfies the following:
It is stable enough.
It is fast enough to handle JSON documents of ~1 MB or bigger on disk.
I want to be able to switch to MongoDB as a data backend in the future, or on demand, by changing a config file. I don't want to duplicate code.
The DB initialization API is different
Right now only tingodb claims API compatibility. Even initialization looks fairly similar.
tingodb
var Db = require('tingodb')().Db, assert = require('assert');
vs
mongodb
var Db = require('mongodb').Db,
  Server = require('mongodb').Server,
  assert = require('assert');
In the case of NeDB it looks a bit different, because it uses a datastore abstraction:
// Type 1: In-memory only datastore (no need to load the database)
var Datastore = require('nedb')
  , db = new Datastore();
QUESTION
Obviously initialization is not compatible. But what about CRUD? How difficult is it to adapt?
Since most of the code I do not want to duplicate will be CRUD operations, I need to know how similar they are, i.e. how agnostic my code can be about which backend it is using.
// If doc is a JSON object to be stored, then
db.insert(doc); // a NeDB method which is compatible
// How about *WriteResult*? It does not look like it..
db.insert(doc, function (err, newDoc) { // Callback is optional
  // newDoc is the newly inserted document, including its _id
  // newDoc has no key called notToBeSaved since its value was undefined
});
I will appreciate your insight into this choice!
Also see:
Lightweight Javascript DB for use in Node.js
Has anyone used Tungus? Is it mature?
NeDB CRUD operations are upwards compatible with MongoDB, but initialization is indeed not. NeDB implements part of MongoDB's API, but not all of it; the part it implements is upwards compatible.
It's definitely fast enough for your requirements, and we've made it very stable over the past few months (no more bug reports).
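One way to keep the CRUD code agnostic is to hide the only incompatible part, initialization, behind a small factory chosen by config. A minimal sketch, assuming the 2.x-era mongodb driver shown in the snippets above; the backend flag, file path, and collection name are made up for illustration:

var config = { backend: 'nedb' }; // or 'mongodb', e.g. read from a JSON file

function getCollection(config, callback) {
  if (config.backend === 'nedb') {
    var Datastore = require('nedb');
    // An NeDB datastore plays the role of a single MongoDB collection
    callback(null, new Datastore({ filename: 'data/items.db', autoload: true }));
  } else {
    var MongoClient = require('mongodb').MongoClient;
    MongoClient.connect(config.url, function (err, db) {
      if (err) return callback(err);
      callback(null, db.collection('items'));
    });
  }
}

getCollection(config, function (err, items) {
  // The insert call has the same shape on both backends, but note the
  // second callback argument: NeDB passes the newly inserted document,
  // while MongoDB passes a write result object.
  items.insert({ name: 'test' }, function (err, result) {
  });
});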
I'm writing functions for my Node.js server using TDD (Mocha). To connect to the database I'm doing:
before(function(done) {
  db.connect(function() {
    done();
  });
});
I'm running the test cases using make test, and I've configured my makefile to run all the JS files in that particular folder using mocha *.js.
But each JS file has to make a separate connection to the database; otherwise my test cases fail, since test files do not share a common scope.
So the question is: is there anything like beforeAll() that would simply connect to the database once and then run all the test cases? Any help/suggestions appreciated.
You can set up your DB connection as a module that each of the Mocha test modules imports.
var db = require('./db');
A good database interface will queue commands you send to it before it has finished connecting. You can use that to your advantage here.
In your before call, simply do something that amounts to a no-op. In SQL that would be something simple like a raw query of SELECT 1. You don't care about the result; the query returning just signifies that the database is ready.
Since each Mocha module uses the same database module, it'll only connect once.
Use this in each of your test modules:
before(function(done) {
  db.no_op(done);
});
Then define db.no_op to be a function that performs the no-op and takes a callback.
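For illustration, a minimal sketch of such a db module, assuming the pg module; no_op is just the helper name suggested above, not a library API, and the connection string is a placeholder:

// db.js
const { Pool } = require('pg');
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// A throwaway query: its completion just signals the database is ready,
// and the pool queues it until the connection is established.
exports.no_op = function (callback) {
  pool.query('SELECT 1', function (err) {
    callback(err);
  });
};

Since Node caches modules, every test file that does require('./db') shares the same pool, so the connection is only set up once.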