MongoDB shell command execution to drop a collection via Node.js

I have a test setup in which the mongoimport and mongoexport commands are used to populate an existing MongoDB database, say testDB, from the folder testDump.
The problem occurs for the files which are empty in the folder from which testDB is initially populated and then restored.
E.g., a collection file called abcInstance.json is empty in testDump:
$ cat abcInstance.json
[]
Now when I run some tests, this collection gets populated in testDB, but at the end, when I restore all collections from the testDump folder using mongoimport, the command fails for the empty files.
So I am trying to drop those collections using the mongo shell invoked via Node's spawn:
const { statSync } = require('fs');
const { spawn } = require('child_process');

if (statSync(collectionFile).size === 4) {
  const options = [
    'testDB',
    '--eval',
    '"db.abcInstance.drop()"'
  ];
  const dropDB = spawn('mongo', options, { stdio: 'inherit' });
  if (dropDB.status !== 0) {
    throw new Error('failed to drop collection');
  }
}
But this also fails, and I cannot figure out the error.
I have tested that the same command runs successfully on the command line:
$ mongo testDB --eval "db.abcInstance.drop()"
MongoDB shell version v3.6.4
connecting to: mongodb://127.0.0.1:27017/testDB
MongoDB server version: 3.6.4
true
Any idea where I am going wrong?

So, I was able to solve the problem of executing the mongo command with a slightly different approach, as stated here.
Basically, the problem I figured out was that my parent process was exiting without waiting for the child process to finish execution, since both spawn and exec are asynchronous. So I modified my code as follows:
const { promisify } = require('util');
const exec = promisify(require('child_process').exec);

async function test() {
  const res = await exec('mongo testDB --eval "db.abcInstance.drop()" --quiet');
  return { res };
}
Now, when I call this test() function, my collection is successfully dropped.
Does anybody know of any problem with this approach or a better way?
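One alternative worth considering is to skip the shell entirely and drop the collection through the official MongoDB Node.js driver. A minimal sketch, assuming the mongodb package is installed and the server runs on the default port:

const { MongoClient } = require('mongodb');

async function dropCollection() {
  const client = await MongoClient.connect('mongodb://127.0.0.1:27017');
  try {
    // drop() resolves to true on success and throws "ns not found"
    // if the collection does not exist.
    return await client.db('testDB').collection('abcInstance').drop();
  } finally {
    await client.close();
  }
}

This avoids depending on the mongo binary being on the PATH and gives you a real error object instead of an exit code.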

Related

How to access the container ID in a Node.js application in host network mode

I am trying to get a Docker container's ID when the network=host setting is enabled, but instead of the container ID I am getting the host instance name. When network=host is not passed in the command, it gives me the container ID as expected.
In short:
Case 1: I run my container with docker run --network="host" -d -it myservice:1.0
const os = require("os");
console.log(os.hostname()); // prints "docker-desktop"
Case 2: I run my container with docker run -d -it myservice:1.0
const os = require("os");
console.log(os.hostname()); // prints "67db4w32k112" as expected
Is there a way I can get the same output, i.e. 67db4w32k112, in case 1 as well?
From looking at this thread, you can probably do something like the below, which reads the /proc/1/cpuset file inside the container. This file holds the current container ID; its contents look like:
/docker/7be92808767a667f35c8505cbf40d14e931ef6db5b0210329cf193b15ba9d605
This will be more reliable in your case than using os.hostname(), since it works both with and without the --network="host" flag on the docker run command.
const fs = require('fs');

fs.readFile('/proc/1/cpuset', 'utf8', function (err, data) {
  if (err) {
    return console.log(err);
  }
  // Strip the "/docker/" prefix and the trailing newline.
  const containerID = data.replace('/docker/', '').trim();
  console.log(containerID);
});
Try using a helper package such as docker-container-id.
Add the dependency to your package.json:
npm install --save docker-container-id
Here's an example:
const getId = require('docker-container-id');
async function printContainerId() {
  console.log("I'm in container:", await getId());
}
printContainerId();
npm reference: https://www.npmjs.com/package/docker-container-id

Why am I not able to update an env variable in Node.js?

I want to update an env variable in Node.js, but I am not able to. I tried console.log(process.env.DB_HOST), but I still get the old value. I have added my whole code below; can anyone please look into it and help me resolve this issue?
function exec_api() {
  return new Promise(async function (resolve) {
    const execSync = require('child_process').exec;
    //let child_process_obj = execSync('DB_HOST='+process.env.UNITTEST_DB_HOST+' DB_DATABASE='+process.env.UNITTEST_DB_DATABASE+' DB_USERNAME='+process.env.UNITTEST_DB_USERNAME+' DB_PASSWORD='+process.env.UNITTEST_DB_PASSWORD+' PORT='+process.env.UNITTEST_SERVICE_PORT+' ./node_modules/.bin/nodemon main.js');
    await execSync('export DB_HOST=' + process.env.UNITTEST_DB_HOST);
    await execSync('export DB_DATABASE=' + process.env.UNITTEST_DB_DATABASE);
    await execSync('export DB_USERNAME=' + process.env.UNITTEST_DB_USERNAME);
    await execSync('export DB_PASSWORD=' + process.env.UNITTEST_DB_PASSWORD);
    await execSync('export PORT=' + process.env.UNITTEST_API_BACKEND_PORT);
    let child_process_obj = await execSync('node main.js');
    unittest_api_backend_process_id = child_process_obj.pid;
    resolve(true);
  });
}
TLDR: Just change process.env
To change, add or delete environment variables, use process.env. The following is test code showing how this works:
In main.js:
console.log(process.env.DB_DATABASE);
In exec.js:
const execSync = require('child_process').execSync;
process.env.DB_DATABASE = 'foo'; // this is ALL you need to do
console.log(execSync('node main.js').toString('utf8'));
With the two files above, if you run node exec.js you will see foo printed out in the console. This is printed from main.js which inherits the environment from exec.js.
So all you need to do in your code is:
function exec_api() {
  return new Promise(function (resolve) {
    const { spawn } = require('child_process');
    // The following is the node.js equivalent of bash "export":
    process.env.DB_HOST = process.env.UNITTEST_DB_HOST;
    process.env.DB_DATABASE = process.env.UNITTEST_DB_DATABASE;
    process.env.DB_USERNAME = process.env.UNITTEST_DB_USERNAME;
    process.env.DB_PASSWORD = process.env.UNITTEST_DB_PASSWORD;
    process.env.PORT = process.env.UNITTEST_SERVICE_PORT;
    // Use spawn here: unlike exec, it honors the stdio option.
    let child_process_obj = spawn('node', ['main.js'], {
      stdio: ['inherit', 'inherit', 'inherit']
    });
    unittest_api_backend_process_id = child_process_obj.pid;
    resolve(true);
  });
}
Note that if you want the promise to resolve when main.js ends, you need to do:
function exec_api() {
  return new Promise(function (resolve) {
    const { spawn } = require('child_process');
    // The following is the node.js equivalent of bash "export":
    process.env.DB_HOST = process.env.UNITTEST_DB_HOST;
    process.env.DB_DATABASE = process.env.UNITTEST_DB_DATABASE;
    process.env.DB_USERNAME = process.env.UNITTEST_DB_USERNAME;
    process.env.DB_PASSWORD = process.env.UNITTEST_DB_PASSWORD;
    process.env.PORT = process.env.UNITTEST_SERVICE_PORT;
    let child_process_obj = spawn('node', ['main.js'], {
      stdio: ['inherit', 'inherit', 'inherit']
    });
    unittest_api_backend_process_id = child_process_obj.pid;
    child_process_obj.on('exit', () => resolve(true));
    // ^^^ Cannot use `await` here as this API is not promise based
    // but event based instead.
  });
}
Long story: The full explanation of why export doesn't work
On unixen, environment variables, and indeed the entire environment (including the current working directory, the root directory which can be changed via chroot, etc.), are not features of shells. They are features of processes.
We may be familiar with the export syntax some shells use to set environment variables for child processes, but that is the shell's syntax; it has nothing to do with environment variables themselves. C/C++, for example, don't use export; they use the setenv() function to set environment variables (indeed, internally that's what bash/sh/ksh etc. do when implementing export).
In node.js, the mechanism for reading and setting environment variables is via process.env.
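For example:
process.env.DB_HOST = 'localhost'; // set (the equivalent of export)
console.log(process.env.DB_HOST); // read
delete process.env.DB_HOST; // delete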
Why asking a shell to do it doesn't work
This is not merely a node.js issue. It also won't work in bash:
In exporter.sh:
#! /bin/bash
export DB_DATABASE=$1
In exec.sh:
#! /bin/bash
./exporter.sh foo
echo $DB_DATABASE # does not print "foo"
This is a core security feature of unixen: other users should not be allowed to mess with your processes. The way this policy is enforced for the environment is that only a parent process can set the environment of a child process; a child process is not allowed to set the environment of its parent. The assumption is that the child process belongs to the parent process, so you may do what you want to a program you start, but since the parent process does not belong to the child, the child is not allowed to mess with the parent's environment.
That's why your attempt to use export doesn't work. It does actually work (the variables are indeed created in the subshell), but the subshell is not allowed to change the environment of its parent (the node.js process).
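You can demonstrate this from Node itself; a minimal sketch:
const { execSync } = require('child_process');

process.env.DEMO = 'before';
execSync('export DEMO=after'); // runs in a child shell; the parent is unaffected
console.log(process.env.DEMO); // still prints "before"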
When you use export in a terminal, it instructs the shell to set environment variables.
When you call exec from your code, you are not keeping such a shell around: each command runs in its own short-lived shell (otherwise it would become a challenge to extract the output of every command).
This makes the export a no-op as far as your process is concerned.
You can solve this by passing an option object to execSync:
execSync('node main.js', {
  env: {
    DB_HOST: 'localhost',
    // More envs...
  }
});
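Note that the env option replaces the child's entire environment (PATH included), so you will usually want to merge in the parent's variables. A minimal sketch:
const { execSync } = require('child_process');
// Spread the parent env first, then override what you need.
execSync('node main.js', {
  env: { ...process.env, DB_HOST: 'localhost' }
});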

Commander.js - Implementing subcommands that execute when the previous one is finished

I'm using Commander.js to write my own CLI. I managed to write commands that work individually, but now I need to implement subcommands; the docs are a bit vague and confusing, so I haven't been able to figure it out.
What I want is for the connect command to connect to a MongoDB instance and, once it has done that, proceed to execute the get command. How can I achieve this?
These are the commands and package.json:
./package.json:
{
...
"main": "./commands/my-cli.js",
"bin": "./commands/my-cli.js",
...
}
./commands/my-cli.js:
const commander = require('commander');
const program = new commander.Command();
const connect = require('./my-cli-connect');
const get = require('./my-cli-get');
// Initialize each command
connect(program);
get(program);
./commands/my-cli-connect.js:
const { MongoClient } = require('mongodb');
const assert = require('assert');

function connect(program) {
  program
    .command('connect <db> <username> <password>', 'Connects to a database')
    .action((db, username, password) => {
      MongoClient.connect(<some-mongo-url>, { useNewUrlParser: true }, (err, connection) => {
        assert.equal(null, err, 'Failed to connect to MongoDB instance');
        // Continue here with the get command
      });
    });
  program.parse(process.argv);
}
module.exports = connect;
./commands/my-cli-get.js:
function get(program) {
  program
    .command('get <collection>')
    .option('-q,--query <query>', 'Search terms', jsonParser, {})
    .description('Returns documents from a MongoDB collection')
    .action(action);
  program.parse(process.argv);

  function action(collection, options) {
    // This never runs
    console.log('hello world');
  }
}
module.exports = get;
Running my-cli --help shows these available commands:
...
Commands:
connect <db> <username> <password> Connects to a database
help [cmd] display help for [cmd]
Example command execution that should call both connect and then get when connect has finished connecting:
$ my-cli connect myDb myUser myPass get users -q '{"email": "foo@gmail.com"}'
Right now the get command's action function never runs.

Knex migration not working when using migration API

I am new to Knex migrations and for the past 2 days I have been struggling to get them working, but nothing happens. I am trying to run my migrations programmatically using the knex.migrate object.
First, using the CLI, I create a migration file in the migrations directory. Here is its content:
exports.up = function(knex, Promise) {
  return Promise.all([
    knex.schema.createTable('users', function (table) {
      table.increments('id').primary();
      table.string('username');
      table.string('password');
      table.string('email');
      table.string('name');
      table.timestamp('date');
    }),
  ]);
};

exports.down = function(knex, Promise) {
};
Then from my code I initialize the Knex object:
var knex = Knex({
  client: 'sqlite3',
  connection: {
    filename: './knex.sqlite'
  }
});
Then I execute the migration:
knex.migrate.latest().then(() => {
  // console.log()
}).catch(err => {
  //
});
But absolutely nothing happens. My migration file is never executed, and there is no error or warning message, so I don't know where to start searching for the problem. When I look at my SQLite database, I can see that the tables knex_migrations, knex_migrations_lock and sqlite_sequence have been created.
So what am I doing wrong here? Is there something I am missing?
Thanks for any suggestions.
There's no requirement to use the CLI tools. Sometimes it's not possible to use them due to their limitations, and in that case it's indeed possible to use the migration API directly, like so:
const knex = require('knex')({
  // Here goes the Knex config
});

const migrationConfig = {
  directory: __dirname + '/migrations',
};

console.info('Running migrations in: ' + migrationConfig.directory);

knex.migrate.latest(migrationConfig).then(([batchNo, log]) => {
  if (!log.length) {
    console.info('Database is already up to date');
  } else {
    console.info('Ran migrations: ' + log.join(', '));
  }
  // Important to destroy the database connection, otherwise the Node script
  // won't exit because Knex keeps open handles.
  knex.destroy();
});
There were two issues in the original question:
The migration directory was not specified; in this case Knex is not smart and will simply do nothing instead of throwing an error. Most likely the default used by Knex is not right, so it's best to specify it.
knex.destroy() was missing. In this case, the script will never exit because Knex keeps open handles on the database, so it just looks like it's stuck doing nothing.
The above script also outputs more log info so you can see what exactly is happening.
Knex migrations are supposed to be run by the Knex CLI; FYI: https://knexjs.org/#Migrations
Looking at your code, I found a strange issue:
knex.migrate is actually undefined; it's not a property of knex.

Initializing postgres db with sequelize

I'm writing a command line utility to initialize a Postgres database by dropping all tables, creating the postgis extension, and then initializing my models.
The extension needs to be created because my models depend on it.
I'd like to know the "Sequelize way" to do this. For example, would I do this in a seeder and then call sequelize db:seed?
The SQL looks like this:
-- Drop all tables
drop schema public cascade;
create schema public;
-- Add PostGIS support
CREATE EXTENSION postgis;
// Edit: Rereading your question, I see I missed the explicit request for the Sequelize way to do this; the below might be helpful but is probably not directly relevant.
I have wrestled with this problem in the apps I've built. The simplest solution (below) is a shell script with a JS script for the Sequelize calls. In other apps I use a number of Python classes along with a couple of JS scripts. This is crude, but flexible and it works.
run.sh:
#!/bin/bash
DB_NAME="sqpg"
DB_USER="sequelize"
DB_PW="sequelize"
DB_SCHEMA="s00"
export BLUEBIRD_DEBUG=1
sudo -u postgres psql -c "DROP SCHEMA IF EXISTS $DB_SCHEMA CASCADE" "$DB_NAME"
sudo -u postgres psql -c "CREATE SCHEMA $DB_SCHEMA AUTHORIZATION $DB_USER;" "$DB_NAME"
sudo -u postgres psql -c "CREATE EXTENSION \"uuid-ossp\" SCHEMA $DB_SCHEMA;" "$DB_NAME"
node populate-db.js
populate-db.js:
'use strict';
var Sequelize = require('sequelize');

var sq_options = { /* ... */ };
var sq = new Sequelize('sqpg', 'sequelize', 'sequelize', sq_options);

var models = {
  Foo: sq.define('Foo', {
    /* ... */
  }),
  Bar: sq.define('Bar', {
    /* ... */
  })
};

models.Foo.belongsToMany(models.Bar, { through: 'foo_bar' });
models.Bar.belongsToMany(models.Foo, { through: 'foo_bar' });

sq.sync({ force: true })
  .then(a_function_to_create_some_instances)
  .then(a_function_to_create_some_more_instances)
  .catch(function(err) {
    console.warn('Rejected promise: ' + err);
    console.warn(err.stack);
  })
  .finally(function() {
    sq.close();
  });
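As for the "Sequelize way" asked about: one option is to run the raw SQL from the question through sequelize.query() before syncing the models. A minimal sketch, assuming the connecting role is allowed to drop the schema and create extensions:

const Sequelize = require('sequelize');
const sequelize = new Sequelize('postgres://sequelize:sequelize@localhost/sqpg');

async function initDb() {
  // Drop all tables, then recreate the schema and PostGIS support.
  await sequelize.query('DROP SCHEMA public CASCADE;');
  await sequelize.query('CREATE SCHEMA public;');
  await sequelize.query('CREATE EXTENSION IF NOT EXISTS postgis;');
  // Define/require models here, then create their tables:
  await sequelize.sync({ force: true });
  await sequelize.close();
}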
