Electron: Perform sqlite (better-sqlite) db operations in another thread - node.js

I'm developing a desktop application using the Electron framework, and I have to use an SQLite database for the app's data.
I decided to use better-sqlite3 because of:
Custom SQL function support (very important for me)
It's much faster than node-sqlite3 in most cases
It's simple to use
Its synchronous API (in most cases I need to get data serially)
But in some cases, when I run a query that takes a while to complete, the application UI stops responding until the query finishes.
How can I run some DB queries in another thread, or run them asynchronously (like node-sqlite3)?

Node gives you a separate process out of the box. (Threads are a different matter: alas, no WebWorkers, though you can probably find a thread add-on library somewhere.)
EDIT: Node has added worker_threads since I originally posted this answer. I haven't tried it yet, and I don't know whether it works with better-sqlite3. END EDIT
I've had the same issue as you: needing synchronous code to run without blocking the main thread. I used a child process, and it was for better-sqlite3 too!
The problem is that handling the I/O streams, SIGINTs, etc. for control is not immediately obvious, and it differs depending on whether you're running on Windows or a POSIX system.
I use a forked child process with the silent option set to true to do the synchronous DB work.
If you need control of that process, or progress reports back to your main process for your GUI during synchronous operations, you can control/communicate with the child process by reading/writing its stdin/stdout with writeFileSync/readFileSync at various points in the child's code. (You can't use the normal inter-process communication API during synchronous operations, because it is event-driven and can't operate while synchronous code is running. You can, however, mix and match the two types of I/O.)
An example of a forked child process:
//parent.js and child.js in the same folder

//parent.js
process.on('exit', (code) => {
  console.log(`Parent to exit with code: ${code}`);
});

const readLine = require("readline");
const cp = require('child_process');

var forkOptions = {
  //execArgv: ['--inspect-brk'], // uncomment if debugging the child process
  silent: true // child gets its own std pipes (important), which are piped to the parent
};
var childOptions = [];
const child = cp.fork(`./child.js`, childOptions, forkOptions);

// for messages sent from the child via writeFileSync
const childChannel = readLine.createInterface({
  input: child.stdout
}).on("line", function(input) {
  console.log("writeFileSync message received from child: " + input);
});

// for messages sent from the child via process.send
child.on('message', (m) => {
  console.log("process.send message received from child: " + m);
});

//child.js
process.on('exit', (code) => {
  console.log(`Child to exit with code: ${code}`);
});

const fs = require('fs');

function doSyncStuff() {
  for (let i = 0; i < 20; i++) {
    // e.g. sync db calls happening here
    process.send(`Hello via process.send from child. i = ${i}\n`); // async comms, picked up by the parent's "child.on" listener
    fs.writeFileSync(process.stdout.fd, `Hello via writeFileSync from child. i = ${i}\n`); // sync comms, picked up by the parent's readline listener ("process" here is the child)
  }
}
doSyncStuff();

Related

Node.JS - Fork blocks parent loop execution

I'm trying to 'multi-thread' using fork, starting up a new process for each item. The problem is that once the program executes and the first fork runs, the whole parent code is blocked by the executing fork.
I thought fork just returned a process object?
Here is a stripped back version of the code I am using:
// I am looping over an array and sending an object to each fork over IPC.
let array = ["a0", "a1", "a2"];
// Blocking Code
array.forEach((arrayItem) => {
  const forked = fork('./child.js');
  // IPC code to send object...
  // Capture IPC child messages
  forked.on('message', (msg) => {
    // Handle message...
  });
});
It's worth adding what I want to happen, for clarity!
Using forEach, I'd be able to create multiple child processes that all communicate over IPC without blocking the parent's execution.

Nodejs child process exit before stdio streams close

I've just been experimenting with child processes and noticed that the exit event fires before the close event. The following code throws an error because this._instance.stdin no longer exists (this._instance is already null):
'use strict';
const spawn = require('child_process').spawn;

class Foo {
  constructor() {
    this._instance = null;
  }
  bar() {
    this._instance = spawn('ls', ['-l']);
    this._instance.on('close', (code, signal) => {
      this._instance.stdin.end();
    });
    this._instance.on('exit', (code, signal) => {
      this._instance = null;
    });
    return this._instance;
  }
}

var foo = new Foo();
console.log(foo.bar());
The documentation states:
"Note that when the 'exit' event is triggered, child process stdio streams might still be open."
I wondered how this happens. Why do the streams still exist after the process has exited? And how do they get closed? Is that handled by the OS, or does Node close the leftover stdio streams?
In practice I wouldn't necessarily set this._instance to null on exit as it doesn't seem like a nice thing to do and is obviously a bit premature.
The documentation of the close event sheds some light on why this might happen.
In short, the stdio streams can still be in use by other processes that have not yet exited.
I haven't looked into the code itself, but it would make sense for the OS to handle closing the stdio streams. Think about piping stdio into multiple processes (piping with tee might be a good example).
In the presented case, I would suspect that you don't even need to end() stdin, as the close event suggests that the stdin stream has already been closed.

How to get child process to write to an http response in Node?

I have a big job that I fork to a child process. But I want the child process to handle the response instead of the main thread. So the child process generates a big old JSON object, but I don't want it to send it BACK to the main process. I just want it to send the response back itself.
function doWork(req,res) {
// CALL CHILD PROCESS, And have the child res.json(bigObject)
}
app.get('/dowork', doWork);
I'd like to pass the response ('res') so that the child writes back to it. Is there a way to do this in Node?
Here's a simple Express server that echoes arguments by way of a child process:
const express = require('express');
var child = require("child_process");
const app = express();
const port = 3000;

app.post('/echo', (req, res) => {
  var cp = child.spawn(
    "xargs",                    // subprocess
    ["echo"],                   // arguments to subprocess
    { "stdio": ["pipe", "pipe", process.stderr] }
  );
  cp.stdout.pipe(res);
  req.pipe(cp.stdin);
});

app.listen(port, () => {
  console.log(`Example app listening at http://localhost:${port}`);
});
The only real trick is knowing that in Express, the request and response objects are readable/writable streams.
There is no way to do this currently. You used to be able to send file descriptors to child processes a couple of stable branches ago, but that functionality has since been removed. You could also set customFds at spawn time, but that has been deprecated for quite some time now.
Even if you could pass the socket to the child process, you would still have to recreate the response object somehow.
Your best bet is to have the child send the parent the status code and any headers, set those on the response object in the parent, and then pipe the rest of the output from the child to the response (e.g. child.stdout.pipe(res);).

How can I execute a node.js module as a child process of a node.js program?

Here's my problem. I implemented a small script that does some heavy calculation, as a node.js module. So, if I type "node myModule.js", it calculates for a second, then returns a value.
Now, I want to use that module from my main Node.JS program. I could just put all the calculation in a "doSomeCalculation" function then do:
var myModule = require("./myModule");
myModule.doSomeCalculation();
But that would be blocking, thus it'd be bad. I'd like to use it in a non-blocking way, like DB calls natively are, for instance. So I tried to use child_process.spawn and exec, like this:
var spawn = require("child_process").spawn;
var ext = spawn("node ./myModule.js", function(err, stdout, stderr) { /* whatevs */ });
ext.on("exit", function() { console.log("calculation over!"); });
But, of course, it doesn't work. I tried to use an EventEmitter in myModule, emitting "calculationDone" events and trying to add the associated listener on the "ext" variable in the example above. Still doesn't work.
As for forks, they're not really what I'm trying to do. Forks would require putting the calculation-related code in the main program, forking, calculating in the child while the parent does whatever it does, and then how would I return the result?
So here's my question: can I use a child process to do some non-blocking calculation, when the calculation is put in a Node file, or is it just impossible? Should I do the heavy calculation in a Python script instead? In both cases, how can I pass arguments to the child process - for instance, an image?
I think what you're after is the child_process.fork() API.
For example, if you have the following two files:
In main.js:
var cp = require('child_process');
var child = cp.fork('./worker');

child.on('message', function(m) {
  // Receive results from child process
  console.log('received: ' + m);
});

// Send child process some work
child.send('Please up-case this string');
In worker.js:
process.on('message', function(m) {
  // Do work (in this case, just up-case the string)
  m = m.toUpperCase();
  // Pass results back to parent process
  process.send(m);
});
Then to run main (and spawn a child worker process for the worker.js code ...)
$ node --version
v0.8.3
$ node main.js
received: PLEASE UP-CASE THIS STRING
It doesn't matter what you use as the child (Node, Python, whatever); Node doesn't care. Just make sure your calculation script exits when everything is done and the result is written to stdout.
The reason your version isn't working is the way you're calling spawn: you're passing it a single command string and a callback, which is exec's signature, not spawn's.

Node.JS Parent Process ID

Is it possible to get the parent process ID using Node.js? I would like to detect whether the parent has been killed, or has failed in a way that prevents it from notifying the child. If this happens, the child's parent process ID should become 1.
This would be preferable to requiring the parent to periodically send a keep-alive signal, and also preferable to running the ps command.
You can use a pid-file. Something like this:
var util = require('util'),
    fs = require('fs'),
    pidfile = '/var/run/nodemaster.pid';

try {
  var pid = parseInt(fs.readFileSync(pidfile, 'ascii'), 10);
  // REPLACE with your signal, or use another method to check process existence :)
  process.kill(pid, 'SIGUSR2');
  util.puts('Master already running');
  process.exit(1);
} catch (e) {
  fs.writeFileSync(pidfile, process.pid.toString(), 'ascii');
}
// run your children here
//run your childs here
You can also pass the parent's pid as an argument in the spawn() call.
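If the child knows the parent's pid (passed as an argument, or read from a pid-file as above), it can poll for the parent's existence with signal 0, which delivers no signal and only performs the existence check. A small sketch:

```javascript
// Returns true if a process with the given pid exists. Signal 0 sends
// nothing; it only performs the existence/permission check.
function isAlive(pid) {
  try {
    process.kill(pid, 0);
    return true;
  } catch (err) {
    return err.code === 'EPERM'; // exists, but we lack permission to signal it
  }
}

// A child could poll this and exit when the parent disappears, e.g.:
// setInterval(() => { if (!isAlive(parentPid)) process.exit(0); }, 1000);
console.log(isAlive(process.pid)); // true: this process certainly exists
```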
I start Node.js from within a native OS X application as a background worker. To make Node.js exit when the parent process (which consumes Node.js's stdout) dies or exits, I do the following:
// Watch for the parent to exit: when it dies, our stdout pipe ends
process.stdout.resume();
process.stdout.on('end', function() {
  process.exit();
});
Easy as that, but I'm not exactly sure it's what you've been asking for ;-)
