copy values from sqlite3 db to a global array in node.js

I have used the node_sqlite3 module and tried the following example:
var sqlite = require('sqlite3').verbose();
var db = new sqlite.Database("/var/www/signals/db/app3.db");
var matrixSignals = new Array();
var i = 0;
db.all('SELECT * FROM tbl_signals', function(err, rows) {
    rows.forEach(function(row) {
        matrixSignals[i] = new Object();
        matrixSignals[i].signalID = row.signalID;
        matrixSignals[i].connID = row.connID;
        i++;
    });
    db.close();
    console.log('1:' + matrixSignals.length);
});
console.log('2:' + matrixSignals.length);
In console output 1 the length is correct, but in console output 2 the length is always 0. How do I set matrixSignals as a global variable?

The reason this doesn't work has to do with how Node.js operates in general. In Node, I/O such as a database query runs asynchronously: after you fire off the query, execution continues straight to the next statement, so at the time you log output 2, matrixSignals still has a length of 0. Output 1 is only logged after the query has finished, which is why it shows the correct result.
For this reason, the simple answer to your question is that there is no way to set matrixSignals as a global variable and read it immediately. If all of your logic is truly dependent on the values in that array, then that logic should live in the callback to the database call, so that it only executes once the data has been retrieved from the database. If you just want the syntax to be cleaner, you could potentially use something like node-promise (https://github.com/kriszyp/node-promise), but I think that's probably more effort than it's worth.
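A minimal sketch of that structure, with the database call simulated by setImmediate (loadSignals and the row values here are hypothetical names, not part of node_sqlite3):

```javascript
// Hypothetical sketch: the query is simulated, and all logic that needs
// matrixSignals lives inside (or is called from) the callback.
function loadSignals(callback) {
    // stands in for db.all('SELECT * FROM tbl_signals', ...)
    setImmediate(function () {
        var rows = [{ signalID: 1, connID: 10 }, { signalID: 2, connID: 20 }];
        var matrixSignals = rows.map(function (row) {
            return { signalID: row.signalID, connID: row.connID };
        });
        callback(matrixSignals);
    });
}

loadSignals(function (matrixSignals) {
    // everything that depends on the data goes here
    console.log('1:' + matrixSignals.length);
});
// code here still runs before the callback above fires
```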

Related

How to control serial and parallel control flow with mapped functions?

I've drawn a simple flow chart for a task that basically crawls some data from the internet and loads it into a database. Until now I thought I was at peace with promises, but I've been stuck on this issue for at least three days without making a single step of progress.
Here is the flow chart:
Consider there is a static string array like so: const courseCodes = ["ATA", "AKM", "BLG", ...].
I have a fetch function; it basically does an HTTP request followed by parsing, and then returns an object array.
fetch works perfectly, invoking its callback with the expected object array; it even worked with Promises, which was far tidier.
The fetch function should be invoked with every element of the courseCodes array as its parameter. These calls should run in parallel, since the separate fetch calls do not affect each other.
As a result, there should be a results array in the callback (or the Promise's resolve parameter) containing an array of arrays of objects. With those results, I should invoke my loadCourse function with each object in the results array as its parameter. Those tasks should run serially, because each one queries the database to see whether a similar object exists and adds it if it doesn't.
How can I perform this kind of task in node.js? I could not maintain the asynchronous flow in a scenario like this. I've failed with the caolan/async library and with the bluebird & q promise libraries.
Try something like this:
const courseCodes = ["ATA", "AKM", "BLG", ...];
// stores the tasks to be performed
var parallelTasks = [];
var serialTasks = [];
// keeps track of courses fetched & results
var courseFetchCount = 0;
var results = {};
// your fetch function; invoke done() once the request & parsing finish
function fetch(course_code, done) {
    // your code to fetch & parse.
    // store the result for each course in the results object
    results[course_code] = 'whatever result comes from your fetch & parse code...';
    done();
}
// your load function
function loadCourse(results) {
    for (var index in results) {
        var result = results[index]; // result for a single course
        var task = (function(result) {
            return function() {
                saveToDB(result);
                // with a real async save, call nextInSerial from its callback instead
                nextInSerial(null, result);
            };
        })(result);
        serialTasks.push(task);
    }
    // execute the serial tasks for saving results to the database
    nextInSerial(null, null);
}
// pseudo function to save a result to the database
function saveToDB(result) {
    // your code to store it in the db here
}
// checks whether fetch() has completed for all course codes in your array
// and then starts the serial tasks for saving results to the database
function checkIfAllCoursesFetched() {
    courseFetchCount++;
    if (courseFetchCount === courseCodes.length) {
        // now process the courses serially
        loadCourse(results);
    }
}
// helper function that executes tasks in serial fashion
function nextInSerial(err, result) {
    if (err) throw new Error(err.message);
    var nextSerialTask = serialTasks.shift();
    if (nextSerialTask) nextSerialTask(result);
}
// build the parallel tasks for fetching
for (var index in courseCodes) {
    var course_code = courseCodes[index];
    var task = (function(course_code) {
        return function() {
            fetch(course_code, checkIfAllCoursesFetched);
        };
    })(course_code);
    parallelTasks.push(task);
}
// start executing the parallel tasks
for (var task_index in parallelTasks) {
    parallelTasks[task_index]();
}
Or you may refer to the nimble npm module.
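For comparison, a Promise-based sketch of the same flow (parallel fetches, then serial saves); fetchCourse, runFlow, and the in-memory saveToDB below are simulated stand-ins, not a real API:

```javascript
const courseCodes = ['ATA', 'AKM', 'BLG'];

// stands in for the HTTP request + parsing; resolves with an object array
function fetchCourse(code) {
    return Promise.resolve([{ course: code, name: code + '-101' }]);
}

function runFlow(codes) {
    const saved = [];
    // stands in for the "insert if not present" database query
    const saveToDB = result =>
        Promise.resolve().then(() => { saved.push(result); });
    // fetch in parallel, then chain the saves serially with reduce
    return Promise.all(codes.map(fetchCourse))
        .then(results =>
            results.reduce(
                (chain, result) => chain.then(() => saveToDB(result)),
                Promise.resolve()
            )
        )
        .then(() => saved);
}

runFlow(courseCodes).then(saved => console.log('saved ' + saved.length + ' results'));
```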

Chaining nested asynchronous finds with Node.js monk and MongoDB

Using Node.js monk and MongoDB, I want to mimic a table join:
Do a find on collection A
For each result X, do a find in collection B, and update X
Return updated list of results
The asynchronous nature of database commands in monk is giving me trouble.
This is my initial code. It doesn't work because the second call to find returns a promise immediately,
and the results in xs are sent in the response before they can be updated.
var db = require('monk')('localhost/mydb');
db.get('collection').find({}, function(e, xs) {
    xs.forEach(function(x) {
        coll_b.find({a_id: x._id}, function(e, bs) {
            x['bs'] = bs;
        });
    });
    res.json({'results': xs});
});
I feel like I should use promise chaining here, but I cannot figure out how to do it.
Any help would be greatly appreciated.
I think I solved it in this way, inspired by this answer:
var db = require('monk')('localhost/mydb');
// Initial find
db.get('collection').find({}, function(e, xs) {
    // Get the inner finds as a list of functions which return promises
    var tasks = xs.map(function(x) {
        return function() {
            return coll_b.find({a_id: x._id}, function(e, bs) {
                x['bs'] = bs;
            });
        };
    });
    // Chain the tasks together
    var p = tasks[0](); // start the first one
    for (var i = 1; i < tasks.length; i++) p = p.then(tasks[i]);
    // After all tasks are done, output the results
    p.then(function(_x) {
        res.json({'results': xs});
    });
});
I still feel like this code could be minimised by using chain(), but at least this works as expected.
Note: I realise that performing a second find for each result is not necessarily efficient, but that's not my concern here.
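The chaining above can also be written with reduce; here is a hypothetical sketch with the two finds simulated (findA and findB stand in for the monk collection calls):

```javascript
// simulated stand-ins for the two monk finds
function findA() {
    return Promise.resolve([{ _id: 1 }, { _id: 2 }]);
}
function findB(aId) {
    return Promise.resolve([{ a_id: aId, val: 'b' + aId }]);
}

function joined() {
    return findA().then(function (xs) {
        // chain the inner finds serially, attaching each result to its parent
        return xs.reduce(function (chain, x) {
            return chain
                .then(function () { return findB(x._id); })
                .then(function (bs) { x.bs = bs; });
        }, Promise.resolve()).then(function () { return xs; });
    });
}

joined().then(function (xs) { console.log(JSON.stringify(xs)); });
```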

NODE.js working with update query at Server end

QUESTION:
I am writing node.js code on the server which accepts multiple values from the client on a certain event:
var table = data['table'];
var columnName = data['colName'];
var columnValue = data['colValue'];
var primary_id = data['pid'];
var updateQuery = "UPDATE " + table + " SET " + columnName + "=? WHERE primary_id=" + primary_id;
var query = conn.query(updateQuery, [columnValue], function (err, result) {
    if (err) throw err;
    console.log('changed ' + result.changedRows + ' rows');
});
console.log(query.sql);
// This shows the exact query I wanted to run against my MySQL db; it also executes successfully if I run it manually.
Problem:
The query formed is correct, yet it does not execute through the Node server.
Node shows no error in the anonymous callback function, and no result either.
NOTE: I have tried running a simple SELECT through Node and that works perfectly [all inclusions are made correctly for the mysql module and its connection object].
It would be great if somebody could shed some light on how to debug this, in terms of what kind of error it is hitting in the back-end.
It was a silly mistake ... I was destroying the connection each time.
Due to the asynchronous nature of JS, which Node uses, execution flowed straight on without waiting for the query to execute.
..Yes, of course there are ways in Node to make this synchronous,
but for now, when I removed that line it worked like a charm.
//conn.destroy();
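A small simulation of why destroying too early loses the query; conn here is a stub, not the real mysql connection object (with the real module, moving conn.destroy() into the query callback has the same effect):

```javascript
// stub connection: query runs asynchronously, destroy() cuts it off
function makeConn() {
    var destroyed = false;
    return {
        query: function (sql, params, cb) {
            setImmediate(function () {
                if (destroyed) return; // connection gone: callback never fires
                cb(null, { changedRows: 1 });
            });
        },
        destroy: function () { destroyed = true; }
    };
}

var conn = makeConn();
conn.query('UPDATE ...', [42], function (err, result) {
    console.log('changed ' + result.changedRows + ' rows');
    conn.destroy(); // safe here: the query has already completed
});
// calling conn.destroy() at this point instead would silently drop the query
```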

Call back methods with sqlite in node js

I'm trying to create a new program using node.js which reads a SQLite DB and creates an XML document. I created some sub-methods, and when I try to return the DB result using a callback method, it's not passing anything.
Method to read and create xml part:
function dbpic(id, callback) {
    db1.all("SELECT * FROM pictures left join item_pics on item_pics.picid = pictures.picid where item_pics.itemid = " + id, function(err1, rows1) {
        var ee = '';
        rows1.forEach(function(row1) {
            ee += '<picture>';
            ee += '<picturename>' + row1.pic + '</picturename>';
            ee += '<link>' + row1.url + '</link>';
            ee += '</picture>';
        });
        console.log(ee);
        callback(ee);
    });
}
Calling this method using:
dbpic(row.iid, function(df) {
    ss += df;
});
Variable ss is displayed later, but it gets nothing from the callback method. The database contains the relevant data and returns the correct results; I checked them using console.log.
When I log df to the console, it displays the results correctly, but ss has not stored df's data.
According to your comments you have some trouble understanding the asynchronous concept. This is how things happen in your code:
// STEP 1
ss = "initalvalue";
dbpic(row.iid, function(df) {
    // STEP 3, 4, 5, 6 ...
    ss += df;
});
// STEP 2
ss += "ending";
You need to wait until the last callback is called until you display the results.
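One common fix is to count the pending callbacks and only use ss once the last one has fired; a hypothetical sketch with dbpic simulated (buildXml is an illustrative name, not from the original code):

```javascript
// simulated stand-in for the real dbpic
function dbpic(id, callback) {
    setImmediate(function () {
        callback('<picture><picturename>pic' + id + '</picturename></picture>');
    });
}

function buildXml(ids, done) {
    var ss = '';
    var pending = ids.length;
    ids.forEach(function (id) {
        dbpic(id, function (df) {
            ss += df;
            pending--;
            if (pending === 0) done(ss); // every callback has fired
        });
    });
}

buildXml([1, 2], function (ss) {
    console.log(ss); // safe to use ss here
});
```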

I'm having issues passing an array from my node.js app to my javascript client

I have live chat on my website and right now it's just polling. I want to get with the times and replace this with a node.js version. I've been making good progress but am now stuck on something that appears to just be a syntax issue.
So what I'm doing is when the user first comes to my site, I want to show them the most recent chat that's in the mysql database. So it will be rows of
time user their message
I'm sending this data on user connect from my app.js file with
io.sockets.on('connection', function (socket) {
    // Make connection to mysql
    connection.query('select values from my_table limit 5', function(err, rows, fields) {
        user_info = new Array();
        for (i = 0; i < rows.length; i++) {
            user_info[i] = new Array();
            user_info[i]['time'] = rows[i].time;
            user_info[i]['user'] = rows[i].user;
            user_info[i]['their_message'] = rows[i].their_message;
        }
        io.sockets.emit('some_event_tag', user_info);
    });
});
So this works, except that when I try to access this data in the handler for "some_event_tag", it appears my syntax is off, because I'm not getting the information. On the client, I'm trying to access the data with
some_event_tag: function(data) {
    var msg = '';
    for (i = 0; i < data.length; i++) {
        msg = $('<div class="msg"></div>')
            .append('<span class="name">' + data[i]['user'] + '</span>: ')
            .append('<span class="text">' + data[i]['their_message'] + '</span>');
        $('#messages').append(msg);
        msg = '';
    }
},
but for whatever reason I'm getting "undefined". On the server side, if I change
io.sockets.emit('some_event_tag', user_info);
to something like
io.sockets.emit('some_event_tag', user_info[0]['user']);
I can access this value on the client (by just saying "data"). Of course, in this case, I'm only passing one value. Not what I want. On the client side, I can also correctly see that five array elements are being passed. In other words, data.length is correctly set to 5. However, I can't actually figure out the syntax to access the data on the client side. Isn't this just typical javascript arrays? I'm a little stumped at this point and google didn't help me this time. Any help you guys can give would be greatly appreciated.
JavaScript has a rather strong assumption that Arrays use numeric keys between 0 and length-1. While they can still have other keys, most functions for handling Arrays will ignore them -- including JSON.stringify() that Socket.IO uses.
So, if you need to set other keys and don't want them skipped, you'll want to use a plain Object instead:
var user_info = new Array();
for (var i = 0; i < rows.length; i++) {
    user_info[i] = new Object();
    // ...
}
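The difference is easy to see with JSON.stringify, which is what Socket.IO uses under the hood when serializing the payload:

```javascript
// an Array with a string key: JSON serialization drops the key
var asArray = [];
asArray[0] = [];
asArray[0]['user'] = 'alice';

// an Object element with the same key: the key survives
var asObject = [];
asObject[0] = { user: 'alice' };

console.log(JSON.stringify(asArray));  // [[]]  - 'user' is lost
console.log(JSON.stringify(asObject)); // [{"user":"alice"}]
```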
I suspect producing an SSCCE would highlight your issue, among others:
// The time is the user and the user is the time? ;)
user_info[i]['time'] = rows[i].user;
user_info[i]['user'] = rows[i].time;
If rows[i].time is something date-like, then user_info[i]['user'] is likely to be something date-like. A question arises as to whether or not io.sockets.emit can emit date-like things:
io.sockets.emit('some_event_tag', user_info[0]['user']);
There appears to be no documentation for io.sockets.emit. That's not entirely helpful, is it? Oh well. I'm confident. Keep us updated, right?