I've written a simple service using Redis to store data in memory, or to fetch it from disk and then store it in memory. I'm now trying to handle the rare cases where fetching from Redis is slow. I've seen one example (https://gist.github.com/stockholmux/3a4b2d1480f27df8be67#file-timelimitedredis-js) which appears to solve this problem, but I've had trouble implementing it.
The linked implementation is:
/**
 * Returns a function that acts like the Redis command indicated by cmd,
 * except that it will time out after a given number of milliseconds.
 *
 * @param {string} cmd The Redis command to execute ('get', 'hset', 'sort', etc.)
 * @param {integer} timeLimit The number of milliseconds to wait before returning an error to the callback.
 */
function timeLimited(cmd, timeLimit) {
    return function() {
        var argsAsArr = Array.prototype.slice.call(arguments),
            cb = argsAsArr.pop(),
            timeoutHandler;
        // if the command hasn't called back within timeLimit ms, fail it
        timeoutHandler = setTimeout(function(){
            cb(new Error('Redis timed out'));
            cb = function() {}; // prevent the real callback from firing later
        }, timeLimit);
        // wrap the caller's callback so a timely response cancels the timer
        argsAsArr.push(function(err, values){
            clearTimeout(timeoutHandler);
            cb(err, values);
        });
        client[cmd].apply(client, argsAsArr);
    };
}
However, I don't understand how to implement this, because client is never defined and the Redis key/value are never passed in. Could someone explain a little about how one could go about implementing this example? I've been searching for more information or a working example, but haven't had any luck so far. Thank you.
This isn't very clearly written, but when you call it with a cmd (e.g. SET, HSET, etc.) and a time limit, it returns a function. You then call that returned function with the key/value arguments and a callback. I don't know where client comes from; I guess you need to have it in scope. This isn't very good code, so I would suggest posting what you've written and asking how to achieve what you want with that.
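For what it's worth, here is a minimal sketch of how it could be wired up, assuming client is a node_redis client created in the same scope (the key names and the 500 ms limit are just examples):

var redis = require('redis');
var client = redis.createClient(); // `client` must be in scope for timeLimited

// ... define timeLimited(cmd, timeLimit) as above ...

// build wrapped versions of the commands you need
var getWithTimeout = timeLimited('get', 500);
var setWithTimeout = timeLimited('set', 500);

// the key/value are passed when you call the returned function
setWithTimeout('mykey', 'myvalue', function (err, reply) {
    if (err) return console.error(err); // a Redis error or 'Redis timed out'
    getWithTimeout('mykey', function (err, value) {
        if (err) return console.error(err);
        console.log(value); // 'myvalue'
    });
});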
I have an application which checks for new entries in DB2 every 15 seconds on the iSeries, using IBM's idb-connector. I have async functions which return the result of the query to socket.io, which emits an event with the data included to the front end. I've narrowed the memory leak down to the async functions. I've read multiple articles on common memory-leak causes and how to diagnose them:
MDN: memory management
Rising Stack: garbage collection explained
Marmelab: Finding And Fixing Node.js Memory Leaks: A Practical Guide
But I'm still not seeing where the problem is. Also, I'm unable to get permission to install node-gyp on the system, which means most memory-management tools are off limits, as memwatch, heapdump and the like need node-gyp to install. Here's an example of the functions' basic structure.
const { dbconn, dbstmt } = require('idb-connector'); // require idb-connector

async function queryDB() {
    const sSql = `SELECT * FROM LIBNAME.TABLE LIMIT 500`;
    // create new promise
    let promise = new Promise(function(resolve, reject) {
        // create new connection
        const connection = new dbconn();
        connection.conn("*LOCAL");
        const statement = new dbstmt(connection);
        statement.exec(sSql, (rows, err) => {
            if (err) {
                throw err;
            }
            let ticks = rows;
            statement.close();
            connection.disconn();
            connection.close();
            resolve(ticks.length); // resolve promise with varying data
        });
    });
    let result = await promise; // await promise
    return result;
};
async function getNewData() {
    const data = await queryDB(); // get new data
    io.emit('newData', data); // push to front end
    setTimeout(getNewData, 2000); // check again in 2 seconds
};
Any ideas on where the leak is? Am I using async/await incorrectly? Or am I creating/destroying DB connections improperly? Any help figuring out why this code is leaky would be much appreciated!!
Edit: Forgot to mention that I have limited control over the backend processes, as they are handled by another team. I'm only retrieving the data they populate the DB with and adding it to a web page.
Edit 2: I think I've narrowed it down to the DB connections not being cleaned up properly. But as far as I can tell, I've followed the instructions suggested on their GitHub repo.
I don't know the answer to your specific question, but instead of issuing a query every 15 seconds, I might go about this in a different way, the reason being that I generally don't like fishing expeditions when the environment can tell me an event occurred.
So in that vein, you might want to try a database trigger that loads the key of the row into a data queue on add (or even on change or delete, if necessary). Then you can just make an async call to wait for a record on the data queue. This is more real-time, and the event handler is only called when a record shows up. The handler can get the specific record from the database, since you know its key. Data queues are much faster than database I/O, and place little overhead on the trigger.
I see a couple of potential advantages with this method:
You aren't issuing dozens of queries that may or may not return data.
The event would fire the instant a record is added to the table, rather than 15 seconds later.
You don't have to code for the possibility of one or more new records; it will always be exactly one, the one named in the data queue entry.
Yes, you have to close the connection. Don't make const data; you don't need the promise, since statement.exec is async by default and hands back the result itself. And keep the setTimeout(getNewData, 2000); // check again in 2 seconds
line outside getNewData, otherwise it becomes an infinite recursive loop.
Sample code
const {dbconn, dbstmt} = require('idb-connector');

const sql = 'SELECT * FROM QIWS.QCUSTCDT';
const connection = new dbconn(); // Create a connection object.
connection.conn('*LOCAL'); // Connect to a database.
const statement = new dbstmt(connection); // Create a statement object from the connection.
statement.exec(sql, (result, error) => {
    if (error) {
        throw error;
    }
    console.log(`Result Set: ${JSON.stringify(result)}`);
    statement.close(); // Clean up the statement object.
    connection.disconn(); // Disconnect from the database.
    connection.close(); // Clean up the connection object.
    return result;
});
async function getNewData() {
    const data = await queryDB(); // get new data
    io.emit('newData', data); // push to front end
    setTimeout(getNewData, 2000); // check again in 2 seconds
};

change to

async function getNewData() {
    const data = await queryDB(); // get new data
    io.emit('newData', data); // push to front end
};
setTimeout(getNewData, 2000); // check again in 2 seconds
The first thing to notice is that the database connection can be left open in case of an error:
if (err) {
throw err;
}
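Just as a sketch of what I mean (not a definitive fix): the error branch could release the handles and reject the promise instead of throwing inside the callback, where the throw never reaches the outer async function:

statement.exec(sSql, (rows, err) => {
    if (err) {
        // release the handles before bailing out, then reject the promise
        statement.close();
        connection.disconn();
        connection.close();
        return reject(err);
    }
    // ... success path unchanged ...
});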
Also, on success, connection.disconn(); and connection.close(); return boolean values that tell whether the operation succeeded (according to the documentation).
Another ever-present possibility is connection objects piling up inside the third-party library.
I would check those.
This was confirmed to be a memory leak in the idb-connector library that I was using. Link to the GitHub issue Here. Basically, there was a C++ array that never had its memory deallocated. A new version was released, and the commit can be viewed Here.
I'm writing an app in Node and have been running into a rare but detrimental occurrence.
So I have a schedule.txt, and I write to it when the user makes a change, but I also read it every second and then parse it for use throughout the program.
Rarely, what happens is that while a user is writing to the file (asynchronously), the app (based on the timer) reads the same file, attempts to parse it, and fails.
I know that from a design standpoint this is maybe just bound to happen... but I'm wondering if there is a quick fix I can do now. Would using writeFileSync help my situation (make it more 'atomic')? I just want to make sure that the app doesn't read the file while another process is still writing to it.
TIA!
Niko
Seems like you'd want to serialize your read/writes. If it were me, I might try having a "manager" object which encapsulates the serialization, which you'd use like:
var fileManager = require('./file-manager');

// somewhere in the program
fileManager.scheduleWrite(data, function(err){
    // now the write is done
});

// somewhere else in the program
fileManager.scheduleRead(function(err, data){
    // `data` contains the data
});
Then implement it using Q or a similar promises lib, like:
// in file-manager.js
var wait = Q();
module.exports = {
    scheduleWrite: function(data, cb){
        wait = wait.then(function(){
            // write data and call cb()
        });
    },
    scheduleRead: function(cb){
        wait = wait.then(function(){
            // read data and call cb(data)
        });
    }
};
The wait var will "stack up" into a serialized chain of tasks where the next one won't start until the previous one completes.
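To make that concrete, here is a minimal sketch of what file-manager.js might look like with the bodies filled in, using Node's built-in fs module and Q's nfcall helper (the schedule.txt path is just a placeholder):

// file-manager.js -- serializes all reads and writes of the schedule file
var Q = require('q');
var fs = require('fs');

var FILE = './schedule.txt'; // placeholder path

var wait = Q();
module.exports = {
    scheduleWrite: function(data, cb){
        wait = wait.then(function(){
            // the next queued task cannot start until this write settles
            return Q.nfcall(fs.writeFile, FILE, data)
                .then(function(){ cb(null); }, function(err){ cb(err); });
        });
    },
    scheduleRead: function(cb){
        wait = wait.then(function(){
            return Q.nfcall(fs.readFile, FILE, 'utf8')
                .then(function(data){ cb(null, data); }, function(err){ cb(err); });
        });
    }
};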
I'm parsing a large number of files using Node.js. In my process, I'm parsing audio files, video files and then the rest.
The function to parse files looks like this:
/**
 * @param arr : array of file objects (path, ext, previous directory)
 * @param cb : the callback called when every object has been parsed;
 *             objects are then thrown into a database
 * @param others : the array being populated by matching objects
 **/
var parseOthers = function(arr, cb, others) {
    others = others === undefined ? [] : others;
    if(arr.length == 0)
        return cb(others); // should be a nextTick?
    var e = arr.shift();
    // do some tests on the element and add it
    others.push(e);
    // Then tried calling setImmediate and nextTick, according
    // to other Stack Overflow questions, with no success
    return parseOthers(arr, cb, others);
};
Full code here (careful, it's a mess).
Now, with about 3565 files (not that many), the script throws a "RangeError: Maximum call stack size exceeded" exception, with no trace.
What I have tried:
I've tried to debug it with node-inspector and node debug script, but the error never occurs under the debugger, as if it only happened when running without debugging (does debugging increase the stack?).
I've tried process.on('uncaughtException') to catch the exception, with no success.
I've got no memory leak.
How can I find an exception trace?
Edit 1
Increasing the --stack_size seems to work pretty well. Isn't there another way of preventing this?
(about 1300 in my case)
Edit 2
According to:
$ node --v8-options | grep -B0 -A1 stack_size
The default stack size (in kBytes) is 984.
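For reference, raising the limit for a single run looks like this (2000 is just an example value, and the script name is hypothetical):
$ node --stack_size=2000 parse.js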
Edit 3
A few more explanations:
I never read these files themselves
I'm working on an array of paths here; I don't parse folders recursively
I'm looking at each path and checking whether it's already stored in the database
My guess is that the populated array becomes too big for Node.js, but memory looks fine, and that's weird...
Most stack overflow situations are not easy (or sometimes even possible) to debug. Even if you debug the problem, you may not find the trigger.
But I can suggest a way to share the task load easily (including the queue management):
JXcore (a multithreaded fork of Node.js) would suit your case better. Simply create a task pool and set a task method handling one file at a time. It will manage your queue one by one, multithreaded.
var myTask = function (/* args here */)
{
    // logic here
};

for (var i = 0; i < LIST_OF_THE_FILES.length; i++)
    jxcore.tasks.addTask(myTask, /* params here */, /* optional callback... */);

OR, in case the logic definition is out of the scope of a single method:

var myTask = function (/* args here */)
{
    require('mytasketc.js').handleTask(/* args here */);
};

for (var i = 0; i < LIST_OF_THE_FILES.length; i++)
    jxcore.tasks.addTask(myTask, /* params here */, /* optional callback... */);
Remarks
Every single thread has its own V8 memory limit.
The contexts of the threads are separate from each other
Make sure the task method closes the file at the end
Link: you can find more on multithreaded JavaScript tasks
You're getting this error because of recursion. Rewrite your code so it doesn't use recursion, especially because this piece of code really doesn't need it. Here is just an APPROXIMATE example, to show you a better way to do it:
var parseElems = function(arr, cb) {
    var result = [];
    arr.forEach(function (el) {
        // do some tests on the element (el)
        result.push(el);
    });
    cb(result);
};
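If the per-element tests ever become asynchronous, a plain forEach won't work. One common pattern in that case (a sketch based on the original function, not tested against the full code) is to keep the recursive shape but re-enter through setImmediate, so each iteration starts with a fresh call stack:

var parseOthersAsync = function(arr, cb, others) {
    others = others === undefined ? [] : others;
    if (arr.length === 0)
        return cb(others);
    var e = arr.shift();
    // do some tests on the element and add it
    others.push(e);
    // defer the next iteration so the stack unwinds between elements
    setImmediate(function() {
        parseOthersAsync(arr, cb, others);
    });
};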
I am designing a communication server in Node that handles incoming messages (sent by client1) and transfers them to someone else (client2), who answers the message and sends the answer back, via the server, to client1.
The communication happens via WebSockets, which implies an open connection from each client to the server.
Thus I implemented a ConnectionManager to which I can register any new connections when a new client comes online. Every connection gets assigned a messageQueue in which all incoming messages are cached before processing.
At the end of processing, I have a ServerTaskManager, which generates output tasks for the server, telling it which message to send and which receiver should get it.
This ServerTaskManager emits a Node-Event (inherits from EventEmitter) upon registering a new serverTask to which the server listens.
Now I would like my ConnectionManager to also listen to the event of the serverTaskManager, in order to make him push the next message in the messageQueue into processing.
Now the problem is that I can catch the ServerTaskManager event within the ConnectionManager just fine, but, of course, "this" within the listener is the ServerTaskManager, not the ConnectionManager. Thus, calling any "this.someFunction()" functions that belong to the ConnectionManager won't work.
Here is some code:
/**
 * ServerTaskManager - Constructor
 * Implements Singleton pattern.
 */
function ServerTaskManager()
{
    var __instance;
    ServerTaskManager = function ServerTaskManager()
    {
        return __instance;
    };
    ServerTaskManager.prototype = this;
    __instance = new ServerTaskManager();
    __instance.constructor = ServerTaskManager;
    return __instance;
}
util.inherits(ServerTaskManager, EventEmitter);
/**
 * ConnectionManager - Constructor
 * Also implements Singleton pattern.
 */
function ConnectionManager()
{
    var __instance;
    ConnectionManager = function ConnectionManager()
    {
        return __instance;
    };
    ConnectionManager.prototype = this;
    __instance = new ConnectionManager();
    __instance.constructor = ConnectionManager;
    __instance.currentConnections = [];
    // Listen for new serverInstructions on the serverTaskManager
    serverTaskManager.on('newInstruction', function(messageObject, currentReceiver)
    {
        this.processNextMessage(currentReceiver);
    });
    return __instance;
}
util.inherits(ConnectionManager, EventEmitter);
Now when I run this and the 'newInstruction' event is triggered by the serverTaskManager, Node throws:
TypeError: Object #<ServerTaskManager> has no method 'processNextMessage'
Which is of course true. The function I want to call belongs to the ConnectionManager:
/**
 * Starts processing the next message.
 *
 * @param connectionId (int) - The ID of the connection of which to process the next message.
 */
ConnectionManager.prototype.processNextMessage = function (connectionId)
{
    // Some code...
};
So obviously, when listening to the ServerTaskManager event, "this" within the listener is the ServerTaskManager. Now how do I call my ConnectionManager's function from within the listener?
I hope I am not completely misled by how events and listeners and/or prototypical extensions work (in Node). This project is by far the most advanced that I have worked on in JavaScript. Normally I am only coding PHP with a little bit of client side JS.
Thx in advance for any hints!
Worp
Like this, using the __instance variable that is in scope inside the ConnectionManager constructor (note that calling ConnectionManager.processNextMessage directly would fail, because the method lives on the prototype, not on the constructor):
serverTaskManager.on('newInstruction', function(messageObject, currentReceiver)
{
    __instance.processNextMessage(currentReceiver);
});
Or like this, going through the singleton constructor:
serverTaskManager.on('newInstruction', function(messageObject, currentReceiver)
{
    ConnectionManager().processNextMessage(currentReceiver);
});
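Another option, not in the original snippet, is to bind the listener explicitly so that "this" inside the callback is the ConnectionManager singleton:

// inside the ConnectionManager constructor, after __instance is set up
serverTaskManager.on('newInstruction', function(messageObject, currentReceiver)
{
    // thanks to bind(), `this` is now the ConnectionManager instance
    this.processNextMessage(currentReceiver);
}.bind(__instance));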
PS: your question is unnecessarily long. When posting code, don't just paste your actual code; it is much better to boil it down to the simplest form that exhibits the behavior you are seeing. You'll get more quality responses that way.
I am using kue for my job queue, and I'd like to know, without using the GUI, how many jobs are still left, how many have failed, etc. How can I retrieve this kind of information?
For example, a few minutes after starting the processing of the job queue, I'd like to update the status of all jobs that have failed so far to 'inactive', in order to restart them.
The only related question I could find on Stack Overflow was this one; however, it deals with one job at a time, after it fires a certain event as it is being processed. My concern is different: I am interested in retrieving all jobs in the database with a certain status.
The answer to that question mentions the function .complete of the kue library, which retrieves all the completed jobs in the database. Are there similar functions for the other possible job statuses?
I found a solution by browsing the kue source code. The following code achieves what I need:
var redis = require('redis'),
    kue = require('kue'),
    redisClient = redis.createClient(6379, "127.0.0.1");
kue.redis.createClient = function () {
    return redisClient;
};
kue.app.listen(3000);
kue.Job.rangeByType('job', 'failed', 0, 10, 'asc', function (err, selectedJobs) {
    selectedJobs.forEach(function (job) {
        job.state('inactive').save();
    });
});
For reference, here is the relevant kue source code:
/queue/job.js:123:
/**
 * Get jobs of `type` and `state`, with the range `from`..`to`
 * and invoke callback `fn(err, ids)`.
 *
 * @param {String} type
 * @param {String} state
 * @param {Number} from
 * @param {Number} to
 * @param {String} order
 * @param {Function} fn
 * @api public
 */
exports.rangeByType = function(type, state, from, to, order, fn){
    redis.client().zrange('q:jobs:' + type + ':' + state, from, to, get(fn, order));
};
The kue source code indicates that:
type is the job type
from and to give the job range by index (for example, you can load jobs from index 0 to 10, 11 jobs in total)
order specifies the order of the fetched jobs; the default is asc, and you can also sort by desc
The following works. It uses the pre-existing queue object, and hence avoids the double-Redis-connection issue mentioned by Japrescott in the comments on the accepted answer.
queue.cardByType("notifications", "complete", function (err, count) {
    console.log(count);
});
Feel free to replace "complete" with any valid state; the following is a list of valid states:
inactive
complete
active
failed
delayed
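For example, to log a count for every state with the same call (reusing the "notifications" type from above):

var states = ['inactive', 'complete', 'active', 'failed', 'delayed'];
states.forEach(function (state) {
    queue.cardByType('notifications', state, function (err, count) {
        if (err) return console.error(err);
        console.log(state + ': ' + count);
    });
});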