'job complete' event isn't firing in Kue - node.js

I can't figure out what I'm doing wrong; perhaps someone can point it out. I'm trying to figure out why my 'job complete' event isn't firing.
var kue = require('kue'),
    jobs = kue.createQueue();
var util = require('util');

var job = jobs.create('test', util.puts('123')).on('complete', function(){
    console.log("Job complete");
}).on('failed', function(){
    console.log("Job failed");
}).on('progress', function(progress){
    process.stdout.write('\r job #' + job.id + ' ' + progress + '% complete');
});
Now when I run this in node it prints 123, but it never says "Job complete".

This question is old, but there is still no solution on the internet for those who encounter this problem even after adding save()... I've worked out three steps for myself to solve the problem:
1. Make sure that you call the save() method on your jobs AFTER you set the event handlers on them (a complete minimal sketch of this flow is at the end of this answer).
var job = queue.create('some process', some_args);

job.on('complete', function(result) {
    console.log('complete');
}).on('failed', function(result) {
    console.log('failed');
}).removeOnComplete(true).save();
P.S. It's also good practice to remove jobs on completion; otherwise you'll fill up Redis memory.
2. Make sure your handlers are correct.
I experimented with event handlers myself, trying to pass them several arguments. My 'failed' event handler accepted both the error code and other data I passed through the done(err, data) callback. That was wrong. Check the documentation and the official Kue examples to make sure your code isn't the problem.
3. If nothing helps, execute redis-cli flushall in your terminal.
And beware: this will delete everything in your Redis instance. I'm a Redis novice myself, so on my system it is used only as a dependency for Kue. I can't say for certain, but I suspect this could destroy any other data you keep in Redis. Still, it sometimes fixes the problem when nothing else helps.
Everyone, please feel free to suggest other safe ways of fixing Kue with Redis.
P.S. I haven't checked, but I suspect that changing the job type name for your jobs (it's 'some process' in my example) can also work around the problem.
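To tie the steps together, here is a minimal sketch of the whole flow, using the same 'test' job type as the question (the timeout is just a stand-in for real work). Note that the question's snippet never calls save() and never registers a worker with queue.process(), so there is nothing to complete: the 'complete' event only fires once a worker has processed the job and called done().
var kue = require('kue');
var queue = kue.createQueue();

// Worker side: without a process() handler the job is never run,
// so there is nothing to complete and 'complete' never fires.
queue.process('test', function(job, done) {
    console.log('processing job #' + job.id + ' with data', job.data);
    setTimeout(function() { done(); }, 1000); // stand-in for real work
});

// Producer side: attach the handlers first, then call save().
var job = queue.create('test', { value: '123' });
job.on('complete', function(result) {
    console.log('Job complete');
}).on('failed', function() {
    console.log('Job failed');
}).save();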

I think you need to run job.save() after the .create() call.

As #James mentions, you must call .save() after the event handlers have been set.
See the example.

Related

Forever Node.js Script Hangs Up on Loop

I have made a Node.js script which checks for new entries in a MySQL database and uses socket.io to send data to the client's web browser. The script is meant to check for new entries approximately every 2 seconds. I am using Forever to keep the script running as this is hosted on a VPS.
I believe what's happening is that the for loop is looping infinitely (more on why I think that's the issue below). There are no error messages in the Forever-generated log file, and the script is "running" even after it has started to hang. Specifically, the script stops accepting browser requests on port 8888 and doesn't serve the client-side socket.io JS files. I've done some troubleshooting and identified a few key components that may be causing this issue, but at the end of the day I'm not sure why it's happening and can't seem to find a workaround.
Here is the relevant part of the code:
http.listen(8888, function(){
    console.log("Listening on 8888");
});

function checkEntry() {
    pool.getConnection(function(err, connection) {
        connection.query("SELECT * FROM `data_alert` WHERE processtime > " + (Math.floor(new Date() / 1000) - 172800) + " AND pushed IS NULL", function (err, rows) {
            connection.release();
            if (!err) {
                if (Object.keys(rows).length > 0) {
                    var x;
                    for (x = 0; x < Object.keys(rows).length; x++) {
                        connection.query("UPDATE `data_alert` SET pushed = 1 WHERE id = " + rows[x]['id'], function() {
                            connection.release();
                            io.emit('refresh feed', 'refresh');
                        });
                    }
                }
            }
        });
    });
    setTimeout(function() { checkEntry(); var d = new Date(); console.log(d.getTime()); }, 1000);
}
checkEntry();
Just a few interesting things I've discovered while troubleshooting...
This only happens when I run the script under Forever. It works completely fine if I use the shell and just leave my terminal open.
It starts to happen after 5-30 minutes of running the script; it does not immediately hang up on the first execution of the checkEntry function.
I originally tried this with setInterval instead of setTimeout; the issue remained exactly the same.
If I remove the setInterval/setTimeout call and run the checkEntry function only once, it does not hang up.
If I take out the JavaScript for loop in the checkEntry function, the hang-ups stop (but obviously that for loop controls necessary functionality, so I at least have to find another way of doing it).
I've also tried using a for-in loop over the rows object and the behaviour is exactly the same.
Any ideas would be immensely helpful at this point. I started working with Node.js just recently so there may be a glaringly obvious reason that I'm missing here.
Thank you.
So I just wanted to come back to this and explain what the issue was. It took me quite some time to figure out, and it can only be explained by my own inexperience. There is a section of my script where the code contained the following:
app.get("/", (request, response) => {
    // Some code to log things to the console here.
});
The issue was that I was not sending a response. The new code looks as follows and has resolved my hang-up issues:
app.get("/", (request, response) => {
    // Some code to log things to the console here.
    response.send("OK");
});
The issue had nothing to do with the part of the code I presented in the initial question.

Nightmare doesn't run twice in a row - NodeJS

EDIT
I have noticed that removing the .end() call appears to solve the issue, but the Nightmare docs on .end() say: "Completes any queue operations, disconnect and close the electron process."
Now, while this does solve the problem, am I just opening more and more Electron processes each time the route is called, which will eventually cause the server to run out of memory, or is this a safe way to fix the issue?
ORIGINAL TEXT
Please consider the following problem:
I am developing a Node-based service that will allow the user to request a screenshot of a particular URL.
For this I am using Nightmare to visit the URL, wait 2 seconds, take a screenshot (which is saved to disk), convert it to base64, delete the image, and then return the base64 string.
console.log('Nightmare starts');
nightmare
    .goto(url)
    .wait(2000)
    .screenshot(filename)
    .end()
    .then(function (result)
    {
        fs.exists(filename, function(exists)
        {
            if (exists)
            {
                data = fs.readFileSync(filename);
                var base64 = data.toString('base64');
                fs.unlink(filename);
                var output = {'message':'success','map_image':base64};
                res.send(output);
            }
        });
    })
    .catch(function (error)
    {
        console.error('Search failed:', error);
    });
console.log("Nightmare Finished");
The above code works just fine the first time it runs. However, any subsequent calls just log "Nightmare starts" and "Nightmare Finished" instantly, with the actual code in between not running. No errors appear to be displayed, and nothing is caught if I wrap it in a try/catch. Node requires a restart before it will work again.
Something worth noting is that I am running on a headless Ubuntu machine; as Electron (one of the Nightmare dependencies) appears to need a GUI, I am launching node under xvfb using the following command:
xvfb-run --auto-servernum --server-num=1 node server.js
I'm assuming this may be an issue with some resource not being released correctly on the first run, but any assistance would be appreciated.
I'm also open to any constructive criticism of my code; I'm very new to Node and I'm sure I'm not writing it in the most optimal way (sync file loading, etc.).
It appears that you are simply misplacing where you create the Nightmare instances. I can't help much more without some more code snippets and information.
Way 1
Create a Nightmare instance every time and close it after you are done with your task. It takes some time to boot up the instance, but it also keeps the memory load down. Not to mention you can have separate Nightmare instances for different users.
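A rough sketch of Way 1, assuming the same goto/wait/screenshot flow as the question (URLs and filenames below are just placeholders):
var Nightmare = require('nightmare');

// One capture = one instance = one Electron process, cleaned up by .end().
function capture(url, filename) {
    var nightmare = Nightmare({ show: false });
    return nightmare
        .goto(url)
        .wait(2000)
        .screenshot(filename)
        .end()                 // shuts this instance down once the queued steps finish
        .then(function () {
            return filename;
        });
}

// Usage: call it per request; each call pays the Electron boot-up cost.
capture('https://example.com', '/tmp/example.png')
    .then(function (file) { console.log('saved', file); })
    .catch(function (err) { console.error('Screenshot failed:', err); });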
Way 2
Don't call end() and instead re-use the same Nightmare instance. Keep several Nightmare instances and queue the screenshot calls. Pages will load quickly and you won't pay the boot-up cost each time, but a longer queue means a longer wait.
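And a rough sketch of Way 2, under the assumption that skipping .end() keeps the single Electron process alive for reuse; a promise chain serialises the screenshot jobs on that one instance (again, URLs and filenames are placeholders):
var Nightmare = require('nightmare');

var nightmare = Nightmare({ show: false }); // one long-lived instance
var queue = Promise.resolve();              // serialises jobs on that instance

function capture(url, filename) {
    queue = queue
        .then(function () {
            return nightmare
                .goto(url)
                .wait(2000)
                .screenshot(filename);      // note: no .end(), the instance stays alive
        })
        .then(function () { return filename; })
        .catch(function (err) {
            console.error('Screenshot failed:', err);
        });
    return queue;
}

capture('https://example.com', '/tmp/a.png');
capture('https://example.org', '/tmp/b.png'); // waits for the previous capture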

How does async work in Express?

I found the following in the ExpressJS guide:
var mysql = require('mysql');
var connection = mysql.createConnection({
    host     : 'localhost',
    user     : 'dbuser',
    password : 's3kreee7'
});

connection.connect();

connection.query('SELECT 1 + 1 AS solution', function(err, rows, fields) {
    if (err) throw err;
    console.log('The solution is: ', rows[0].solution);
});

connection.end();
Isn't this supposed to be bad practice? The way I see it, it is possible for the connection to end before the query can be executed. Wouldn't that give an error?
As stated here:
Every method you invoke on a connection is queued and executed in sequence.
Closing the connection is done using end(), which makes sure all remaining queries are executed before sending a quit packet to the MySQL server.
So even though the call to end() can be made before the query has completed, it won't actually run until the query has finished executing.
This has more to do with the mysql package than with Node.js itself.
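If you would rather not depend on that implicit queuing, a small sketch of the more explicit alternative is to close the connection from inside the last query's callback, so the ordering is visible in the code itself:
connection.connect();

connection.query('SELECT 1 + 1 AS solution', function(err, rows, fields) {
    if (err) throw err;
    console.log('The solution is: ', rows[0].solution);

    // Close only after the last query has definitely finished.
    connection.end();
});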
Your questions "How does async work in Express?" and "Isn't this supposed to be bad practice?" can be answered in many ways, but for clarity I would like to point out that it depends!
It generally is very bad practice, assuming you don't know the actual implementation.
If the implementation is really simple, where it does exactly what you ask -- i.e. it closes or ends the connection the moment end() is executed -- then it could lead to rather ugly race conditions where it may or may not work depending on the load of the machines.
However, a clever implementation that does reference counting -- where end() does not actually close the connection but just sets a flag meaning "close once the last callback has finished" -- may well work.
If the mysql connector is implemented with reference counting then this may work fine -- but that is not the same as saying it is good practice for everything you find as a plugin.
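To make the reference-counting idea concrete, here is a minimal sketch of such a wrapper. This is purely illustrative and not how the mysql package is actually written; end() here just defers the real close until the last pending callback has run:
function CountingConnection(rawConnection) {
    this.raw = rawConnection;
    this.pending = 0;            // queries whose callbacks haven't run yet
    this.closeRequested = false;
}

CountingConnection.prototype.query = function(sql, cb) {
    var self = this;
    self.pending++;
    self.raw.query(sql, function(err, rows, fields) {
        cb(err, rows, fields);
        self.pending--;
        // If end() was requested earlier, close only when nothing is left.
        if (self.closeRequested && self.pending === 0) {
            self.raw.end();
        }
    });
};

CountingConnection.prototype.end = function() {
    if (this.pending === 0) {
        this.raw.end();
    } else {
        this.closeRequested = true;   // defer until the last callback finishes
    }
};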

Issue with Zombie wait function

I am currently trying to get the following code working with Zombie.js, yet I am unable to:
var Browser = require('zombie');
browser = new Browser();
browser.wait(3000, function() { console.log("ok"); });
So, the script should wait 3 seconds before displaying "ok". Yet, it displays it immediately.
Am I misunderstanding something?
Thanks for your help!
As the documentation states: "Waits for the browser to complete loading resources and processing JavaScript events."
Since you're not requesting anything, there's nothing to wait for, so Zombie calls the callback immediately. It's more of a maximum timeout kind of thing, not a guaranteed wait.
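In other words, wait() is meant to be used after you've asked the browser to load or do something; for a plain delay, an ordinary timer is the right tool. A small sketch of both (the URL is just a placeholder):
var Browser = require('zombie');
var browser = new Browser();

// wait() after a visit: the callback fires once outstanding resources and
// JavaScript events have settled, or when the 3-second cap is reached.
browser.visit('http://example.com/', function() {
    browser.wait(3000, function() {
        console.log('page settled:', browser.location.href);
    });
});

// If all you actually want is a fixed 3-second pause, use setTimeout.
setTimeout(function() { console.log('ok'); }, 3000);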

Getting a node.js process to die?

I'm trying to find the most elegant way for my node.js app to die when something goes wrong. In my particular case, I have a config file with certain required parameters that have to be present before the server can start and be properly configured.
One way I have found to do this is:
var die = function(msg){
    console.log(msg);
    process.exit(1);
}

die('Test end');
Is there a better way to handle this kind of situation?
Better to use console.error if you are calling process.exit immediately afterwards.
console.log is non-blocking and puts your message into a write queue, where it may never be processed because of the exit().
Update: console.log also blocks in recent versions (at least since 0.8.x).
If you want to exit abruptly then this will do just fine. If you want to do any clean-up, do that first; after that Node.js will probably stop on its own anyway, because nothing is keeping the event loop running.
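If the goal is to report the failure but still let any pending output flush, a gentler sketch is to set process.exitCode instead of calling process.exit() directly; the process then ends on its own once the event loop is empty (the config check below is just a hypothetical placeholder):
function die(msg) {
    console.error(msg);       // goes to stderr, which is where error reporting belongs
    process.exitCode = 1;     // recorded now, applied once nothing is left to do
}

var config = { port: undefined };   // hypothetical config object
if (!config.port) {                 // a required setting is missing
    die('Missing required config value: port');
}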
