Capture node.js crash reason

I have a script written in Node.js that uses the 'net' module to communicate with a remote service over TCP. The script is started with 'node script.js >> log.txt', and everything the script logs via console.log() gets written to log.txt. But sometimes the script dies and I cannot find the reason: nothing gets logged in log.txt around the time of the crash.
How can I capture the crash reason?

Couldn't you listen for the uncaughtException event? Something along these lines:
process.on('uncaughtException', function (err) {
  console.log('Caught exception: ' + err);
});
P.S.: after that you are advised to restart the process, according to this article from Felix Geisendörfer.
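A minimal sketch of that pattern, assuming the script runs under a supervisor (systemd, forever, or similar) that restarts it on exit:
process.on('uncaughtException', function (err) {
  // Log the full stack trace, not just the message, then exit so
  // the supervisor can restart the process in a clean state.
  console.error('Uncaught exception: ' + (err.stack || err));
  process.exit(1);
});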

It's much easier to capture exceptions by splitting stdout and stderr. Like so:
node script.js 1> log.out 2> err.out
By default, console.log() writes to stdout, which I believe you are capturing with >>; crash stack traces go to stderr, which >> alone does not capture.
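If you prefer to keep the appending behavior of your original command, the equivalent is:
node script.js >> log.out 2>> err.out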
As noted in the comments, the "Segmentation fault" message is printed to stderr by the shell, not by your program, so redirecting the program's stderr will not necessarily capture it. See this unix.stackexchange answer for other options.

Related

Program to capture another program's stderr and stdout

I'd like to run a program as root that can intercept another program's stderr and stdout.
For example, say I start a Node.js server and at some point it hits an error (logged to stderr). If my program is running, I would like it to intercept that error.
Is that possible? How should I do it?
One idea that came to mind was to replace the nodejs binary with another one that starts the real nodejs and redirects stderr to a custom file, but I think that's too messy and I hope there are better ways to do it.
If you can control how nodejs is called, you can redirect stderr to a named pipe and then read the named pipe from another command, like this:
mkfifo /tmp/nodejs.stderr
nodejs 2>/tmp/nodejs.stderr
Then, in another shell, type:
grep "Error Pattern" </tmp/nodejs.stderr
If you can't control how nodejs is called, you can create a shell script that wraps those commands and call the shell script wherever nodejs is called, as in the sketch below.
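A minimal sketch of such a wrapper, assuming the real binary lives at /usr/bin/nodejs and the wrapper is found first on the PATH (both paths are illustrative):
#!/bin/sh
# Hypothetical wrapper: forward all arguments to the real nodejs
# binary and append its stderr to a log file.
exec /usr/bin/nodejs "$@" 2>>/var/log/nodejs.stderr.log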

Node, redirect console to a file

How can I redirect everything that is displayed in the console to a file?
I mean, for example, I call some function and that function displays something on the console (no matter whether via console.log or process.stdout.write).
Thanks for the help!
While not strictly a Node.js answer, you can achieve this at the shell level. For example, if you are using bash, you can redirect both the standard output and error streams to a file using the following:
#!/bin/bash
node app.js &> output.log
Also check out the tee command for simultaneous output to both a file and the screen.
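For example (the file name is illustrative):
node app.js 2>&1 | tee output.log
Here 2>&1 merges stderr into stdout, so tee receives both streams.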

Using node.js and coffeescript for executing Ubuntu commands

I am using child_process.exec to execute Ubuntu commands with Node.js in CoffeeScript. When I execute the following commands:
list = child_process.exec("ls")
print list
It prints this:
[object Object]
Why isn't the proper output of the ls command printed? What should I do to get the proper output of commands?
You're attempting to run an asynchronous function synchronously. The correct way to do this is:
var exec = require('child_process').exec;
exec('ls', function (error, stdout, stderr) {
  console.log(stdout);
});
Source: https://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback
If you really wish to execute a command synchronously, you can use execSync. However, I'd advise against that, since it blocks your node code from doing anything until the process finishes.
ExecSync: https://nodejs.org/api/child_process.html#child_process_child_process_execsync_command_options
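A minimal sketch of the synchronous variant (it blocks, so use it sparingly):
var execSync = require('child_process').execSync;
var output = execSync('ls'); // returns a Buffer unless an encoding option is given
console.log(output.toString());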
Found it!
It can be accessed using:
print list.main.<attribute_name>

End perl script without waiting for system call to return

I'm running a simple Apache web server on Linux (Ubuntu 14.04) with a Perl CGI script handling some requests. The script launches a command using the system function, but I want it to return immediately, regardless of the outcome of the system call.
I've been adding an ampersand to the end of the scalar argument passed to system (I am aware of the implications of command injection attacks) and although this does cause the system command to return immediately, the script will still not exit until the underlying command has completed.
If I trigger a dummy ruby script with a 10 second sleep using the system call from the perl CGI, then my request to the web server still waits 10 seconds before finally getting a response. I put a log statement after the system call and it appears immediately when the web request is made, so the system call is definitely returning immediately, but the script is still waiting at the end.
This question is similar, but neither of the solutions have worked for me.
Here's some example code:
#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use Log::Log4perl qw(:easy);
Log::Log4perl->easy_init(
    { level => $DEBUG, file => ">>/var/log/script.log" } );
print "Content-type: application/json\n\n";
my $cgi = CGI->new();
INFO("Executing command...");
system('sudo -u on-behalf-of-user /tmp/test.rb one two &');
INFO("Command initiated - will return now...");
print '{"error":false}';
Edit:
The command call is executed using sudo -u because the Apache user www-data needs permission to execute the script on behalf of the script owner, and I've updated my sudoers file appropriately to that end. This is not the cause of my issue, because I've also tried changing the script's ownership to www-data and running system("/tmp/test.rb one two &"), but the result is the same.
Edit 2:
I've also tried adding exit 0 to the very end of the script, but it doesn't make any difference. Is it possible that the script is exiting immediately, but the Apache server is holding onto the response until the command the Perl CGI launched has finished? Or could some setting or configuration of the operating system be causing the problem?
Edit 3:
Running the Perl CGI script directly from a terminal works correctly: the Perl script ends immediately, so this is not an inherent issue with Perl. Presumably that can only mean that the Apache web server is holding onto the request until the command called from system has finished. Why?
The web server creates a pipe from which to receive the response. It waits for the pipe to reach EOF before completing the request. A pipe reaches EOF when all copies of the writer handle are closed.
The writer end of the pipe is set as the child's STDOUT. That file handle was copied to become the shell's STDOUT, and again to become mycmd's STDOUT. So even though the CGI script and the shell have ended and thus closed their copies of the handle, mycmd still holds it open, so the web server is still waiting for the response to complete.
All you have to do is close the last handle to the writer end of the pipe. Or, more precisely, you can avoid creating it in the first place by attaching a different handle to mycmd's STDOUT:
mycmd arg1 arg2 </dev/null >/dev/null 2>&1 &
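Applied to the script in the question, that would look something like this, redirecting the child's streams so the web server's pipe is not held open:
system('sudo -u on-behalf-of-user /tmp/test.rb one two </dev/null >/dev/null 2>&1 &');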

How to resolve the Tcl script error "error writing "stdout": bad file number" when the script is run from crontab?

I have a Tcl script with a function that writes an error log, but I get the error below when I put the script in crontab:
error writing "stdout": bad file number
while executing
"puts $msg"
The relevant piece of code is:
if { $logLevel >= 0 } {
    puts $msg
    flush stdout
}
The script runs successfully when invoked manually; it only errors when run from crontab.
Thanks,
Emre
When you run a program from cron, it runs in an unusual environment. In particular, there is no terminal, the environment variables are different, neither stdin nor stdout is normally available, and stderr is redirected so that it gets emailed to you if anything fails. As we can see from the error message in your case, stdout is not open (technically, it only says it's not open for writing, but even so); puts defaults to writing there if not told otherwise.
The basic fix? Don't write to stdout! Open a file somewhere else and write to that. Alternatively, define a redirection of stdout in your crontab entry so that it goes somewhere definite (and is thus available for writing to from inside your Tcl program).
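For example, a crontab entry along these lines (the schedule and paths are illustrative):
0 * * * * /usr/bin/tclsh /home/emre/script.tcl >>/var/log/script.log 2>&1
With stdout redirected to a writable file, the puts call inside the script has a valid channel again.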
