Proc::Daemon with mod_perl does not write STDOUT or STDERR - linux

I am using Proc::Daemon in a mod_perl script as follows:
$bindir and $ddir are the executable and logfile locations, and $jid is a unique per-process identifier (to avoid having the same file open by multiple processes). $cmd holds an arbitrary Perl script and its arguments.
my $daemon = Proc::Daemon->new(
    work_dir     => $bindir,
    child_STDOUT => $ddir.'/'.$jid.'_stdout.log',
    child_STDERR => $ddir.'/'.$jid.'_stderr.log',
    pid_file     => $ddir.'/'.$jid.'_pid.txt',
    exec_command => $cmd,
);
my $pid = $daemon->Init();
The above works fine when using Apache with cgi-script (no mod_perl): in the $cmd process, print and print STDERR write to the log files defined above.
When I run the above under mod_perl2 (Apache2 on Ubuntu Linux 14.04 LTS), the pid file gets written with the PID and the log files are created, but nothing is ever written to the log files. I am able to open new file descriptors within $cmd and write to those, but under mod_perl no output reaches the child_STDOUT and child_STDERR files.
I think I am missing something really obvious. Has anyone else seen this before, and/or does anyone have suggestions for getting this to work under mod_perl?
Additional info
Using the mpm_prefork module in Apache
Relevant Apache Config
<Files "*.pl">
# SetHandler cgi-script # It works if I use cgi-script
SetHandler perl-script
PerlOptions -SetupEnv # Tried with and without this
# PerlHandler ModPerl::RegistryPrefork # Using this made no difference
PerlHandler ModPerl::Registry
</Files>

OK, so there were many solutions that did not work. This is what I wound up doing: I created a script called launchjob.pl:
#!/usr/bin/perl -w
use strict;
use warnings;
use POSIX 'setsid';
use Proc::Daemon;
my $bindir = $ARGV[0];
my $ddir = $ARGV[1];
my $jid = $ARGV[2];
my $cmd = $ARGV[3];
setsid or die "Cannot start a new session: $!";
my $daemon = Proc::Daemon->new(
    work_dir     => $bindir,
    child_STDOUT => $ddir.'/'.$jid.'_stdout.log',
    child_STDERR => $ddir.'/'.$jid.'_stderr.log',
    pid_file     => $ddir.'/'.$jid.'_pid.txt',
    exec_command => $cmd,
);
my $pid = $daemon->Init();
exit 1;
In the mainline, I replaced the code that called Proc::Daemon directly, as follows.
This first attempt did not work; it gave me an sh: No such file or directory error:
system('launchjob.pl', $bindir, $ddir, $jid, $cmd);
Instead, I used the following, which seems to run as expected:
use IPC::Run3;
run3("launchjob.pl ".$bindir." ".$ddir." ".$jid." ".$cmd);
This seems to have fixed it right up.
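As a side note, run3 also accepts an array reference for the command, which bypasses the shell entirely so none of the arguments get word-split. A minimal sketch under the same variable names (it assumes launchjob.pl is reachable by that relative path):
use IPC::Run3;
# Array-ref form: no shell involved, so $cmd survives as one argument.
run3([ './launchjob.pl', $bindir, $ddir, $jid, $cmd ]);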

Related

Perl - Script working in Padre, Windows. But not in Linux Ubuntu

I've tried to transfer the script I've been working on from Windows, but it fails at the first hurdle.
use strict;
use warnings;
my $keywordFile = 'keyword.txt';
open(keyWords, $keywordFile) or die "$keywordFile not found\n";
my @keywordArray;
while ( my $line = <keyWords> ) {
    chomp $line;
    push @keywordArray, $line;
}
close(keyWords);
It keeps on dying, even though in the same destination there is a file called 'keyword.txt'. Is the issue coming from Ubuntu, or is my Perl wrong?
It keeps on dying, even though in the same destination there is a file called 'keyword.txt'
Relative paths are resolved against the working directory (i.e. the directory the script is run from), not the directory in which the script is located.
You should either
Run your script from the directory that contains all needed files
Use the FindBin module to get the script's location and use that path to refer to file names, e.g.:
use FindBin;
my $keywordFile = "$FindBin::Bin/keyword.txt";
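Putting it together with the loop from the question (a sketch; it assumes keyword.txt sits next to the script, and switches to a lexical filehandle and three-arg open):
use strict;
use warnings;
use FindBin;

# Build a path relative to the script's own directory, not the cwd.
my $keywordFile = "$FindBin::Bin/keyword.txt";

open my $fh, '<', $keywordFile or die "$keywordFile not found: $!\n";
my @keywordArray;
while ( my $line = <$fh> ) {
    chomp $line;
    push @keywordArray, $line;
}
close $fh;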

Installation script in Perl not functioning correctly

I have a program that gets installed using the following Perl script. The installation does not work: I get the message "No installers found." Obviously nothing was done, as the script simply dies.
Here is the Perl install script (it is for installing a program called Simics):
#!/usr/bin/perl
use strict;
use warnings;
# Find the most recent installer in the current working directory.
my $installer;
my $highest_build = 0;
opendir my $d, "." or die $!;
foreach (readdir $d) {
    if (-f && -x && /^build-(\d+)-installer/) {
        if ($1 > $highest_build) {
            $highest_build = $1;
            $installer = $_;
        }
    }
}
closedir $d;
die "No installers found.\n" unless defined $installer;
exec "./$installer", @ARGV;
Stepping through your code above, this line:
foreach (readdir $d) {
reads the name of each of the files in the directory you opened as handle $d, and assigns each of those names in turn to the "thing" variable ($_). (This variable is one of those weird but brilliant Perl idiosyncrasies: you don't have to mention $_ in most cases; it's just there.)
Then in the next line:
if (-f && -x && /^build-(\d+)-installer/) {
The "-f" and the "-x" are file test operators. Since neither one has an explicit argument (e.g., -f "myfile.txt") they will use the implied thing variable, $_. The -f operator just checks to see if something is a file and the -x checks to see if the file is executable, (as indicated by the executable bit being set.) The third part, /^build-(\d+)-installer/, checks to see if it matches that pattern.
As you mentioned in your comment above, the directory listing shows
-rw------- 1 nikk nikk 52238 Feb 27 20:50 build-4607-installer.pl
The rw------- shows the file permissions for three classes of user: the owner ("nikk"), the group that owns the file (the second "nikk"), and everyone else. The first three characters, rw-, show that nikk can read and write the file but not execute it; the listing would show rwx if nikk could execute it. The next two groups of three characters, --- and ---, show that neither the group nikk nor anyone else on the machine can read, write, or execute it.
More information on Unix file system permissions
The lack of execute permission is causing the "-x" test to fail. There are two ways of fixing this. Either remove the -x from the if test so that it looks like this:
if (-f && /^build-(\d+)-installer/) {
Or add execute permission to the file. To do that just for the owner (assuming your program is running as user nikk or as root), do this:
chmod u+x build-4607-installer.pl
More information on chmod.
I hope that's helpful!

FastCGI with perl - On shared Linux webhost

I am trying to build an online "live chat" service, and for many reasons I found FastCGI to be suitable for that (as per its documentation), but I cannot seem to get it running.
I am using shared hosting with Apache 2.2 with mod_fcgid installed.
My .htaccess file has the following line added:
AddHandler fcgid-script .fcgi
My perl test script named fcgitest.fcgi is as follows:
#!/usr/bin/perl
# fcgitest.fcgi
use diagnostics;
use warnings;
use strict;
use CGI;
use CGI::Carp 'fatalsToBrowser'; # tester only!
use FCGI;
my %env;
my $in  = new IO::Handle;
my $out = new IO::Handle;
my $err = new IO::Handle;
my $request = FCGI::Request($in, $out, $err, \%env);
if ($request->IsFastCGI() == 0) {
    print "Content-Type: text/plain\n\n";
    binmode STDOUT;
    print "ERR";
    exit 0;
}
my $tm = time();
while ($request->Accept() >= 0) {
    my $env = $request->GetEnvironment();
    print "Content-Type: text/plain\n\n";
    binmode STDOUT;
    print time() . " " . $env;
    if (time() > ($tm + 60)) {
        $request->Finish();
        exit 0;
    }
}
print "Content-Type: text/plain\n\n";
binmode STDOUT;
print "---";
exit 0;
When I call this script from within one of my pages, I get an Internal Server Error (code 500), with NO explanation and NO entry in the server's error log.
I tried stripping out all the code, leaving only the print statement; the problem remains the same.
I tried moving the file into the fcgi-bin directory, but the problem remains.
I have checked that the Perl module is properly installed.
I have no idea what could cause this error, as my hosting provider says the server is properly configured for FastCGI...
What shared hosting are you using? Most shared hosts have FastCGI already installed, and you don't need to test the FCGI module on your own.
For example, on GoDaddy shared hosting, .fcgi/.fpl and even my .pl files would run over FastCGI instead of normal CGI. No extra effort.
Try different file permissions (644, 700, 750, 755) for the script you are running.
Also, try adding the line:
print "Content-type:text/html\n\n";
after the use CGI::Carp line.
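If the raw FCGI loop keeps failing, a minimal CGI::Fast script can serve as a known-good baseline (a sketch; it assumes CGI::Fast is installed on the host):
#!/usr/bin/perl
use strict;
use warnings;
use CGI::Fast;

# Each pass through the loop handles one FastCGI request.
while (my $q = CGI::Fast->new) {
    print $q->header('text/plain');
    print "FastCGI is working: ", scalar localtime, "\n";
}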

GetAttributes uses wrong working directory in subthread

I used File::Find to traverse a directory tree and Win32::File's GetAttributes function to look at the attributes of files found in it. This worked in a single-threaded program.
Then I moved the directory traversal into a separate thread, and it stopped working. GetAttributes failed on every file with "The system cannot find the file specified" as the error message in $^E.
I traced the problem to the fact that File::Find uses chdir, and apparently GetAttributes doesn't use the current directory. I could work around this by passing it an absolute path, but then I could run into path length limits, and long paths are definitely going to be present where this script will run, so I really need to take advantage of chdir and relative paths.
To demonstrate the problem, here is a script which creates a file in the current directory, another file in a subdirectory, chdir's to the subdirectory, and looks for the file 3 ways: system("dir"), open, and GetAttributes.
When the script is run without arguments, dir shows the subdirectory, open finds the file in the subdirectory, and GetAttributes returns its attributes successfully. When run with --thread, all the tests are done in a subthread, and the dir and open still work, but the GetAttributes fails. Then it calls GetAttributes on the file that is in the original directory (which we have chdir'ed out of) and it finds that one! Somehow GetAttributes is using the original working directory of the process - or maybe the working directory of the main thread - unlike all the other file operations.
How can I fix this? I can guarantee that the main thread won't do any chdir'ing, if that matters.
use strict;
use warnings;
use threads;
use Data::Dumper;
use Win32::File qw/GetAttributes/;
sub doit
{
    chdir("testdir") or die "chdir: $!\n";
    system "dir";
    my $attribs;
    open F, '<', "file.txt" or die "open: $!\n";
    print "open succeeded. File contents:\n-------\n", <F>, "\n--------\n";
    close F;
    my $x = GetAttributes("file.txt", $attribs);
    print Dumper [$x, $attribs, $!, $^E];
    if(!$x) {
        # If we didn't find the file we were supposed to find, how about
        # the bad one?
        $x = GetAttributes("badfile.txt", $attribs);
        if($x) {
            print "GetAttributes found the bad file!\n";
            if(open F, '<', "badfile.txt") {
                print "opened the bad file\n";
                close F;
            } else {
                print "But open didn't open it. Error: $! ($^E)\n";
            }
        }
    }
}
# Setup
-d "testdir" or mkdir "testdir" or die "mkdir testdir: $!\n";
if(!-f "badfile.txt") {
open F, '>', "badfile.txt" or die "create badfile.txt: $!\n";
print F "bad\n";
close F;
}
if(!-f "testdir/file.txt") {
open F, '>', "testdir/file.txt" or die "create testdir/file.txt: $!\n";
print F "hello\n";
close F;
}
# Option 1: do it in the main thread - works fine
if(!(#ARGV && $ARGV[0] eq '--thread')) {
doit();
}
# Option 2: do it in a secondary thread - GetAttributes fails
if(#ARGV && $ARGV[0] eq '--thread') {
my $thr = threads->create(\&doit);
$thr->join();
}
Eventually, I figured out that perl is maintaining some kind of secondary cwd that only applies to perl built-in operators, while GetAttributes is using the native cwd. I don't know why it does this or why it only happens in the secondary thread; my best guess is that perl is trying to emulate the unix rule of one cwd per process, and failing because the Win32::* modules don't play along.
Whatever the reason, it's possible to work around it by forcing the native cwd to be the same as perl's cwd whenever you're about to do a Win32::* operation, like this:
use Cwd;
use Win32::FindFile qw/SetCurrentDirectory/;
...
SetCurrentDirectory(getcwd());
Arguably File::Find should do this when running on Win32.
Of course this only makes the "pathname too long" problem worse, because now every directory you visit becomes the target of an absolute-path SetCurrentDirectory. If you try to work around that with a series of smaller relative SetCurrentDirectory calls, you have to figure out a way to get back where you came from, which is hard when you don't even have fchdir.
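One way to package the workaround is a small wrapper that re-syncs the native cwd immediately before each Win32 call (a sketch; get_attributes_synced is a hypothetical helper name, not part of any module):
use strict;
use warnings;
use Cwd;
use Win32::File qw/GetAttributes/;
use Win32::FindFile qw/SetCurrentDirectory/;

# Hypothetical helper: align the native working directory with perl's
# view of it, then fetch the attributes relative to that directory.
sub get_attributes_synced {
    my ($file) = @_;
    SetCurrentDirectory(getcwd());
    my $attribs;
    GetAttributes($file, $attribs) or return;
    return $attribs;
}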

How can I change the current directory in a thread-safe manner in Perl?

I'm using Thread::Pool::Simple to create a few working threads. Each working thread does some stuff, including a call to chdir followed by an execution of an external Perl script (from the jbrowse genome browser, if it matters). I use capturex to call the external script and die on its failure.
I discovered that when I use more than one thread, things start to get messy. After some research, it seems that the current directory of some threads is not the correct one.
Perhaps chdir propagates between threads (i.e. isn't thread-safe)?
Or perhaps it's something with capturex?
So, how can I safely set the working directory for each thread?
** UPDATE **
Following the suggestions to change directory while executing, how exactly should I pass these two commands to capturex?
currently I have:
my @args = (
    "bin/flatfile-to-json.pl",
    "--gff=$gff_file",
    "--tracklabel=$track_label",
    "--key=$key",
    @optional_args,
);
capturex( [0], @args );
How do I add another command to @args?
Will capturex still die on errors from any of the commands?
I think that you can solve your "how do I chdir in the child before running the command" problem pretty easily by abandoning IPC::System::Simple as not the right tool for the job.
Instead of doing
my $output = capturex($cmd, @args);
do something like:
use autodie qw(open close);

my $pid = open my $fh, '-|';
unless ($pid) {    # this is the child
    chdir($wherever);
    exec($cmd, @args) or exit 255;
}
my $output = do { local $/; <$fh> };

# If the child exited with an error or couldn't be run, the exception
# will be raised here (via autodie; feel free to replace it with your
# own handling).
close($fh);
If you were getting a list of lines instead of scalar output from capturex, the only thing that needs to change is the second-to-last line (to my @output = <$fh>;).
More info on forking-open is in perldoc perlipc.
The good thing about this in preference to capture("chdir wherever ; $cmd @args") is that it doesn't give the shell a chance to do bad things to your @args.
Updated code (doesn't capture output)
my $pid = fork;
die "Couldn't fork: $!" unless defined $pid;
unless ($pid) {    # this is the child
    chdir($wherever);
    open STDOUT, ">/dev/null";    # optional: silence subprocess output
    open STDERR, ">/dev/null";    # even more optional
    exec($cmd, @args) or exit 255;
}
wait;
die "Child error $?" if $?;
I don't think "current working directory" is a per-thread property. I'd expect it to be a property of the process.
It's not clear exactly why you need to use chdir at all though. Can you not launch the external script setting the new process's working directory appropriately instead? That sounds like a more feasible approach.
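If you would rather keep a capture-style call than hand-roll the fork, IPC::Run (a heavier cousin of IPC::Run3) documents an init option that runs a code block in the child after the fork and before the exec, which is a natural home for the chdir. A sketch, assuming that option behaves as documented and reusing $cmd, @args and $wherever from the discussion above:
use IPC::Run qw(run);

my $output;
# init runs only in the child, so the chdir cannot disturb other
# threads of the parent process.
run [ $cmd, @args ], '>', \$output,
    init => sub { chdir $wherever or die "chdir: $!" }
    or die "command failed: $?";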