OpenBSD httpd: write a file out from within a CGI script

I want to write a file out from within a Perl CGI script. This apparently is not possible in the standard configuration of httpd. Normal output works perfectly. What do I need to do?
#!/usr/bin/perl
print("Content-Type: text/html; charset=ascii\n\n");
print("hello world"); # works
# no error but not found in file system after the script finished
open(my $fh, ">", "out") or die $!;
print($fh "foo");
close($fh);
OpenBSD 6.1
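For context on why the file may seem to vanish: OpenBSD's httpd runs CGI programs through slowcgi(8) inside a chroot, /var/www by default, so a relative path like "out" is resolved inside that chroot, usually in a directory the www user cannot write to. A minimal sketch under those assumptions (the chroot-internal /tmp directory is an example; it must exist and be writable by the www user):

```perl
#!/usr/bin/perl
use strict;
use warnings;

print "Content-Type: text/html; charset=ascii\n\n";
print "hello world";

# Assumption: under the default OpenBSD httpd/slowcgi chroot, "/tmp"
# here means the chroot's tmp (i.e. /var/www/tmp on the real
# filesystem), which must exist and be writable by the www user.
open(my $fh, ">", "/tmp/out") or die "open: $!";
print $fh "foo";
close($fh) or die "close: $!";
```

Checking the return value of close as well as open helps surface write failures that only appear when the buffers are flushed.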

Related

Proc::Daemon with mod_perl does not write STDOUT or STDERR

I am using Proc::Daemon in a mod_perl script like this:
$bindir, $ddir are executable/logfile locations and $jid is a unique per process identifier (to avoid having the same file open by multiple processes). $cmd is loaded with arbitrary perl scripts and arguments.
my $daemon = Proc::Daemon->new (
work_dir => $bindir,
child_STDOUT => $ddir.'/'.$jid.'_stdout.log',
child_STDERR => $ddir.'/'.$jid.'_stderr.log',
pid_file => $ddir.'/'.$jid.'_pid.txt',
exec_command => $cmd,
);
my $pid = $daemon->Init();
The above works fine when using Apache with cgi-script (no mod_perl). In the "$cmd" process, print and print STDERR log to the log files defined above.
When I run the above using mod_perl2 on Ubuntu Linux 14.04 LTS with Apache2, the pid file gets written with the PID and the log files are created, but nothing is ever written to them. I am able to open new file descriptors within $cmd and write to them, but under mod_perl nothing is sent to the child_STDOUT and child_STDERR files.
I think I am missing something really obvious. Has anyone else seen this before, and does anyone have suggestions for getting this to work under mod_perl?
Additional info
Using the mpm_prefork module in Apache
Relevant Apache Config
<Files "*.pl">
# SetHandler cgi-script # It works if I use cgi-script
SetHandler perl-script
PerlOptions -SetupEnv # Tried with and without this
# PerlHandler ModPerl::RegistryPrefork # Using this made no difference
PerlHandler ModPerl::Registry
</Files>
OK, so there were many solutions that did not work. This is what I wound up doing. I created a script called launchjob.pl:
#!/usr/bin/perl -w
use strict;
use warnings;
use POSIX 'setsid';
use Proc::Daemon;
my $bindir = $ARGV[0];
my $ddir = $ARGV[1];
my $jid = $ARGV[2];
my $cmd = $ARGV[3];
setsid or die "Cannot start a new session: $!";
my $daemon = Proc::Daemon->new (
work_dir => $bindir,
child_STDOUT => $ddir.'/'.$jid.'_stdout.log',
child_STDERR => $ddir.'/'.$jid.'_stderr.log',
pid_file => $ddir.'/'.$jid.'_pid.txt',
exec_command => $cmd,
);
my $pid = $daemon->Init();
exit 1;
I replaced the code in the mainline, where I was calling Proc::Daemon, with the following.
The first attempt did not work; Proc::Daemon gave me an "sh: No such file or directory" error:
system('launchjob.pl', $bindir, $ddir, $jid, $cmd);
Instead I used the following which seems to run as expected.
use IPC::Run3;
run3("launchjob.pl ".$bindir." ".$ddir." ".$jid." ".$cmd);
This seems to have fixed it right up.
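A likely reason the list-form system() call failed is that a bare launchjob.pl is looked up on PATH (the list form bypasses the shell entirely), so an explicit path to the script would probably also have worked. A small sketch of that, with the running perl binary ($^X) standing in for launchjob.pl:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# The list form of system() passes each argument untouched to the
# program (no shell involved), but the program name itself must be an
# explicit path or be found on PATH. $^X (the current perl binary) is
# used here as a stand-in for the real path to launchjob.pl.
my @args = ('-e', 'exit 0');            # placeholder arguments
my $rc = system($^X, @args);
die "could not start program: $!" if $rc == -1;
print "exit status: ", $rc >> 8, "\n";  # prints "exit status: 0"
```

Compared with building one command string for IPC::Run3, the list form also avoids any quoting problems if $cmd ever contains spaces or shell metacharacters.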

Perl - Script working in Padre, Windows. But not in Linux Ubuntu

I've tried to transfer the script I've been working on in Windows over to Linux, but it fails at the first hurdle.
use strict;
use warnings;
my $keywordFile = 'keyword.txt';
open(keyWords, $keywordFile) or die "$keywordFile not found\n";
my @keywordArray;
while ( my $line = <keyWords> ) {
chomp $line;
push @keywordArray, $line;
}
close(keyWords);
It keeps on dying, even though there is a file called 'keyword.txt' in the same location. Is the issue coming from Ubuntu, or is my Perl wrong?
It keeps on dying, even though in the same destination there is a file called 'keyword.txt'
Relative paths are resolved against the current working directory (i.e. the directory the script is run from), not the directory in which the script is located.
You should either
Run your script from the directory that contains all needed files
Use the FindBin module to get the script's location and use that path to refer to file names,
e.g.
use FindBin;
my $keywordFile = "$FindBin::Bin/keyword.txt";
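Putting that together, here is a self-contained sketch of the corrected script; it first creates a sample keyword.txt next to itself purely so the example can run on its own (in the real script the file already exists):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use FindBin;    # note the spelling: FindBin, not FinBin

# Resolve the file relative to the script's directory, not the
# current working directory, so the script works from anywhere.
my $keywordFile = "$FindBin::Bin/keyword.txt";

# Setup only so this sketch is runnable: create a sample file.
open(my $out, '>', $keywordFile) or die "Can't create $keywordFile: $!";
print $out "alpha\nbeta\n";
close($out);

open(my $kw, '<', $keywordFile) or die "$keywordFile not found: $!";
my @keywordArray;
while (my $line = <$kw>) {
    chomp $line;
    push @keywordArray, $line;
}
close($kw);
print scalar(@keywordArray), " keywords loaded\n";   # prints "2 keywords loaded"
```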

syslog-ng perl pipe dropping events

I'm trying to pipe syslogs to a Perl script via syslog-ng, but not all of the syslog entries make it through; maybe only 1 in 3 actually do.
I've looked all over the place and can't find anyone remotely having the problem that I have. It seems so simple but I can't find the answer!
Here's my syslog-ng setup:
source s_1 { tcp(port(514)); };
destination d_zen { program("/tmp/zen.pl"); };
log { source(s_1); destination(d_zen); };
and here's my perl script:
#!/usr/bin/perl
use strict;
use warnings;
$| = 1;
my $filename = "/tmp/zen.log";
open(my $fh, '>>', $filename) or die "could not open file '$filename' $!";
while ( <STDIN> ) {
print $fh <STDIN>."\n";
};
any thoughts?
Is your Perl line buffer disabled?
According to the syslog-ng manual, it can cause some problems:
"Certain external applications buffer the log messages, which might cause unexpected latency and other problems. For example, if you send the log messages to an external Perl script, Perl uses a line buffer for terminal output and block buffer otherwise. You might want to disable buffering in the external application."
Also, I don't know how your script reads the incoming messages (I don't know Perl), but I think it should use a loop to keep reading them.
So syslog-ng should start the script once at startup, and it should keep running and processing messages.
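That advice can be sketched as follows; the log path and append mode are assumptions rather than details from the original script:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;

# Disable buffering on both ends so messages are flushed immediately
# rather than sitting in a block buffer.
STDOUT->autoflush(1);              # same effect as $| = 1
open(my $fh, '>>', '/tmp/zen.log') or die "open: $!";
$fh->autoflush(1);

# One long-running loop: syslog-ng starts the program once and feeds
# it one message per line on stdin for as long as it runs.
while (my $line = <STDIN>) {
    print $fh $line;
}
```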
HTH,
Regards,
Robert Fekete
I figured out the problem. My while loop wasn't built properly:
#!/usr/bin/perl
$|=1;
use strict;
use warnings;
my $filename = "/tmp/zen.log";
open(my $fh, '>', $filename) or die "could not open file '$filename' $!";
while ( my $line = <> ) {
print $fh $line; # write each message as it arrives, instead of re-printing an accumulated string
};

FastCGI with perl - On shared Linux webhost

I am trying to build an online "live chat" service, and for many reasons I found FastCGI to be suitable for that (as per its documentation), but I cannot seem to get it running.
I am using shared hosting with Apache 2.2 with mod_fcgid installed.
My .htaccess file has the following line added:
AddHandler fcgid-script .fcgi
My perl test script named fcgitest.fcgi is as follows:
#!/usr/bin/perl
# fcgitest.fcgi
use diagnostics;
use warnings;
use strict;
use CGI;
use CGI::Carp 'fatalsToBrowser'; # tester only!
use FCGI;
use IO::Handle;
my %env;
my $in  = IO::Handle->new;
my $out = IO::Handle->new;
my $err = IO::Handle->new;
my $request=FCGI::Request($in, $out, $err, \%env);
if($request->IsFastCGI()==0) {
print "Content-Type: text/plain\n\n"; binmode STDOUT; print "ERR"; exit 0;
}
my $tm=time();
while($request->Accept() >= 0) {
my $env=$request->GetEnvironment();
print "Content-Type: text/plain\n\n"; binmode STDOUT;
print time()." ".$env;
if(time()>($tm+60)) { $request->Finish(); exit 0; }
}
print "Content-Type: text/plain\n\n"; binmode STDOUT; print "---"; exit 0;
When I call this script from one of my pages, I get an Internal Server Error (code 500), with NO explanation and NO error entry in the server log file.
I tried stripping out all the code and leaving only the print statement, but the problem remains the same.
I tried moving the file into the fcgi-bin directory, but the problem remains.
I have checked that the Perl module is properly installed.
I have no idea what can cause this error, as my hosting supplier says the server is well-configured for FCGI...
What shared hosting are you using? Most shared hosts have FCGI already installed, and you don't need to test the FCGI module on your own.
For example, on GoDaddy shared hosting, .fcgi/.fpl and even my .pl files would run over FastCGI instead of normal CGI. No extra effort.
Try different file permissions, such as 644, 700, 750, or 755, for the script you are running.
Also, try adding the line:
print "Content-type:text/html\n\n";
after the use CGI::Carp line.

Read and Write Operation in perl script

I am new to Perl scripting.
I want to do read and write operations on a file. I will open the file in read-write mode (+<) and write into it. Then I want to read back what I have just written. Below is my code:
#!/usr/bin/perl
`touch file.txt`; # create the file first, since opening in +< mode requires it to exist
open (OUTFILE, "+<file.txt") or die "Can't open file : $!";
print OUTFILE "Hello, welcome to File handling operations in perl\n"; #write into the file
$line = <OUTFILE>; #read from the file
print "$line\n"; #display the read contents.
When I display the read contents, it shows a blank line. But the file file.txt does contain the data:
Hello, welcome to File handling operations in perl
Why am I not able to read the contents? Is my code wrong, or am I missing something?
The problem is that your filehandle position is located after the line you have written. Use the seek function to move the "cursor" back to the top before reading again.
An example, with some extra comments:
#!/usr/bin/env perl
# use some recommended safeguards
use strict;
use warnings;
my $filename = 'file.txt';
`touch $filename`;
# use indirect filehandle, and 3 argument form of open
open (my $handle, "+<", $filename) or die "Can't open file $filename : $!";
# by the way, good job on checking open success!
print $handle "Hello, welcome to File handling operations in perl\n";
# seek back to the top of the file
seek $handle, 0, 0;
my $line = <$handle>;
print "$line\n";
If you will be doing lots of reading and writing, you may want to try (though not everyone recommends it) Tie::File, which lets you treat a file like an array: you access lines by line number, and newlines are written automatically.
#!/usr/bin/env perl
# use some recommended safeguards
use strict;
use warnings;
use Tie::File;
my $filename = 'file.txt';
tie my @file, 'Tie::File', $filename
or die "Can't open/tie file $filename : $!";
# note file not emptied if it already exists
push @file, "Hello, welcome to File handling operations in perl";
push @file, "Some more stuff";
print "$file[0]\n";
This is a seemingly common beginner mistake. Most often you will find that reading and writing to the same file, while possible, is not worth the trouble. As Joel Berger says, you can seek to the beginning of the file. You can also simply re-open the file. Seeking is not as straightforward as reading line by line, and will present you with difficulties.
Also, you should note that creating an empty file beforehand is not required. Simply do:
open my $fh, ">", "file.txt" or die $!;
print $fh "Hello\n";
open $fh, "<", "file.txt" or die $!;
print <$fh>;
Note that:
using open on an already-open file handle will automatically close it first.
I use the three-argument open and a lexical (declared with my) file handle, which is the recommended way.
you do not need to add a newline when printing a variable read in line-by-line mode, as it will already have a newline at the end (except possibly the last line of the file).
you can use print <$fh>: since print supplies list context, <$fh> returns all the lines from the file handle, printing the entire file.
If you only want to print one line, you can do:
print scalar <$fh>; # put <$fh> in scalar context
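The notes above fit into one small self-contained sketch (the file name context_demo.txt is just illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Write a two-line file, then read it back both ways.
open my $fh, '>', 'context_demo.txt' or die $!;
print $fh "first\nsecond\n";

# Re-opening the same lexical handle closes it automatically.
open $fh, '<', 'context_demo.txt' or die $!;
my $one  = <$fh>;    # scalar context: one line, newline included
my @rest = <$fh>;    # list context: all remaining lines
print $one;                              # prints "first"
print scalar(@rest), " line(s) left\n";  # prints "1 line(s) left"
unlink 'context_demo.txt';
```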
