I'm trying to pipe syslogs to a Perl script via syslog-ng, but not all of the syslog entries make it through - maybe one in three actually arrives.
I've looked all over the place and can't find anyone with remotely the same problem. It seems so simple, but I can't find the answer!
Here's my syslog-ng setup:
source s_1 { tcp(port(514)); };
destination d_zen { program("/tmp/zen.pl"); };
log { source(s_1); destination(d_zen); };
and here's my perl script:
#!/usr/bin/perl
use strict;
use warnings;
$|=1;
my $filename = "/tmp/zen.log";
open(my $fh, '>>', $filename) or die "could not open file '$filename' $!";
while ( <STDIN> ) {
    print $fh <STDIN>."\n";
};
any thoughts?
Is your Perl line buffer disabled?
According to the syslog-ng manual, it can cause some problems:
"Certain external applications buffer the log messages, which might cause unexpected latency and other problems. For example, if you send the log messages to an external Perl script, Perl uses a line buffer for terminal output and block buffer otherwise. You might want to disable buffering in the external application."
Also, I don't know how your script reads the incoming messages (I don't know Perl), but I think it should use a loop to keep reading incoming messages.
So syslog-ng should start the script once at startup, and it should keep running and processing messages.
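For example, a minimal unbuffered reader might look like this (just a sketch; it assumes the program() destination hands the script one message per line on standard input):
#!/usr/bin/perl
use strict;
use warnings;
$| = 1;    # disable output buffering, per the manual's warning above
# started once by syslog-ng; keeps running, handling one message per line
while ( my $msg = <STDIN> ) {
    chomp $msg;
    # ... process $msg here ...
}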
HTH,
Regards,
Robert Fekete
I figured out the problem. My while loop wasn't built properly:
#!/usr/bin/perl
use strict;
use warnings;
$|=1;
my $filename = "/tmp/zen.log";
open(my $fh, '>', $filename) or die "could not open file '$filename' $!";
while ( <> ) {
    print $fh $_;    # write the current line; don't read from STDIN again inside the loop
}
So I am new to Perl and am trying to simply open a directory and list all its files. When I run the very simple code below, trying to print everything in /usr/bin, it will not work, and no matter what I try I keep getting told 'Could not open /usr/bin: No such file or directory'.
Any help would be much appreciated!
#!/usr/bin/perl
$indir = "/usr/bin";
# read in all files from the directory
opendir (DIR, @indir) or die "Could not open $indir: $!\n";
while ($filename = readdir(DIR)) {
    print "$filename\n";
}
closedir(DIR);
Here is another place where the very basic troubleshooting step of use strict; and use warnings; has been omitted, and it would have told you exactly what was wrong.
Global symbol "@indir" requires explicit package name (did you forget to declare "my @indir"?)
Of course, you'd also have to fix a few other errors (e.g. my $indir = '/usr/bin';)
I would also suggest that readdir is not well suited for this job, and would tend to recommend glob:
#!/usr/bin/env perl
use strict;
use warnings;
my $indir = "/usr/bin";
# read in all files from the directory
foreach my $filename ( glob "$indir/*" ) {
    print "$filename\n";
}
Note how this differs - it prints a full path to the file, and it omits certain things (like . and ..), which is, in my opinion, more generally useful. Not least because another really common error is to open my $fh, '<', $filename or die $!, forgetting that the file is not in the current working directory.
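For comparison, a corrected readdir version (a sketch) has to skip the dot entries and prepend the directory itself:
#!/usr/bin/env perl
use strict;
use warnings;
my $indir = '/usr/bin';
opendir my $dh, $indir or die "Could not open $indir: $!";
while ( my $filename = readdir $dh ) {
    next if $filename eq '.' or $filename eq '..';    # skip the dot entries
    print "$indir/$filename\n";                       # prepend the directory for a usable path
}
closedir $dh;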
I am trying to follow log files in Perl on Fedora, but unfortunately Fedora stores its logs in binary journal files that I cannot parse directly. As far as I understand, this means I can only read Fedora's log files by calling journalctl.
I tried using IO::Pipe to do this, but the problem is that $p->reader(..) waits until journalctl --follow is done writing output (which will be never, since --follow behaves like tail -F) before it lets me print anything, which is not what I want. I would like to be able to set a callback function that is called each time a new line is printed to the process pipe, so that I can parse/handle each new log event.
use IO::Pipe;
my $p = IO::Pipe->new();
$p->reader("journalctl --follow"); #Waits for process to exit
while (<$p>) {
    print;
}
I assume that journalctl is working like tail -f. If this is correct, a simple open should do the job:
use Fcntl; # Import SEEK_CUR
my $pid = open my $fh, '-|', 'journalctl --follow'
    or die "Error $! starting journalctl";
while (kill 0, $pid) {
    while (<$fh>) {
        print $_;             # Print log line
    }
    sleep 1;                  # Wait some time for new lines to appear
    seek($fh, 0, SEEK_CUR);   # Reset EOF
}
open opens a filehandle for reading the output of the called command: http://perldoc.perl.org/functions/open.html
seek is used to reset the EOF marker: http://perldoc.perl.org/functions/seek.html Without reset, all subsequent <$fh> calls will just return EOF even if the called script issued additional output in the meantime.
kill 0,$pid will be true as long as the child process started by open is alive.
You may replace sleep 1 by usleep from Time::HiRes or select undef,undef,undef,$fractional_seconds; to wait less than a second depending on the frequency of incoming lines.
AnyEvent should also be able to do the job via its AnyEvent::Handle.
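For instance, an untested sketch with AnyEvent::Handle that invokes a callback per line:
use strict;
use warnings;
use AnyEvent;
use AnyEvent::Handle;
open my $fh, '-|', 'journalctl --follow' or die "Error $! starting journalctl";
my $done = AE::cv;
my $hdl;
$hdl = AnyEvent::Handle->new(
    fh       => $fh,
    on_eof   => sub { $hdl->destroy; $done->send },
    on_error => sub {
        my ( $h, $fatal, $msg ) = @_;
        warn "journalctl pipe error: $msg\n";
        $h->destroy;
        $done->send;
    },
    on_read  => sub {
        my ($h) = @_;
        $h->push_read( line => sub {
            my ( $h, $line ) = @_;
            print "$line\n";    # per-line callback: parse/handle the log event here
        } );
    },
);
$done->recv;    # run the event loop; returns only when journalctl exits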
Update:
Adding use POSIX ":sys_wait_h"; at the beginning and waitpid($pid, WNOHANG) to the outer loop would also detect (and reap) a zombie journalctl process:
while (kill(0, $pid) and waitpid($pid, WNOHANG) != $pid) {
A daemon might also want to check if $pid is still a child of the current process ($$) and if it's still the original journalctl process.
I have no access to journalctl, but if you avoid IO::Pipe and open the piped output directly, then the data will not be buffered:
use strict;
use warnings 'all';
open my $follow_fh, '-|', 'journalctl --follow' or die $!;
print while <$follow_fh>;
I want to use xinput to monitor the number of keystrokes and the number of mouse movements and presses. For simplification, let's say what I want is for these two commands:
xinput test 0
xinput test 1
to write to the screen at the same time.
I am using this in a Perl script like:
open(my $fh, '-|', 'xinput test 0') or die $!;
while(my $line = <$fh>) {
    ...stuff to keep count instead of logging directly to file
}
EDIT:
something like:
open(my $fh, '-|', 'xinput test 0 & xinput test 1') or die $!;
doesn't work.
I'm not sure what you want to do with the output, but it sounds like you want to run the commands simultaneously. In that case, my first thought would be to fork the Perl process once per command and then exec the child processes to the commands you care about.
foreach my $command ( @commands ) { # filter @commands for taint, etc
    if( fork ) { ... }    # parent
    else {                # child
        exec $command or die "Could not exec [$command]! $!";
    }
}
The forked processes share the same standard filehandles. If you need their data in the parent process, you'd have to set up some sort of communication between the two.
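One way to get the data back in the parent (a sketch using read pipes and IO::Select instead of fork/exec) is to open one pipe per command and multiplex them:
use strict;
use warnings;
use IO::Select;
my @commands = ( 'xinput test 0', 'xinput test 1' );
my $sel = IO::Select->new;
my %label;
for my $cmd (@commands) {
    open my $fh, '-|', $cmd or die "Could not start [$cmd]: $!";
    $sel->add($fh);
    $label{ fileno $fh } = $cmd;    # remember which pipe belongs to which command
}
while ( $sel->count ) {
    for my $fh ( $sel->can_read ) {
        if ( sysread $fh, my $buf, 4096 ) {    # sysread mixes safely with select
            print "[$label{ fileno $fh }] $buf";    # ...or update your counters here
        }
        else {
            $sel->remove($fh);    # the command exited; stop watching its pipe
            close $fh;
        }
    }
}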
There are also several Perl frameworks on CPAN for handling asynchronous multi-process stuff, such as POE, AnyEvent, and so on. They'd handle all these details for you.
If you want both commands to write to the console simultaneously, simply run them in the background:
xinput test 0 &
xinput test 1 &
But first you have to make sure that the terminal is set to a mode which allows this, otherwise the background processes will be stopped when they try to write to the console. This command switches off the stty tostop option:
stty -tostop
Hi,
I am having an odd issue with my small script to interact with a serial port using Device::SerialPort.
My code is as follows:
use strict;
use warnings;
use Carp;
use Device::SerialPort;
my $port = Device::SerialPort->new("/dev/ttyACM0") or croak("no.... $!");
$port->baudrate(9600) || die "failed setting baudrate: $!";
$port->parity("none") || die "failed setting parity: $!";
$port->databits(8) || die "failed setting databits: $! ";
$port->stopbits(1) or die "failed setting stops: $!";
$port->handshake("none") || die "failed setting handshake: $!";
$port->write_settings or croak("Failed setting... everything: $!");
my $foo = $port->write("o0") or croak("Failed writing: $!");
die("WRITE FAILED!\n") unless $foo;
die("WRITE INCOMPLETE!\n") if $foo != 2;
$port->close() or die("close.... $!");
The odd thing is that if I do cat /dev/ttyACM0 before I run the Perl code, it all works as expected.
I have tried adding open(my $fh, '<', '/dev/ttyACM0'); to open the port like cat should do, but alas, no luck.
What am I doing wrong?
Update: OK... something odd. If I set the serial port to 9600 with stty -F /dev/ttyACM0 raw speed 9600 and try to echo o0 > /dev/ttyACM0, it doesn't work. But if I do cat /dev/ttyACM0 while doing the echo, it works fine.
The device on the other end is an Arduino Mega, if that has anything to do with this.
cat > /dev/ttyACM0 works in every situation, but it is not exactly what I want, as cat never exits.
By "doesn't work," what do you mean? Does the write hang, or does the data disappear?
I suspect you're running afoul of some flow control setting. I'm not specifically familiar with the Arduino, but if it supports hardware flow control, then you might fiddle with those settings.
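If the board does respond to the modem-control lines, something like this (an untested sketch using Device::SerialPort's line controls) may be worth trying before the write:
$port->dtr_active(1);    # assert DTR; some boards hold off until they see this
$port->rts_active(1);    # assert RTS
select undef, undef, undef, 0.1;    # give the device a moment to settle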
Adding < opens the filehandle for reading, whereas you probably want it open for writing. You would therefore want to open it for write (>) or for append (>>).
Furthermore, since the OS buffers file writes, you have to flush the handle. See the perldoc documentation on autoflush ($|) for how to do it.
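Putting those two points together, a sketch:
use strict;
use warnings;
use IO::Handle;    # for autoflush
open my $fh, '>', '/dev/ttyACM0' or die "Cannot open port: $!";
$fh->autoflush(1);    # flush each print straight to the device
print $fh "o0";
close $fh or die "Cannot close port: $!";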
I am a newbie to Perl scripting.
I want to do a read and a write operation on the same file. I open the file in read-write mode (+<) and write into it. Then I want to read back whatever I have just written. Below is my code:
#!/usr/bin/perl
`touch file.txt`; # create the file first, since opening in +< mode requires it to exist
open (OUTFILE, "+<file.txt") or die "Can't open file : $!";
print OUTFILE "Hello, welcome to File handling operations in perl\n"; #write into the file
$line = <OUTFILE>; #read from the file
print "$line\n"; #display the read contents.
When I display the read contents, it shows a blank line, even though the file "file.txt" does contain the data:
Hello, welcome to File handling operations in perl
Why am I not able to read the contents? Is my code wrong, or am I missing something?
The problem is that your filehandle position is located after the line you have written. Use the seek function to move the "cursor" back to the top before reading again.
An example, with some extra comments:
#!/usr/bin/env perl
# use some recommended safeguards
use strict;
use warnings;
my $filename = 'file.txt';
`touch $filename`;
# use indirect filehandle, and 3 argument form of open
open (my $handle, "+<", $filename) or die "Can't open file $filename : $!";
# btw good job on checking open success!
print $handle "Hello, welcome to File handling operations in perl\n";
# seek back to the top of the file
seek $handle, 0, 0;
my $line = <$handle>;
print "$line\n";
If you will be doing lots of reading and writing, you may want to try (though not everyone recommends it) Tie::File, which lets you treat a file like an array: you get line access by line number, and newlines are written automatically.
#!/usr/bin/env perl
# use some recommended safeguards
use strict;
use warnings;
use Tie::File;
my $filename = 'file.txt';
tie my @file, 'Tie::File', $filename
    or die "Can't open/tie file $filename : $!";
# note file not emptied if it already exists
push @file, "Hello, welcome to File handling operations in perl";
push @file, "Some more stuff";
print "$file[0]\n";
This is a seemingly common beginner mistake. Most often you will find that reading and writing to the same file, while possible, is not worth the trouble. As Joel Berger says, you can seek to the beginning of the file. You can also simply re-open the file. Seeking is not as straightforward as reading line by line, and will present you with difficulties.
Also, you should note that creating an empty file beforehand is not required. Simply do:
open my $fh, ">", "file.txt" or die $!;
print $fh "Hello\n";
open $fh, "<", "file.txt" or die $!;
print <$fh>;
Note that:
calling open on a file handle that is already open will automatically close it first.
I use three-argument open, and a lexical (defined by my) file handle, which is the recommended way.
you do not need to add a newline when printing a variable read in line-by-line mode, as it will already have a newline at the end (or it ends at end of file).
You can use print <$fh>; because print gives <$fh> list context, it reads all the remaining lines from the file handle (printing the entire file).
If you only want to print one line, you can do:
print scalar <$fh>; # put <$fh> in scalar context