I have a script that parses a log and finds errors and warnings.
I want to present a user-friendly interpretation of this log,
and for this reason I open it in Notepad.
Here is the code:
use v5.16;
use strict;
use warnings;
use Win32::Clipboard;
use threads;
use utf8;

my $kp = Win32::Clipboard->new();
my $output = shift || "out_log.txt";
#my ($input, $output) = @ARGV;
#open my $ih, "<", $input or die "can't open input file with log\n";
open my $oh, ">", $output or die "can't open output file with log\n";

my @alls = split /\n/, $kp->Get();
for my $k (0 .. $#alls) {
    $_ = $alls[$k];
    if (/^ERR.+|^WARN.+/) {
        print {$oh} qq(at position $k --> ) . $_ . "\n";
    }
}

my $thread = threads->create(sub {
    $SIG{INT} = sub { die "All good\n"; };
    qx(notepad $output);
});
print qq(type 'y' for quit);
do {
    print "want to quit?\n>";
    chomp;
    do {
        say "I will kill this thread";
        $thread->kill('INT') if defined($thread);
        say "and delete output";
        unlink $output;
        exit(0);
    } if (m/y/);
} while (<>);
It crashes when I try to kill the thread that runs Notepad.
How can I do this using signals and threads? Is it even possible?
And please share any ideas you have for a solution.
Thanks!
This isn't working because your SIGINT never gets passed to notepad, so it never gets closed. (And that handler probably never gets processed.)
You need to approach this differently. Look at Win32::Process for some examples of how to spawn/kill a notepad process.
use Win32::Process;

my $ProcessObj;
Win32::Process::Create( $ProcessObj,
    "C:\\Windows\\system32\\notepad.exe",
    "notepad", 0, NORMAL_PRIORITY_CLASS, "." )
    or die $!;
And then you can use:
$ProcessObj->Kill(1);
I'd suggest using Thread::Semaphore or some sort of shared variable to decide if you want to kill your notepad.
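Here is a minimal sketch of that approach (assuming the output file name from the question; note there are no threads or signals at all):

use strict;
use warnings;
use Win32::Process;

my $output = shift || "out_log.txt";

my $proc;
Win32::Process::Create( $proc,
    "C:\\Windows\\system32\\notepad.exe",
    "notepad $output", 0, NORMAL_PRIORITY_CLASS, "." )
    or die Win32::FormatMessage( Win32::GetLastError() );

print "type 'y' to quit\n> ";
while (<STDIN>) {
    chomp;
    if (/y/) {
        $proc->Kill(0);   # terminate Notepad with exit code 0
        unlink $output;   # and delete the output file
        last;
    }
    print "> ";
}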
The string to be searched for is:
the file_is being created_automaically {
period=20ns }
The Perl script I am using follows (it works fine for a single-line string but not for a multi-line one):
#!/usr/bin/perl
my $dir = "/home/vikas";
my @files = glob( $dir . '/*' );
#print "@files";
system ("rm -rf $dir/log.txt");
my $list;
foreach $list (@files) {
    if( !open(LOGFILE, "$list")){
        open (File, ">>", "$dir/log.txt");
        select (File);
        print " $list \: unable to open file";
        close (File);
    else {
        while (<LOGFILE>){
            if($_ =~ /".*the.*automaically.*\{\n.*period\=20ns.*\}"/){
                open (File, ">>", "$dir/log.txt");
                select (File);
                print " $list \: File contain the required string\n";
                close (File);
                break;
            }
        }
        close (LOGFILE);
    }
}
This code does not compile; it contains errors that cause it to fail to execute. You should never post code that you have not first tried to run.
The root of your problem is that for a multiline match, you cannot read the file in line-by-line mode; you have to slurp the whole file into a variable. However, your program contains many flaws. I will demonstrate. What follows are excerpts of your code (with fixed indentation and missing curly braces).
First off, always use:
use strict;
use warnings;
This will save you many headaches and long searches for hidden problems.
system ("rm -rf $dir/log.txt");
This is better done in Perl, where you can control for errors:
unlink "$dir/log.txt" or die "Cannot delete '$dir/log.txt': $!";
foreach my $list (@files) {
#       ^^
Declare the loop variable in the loop itself, not before it.
if( !open(LOGFILE, "$list")){
    open (File, ">>", "$dir/log.txt");
    select (File);
    print " $list \: unable to open file";
    close (File);
You never have to explicitly select a file handle before you print to it; you just print to the file handle: print File "....". What you are doing is replacing the STDOUT file handle, which is not a good thing to do.
Also, this is error logging, which should go to STDERR instead. That can be done simply by opening STDERR to a file at the beginning of your program. Why do this? Because the program may not be running at a terminal, for example via the web or some other process where STDERR does not show up on your screen. Otherwise, it is just extra work while debugging.
open STDERR, ">", "$dir/log.txt" or die "Cannot open 'log.txt' for overwrite: $!";
This has the added benefit that you no longer have to delete the log first. Now you do this instead:
if (! open LOGFILE, $list ) {
    warn "Unable to open file '$list': $!";
} else ....
warn goes to STDERR, so it is basically the same as print STDERR.
Speaking of open, you should use the three-argument open with an explicit lexical file handle. So it becomes:
if (! open my $fh, "<", $list )
} else {
while (<LOGFILE>) {
Since you are looking for a multiline match, you need to slurp the file(s) instead. This is done by setting the input record separator to undef. Typically like this:
my $file = do { local $/; <$fh> }; # $fh is our file handle, formerly LOGFILE
Next, how to apply the regex:
if($_ =~ /".*the.*automaically.*\{\n.*period\=20ns.*\}"/) {
The $_ =~ part is optional: a regex matches against $_ automatically if no other variable is bound to it.
You should probably not use " in the regex, unless you have " in the target string. I don't know why you put it there; maybe you think strings need to be quoted inside a regex. If so, that is wrong. To match the string you have above, you do:
if( /the.*automaically.*{.*period=20ns.*}/s ) {
You don't have to escape the curly braces {} or the equals sign =, and you don't need quotes. The /s modifier makes . (the wildcard character) also match newlines, so we can remove the \n. We can also drop .* from the start and end of the pattern, because regex matches are always partial unless anchors are used.
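A quick demonstration of the difference /s makes, using the sample text from the question:

my $txt = "the file_is being created_automaically {\n period=20ns }\n";
print "match\n"    if     $txt =~ /the.*automaically.*{.*period=20ns.*}/s;  # /s lets . cross the newline
print "no match\n" unless $txt =~ /the.*automaically.*{.*period=20ns.*}/;   # without /s, . stops at \n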
break;
The break keyword is only used with the switch feature, which is experimental, and you are neither using nor enabling it, so here it is just a bareword, which is wrong. If you want to exit a loop prematurely, you use last. Note that we don't need it here at all, because we slurp the file, so there is no loop.
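For completeness, this is how last would look in a line-by-line loop (a sketch; the slurp version below does not need it):

while (my $line = <$input>) {
    if ($line =~ /period=20ns/) {
        print "found\n";
        last;   # leave the loop at the first match
    }
}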
Also, you should generally pick suitable variable names. If you have a list of files, the variable holding each file name should not be called $list; it is more logical to call it $file. And the input file handle should not be called LOGFILE; call it $input, or $infh (input file handle).
This is what I get if I apply the above to your program:
use strict;
use warnings;

my $dir = "/home/vikas";
my @files = glob( $dir . '/*' );
my $logfile = "$dir/log.txt";
open STDERR, ">", $logfile or die "Cannot open '$logfile' for overwrite: $!";

foreach my $file (@files) {
    if (! open my $input, "<", $file) {
        warn "Unable to open '$file': $!";
    } else {
        my $txt = do { local $/; <$input> };
        if ($txt =~ /the.*automaically.*{.*period=20ns.*}/s) {
            print " $file : File contain the required string\n";
        }
    }
}
Note that the print goes to STDOUT, not to the error log. It is not common practice to send STDOUT and STDERR to the same file. If you want that, you can simply redirect output in the shell, like this:
$ perl foo.pl > output.txt
The following sample code demonstrates the use of a regex for the multiline case, together with a logger($fname, $msg) subroutine.
The snippet assumes that the input files are relatively small and can each be read into the variable $data (i.e. the computer has enough memory for that).
NOTE: the input data files should be distinguishable from the other files in the home directory $ENV{HOME}; in this sample they are assumed to match the pattern test_*.dat. You probably do not intend to scan absolutely every file in your home directory (there could be many thousands of files while you are interested in only a few).
#!/usr/bin/env perl
use strict;
use warnings;
use feature 'say';

my($dir,$re,$logfile);

$dir     = '/home/vikas/';
$re      = qr/the file_is being created_automaically \{\s+period=20ns\s+\}/;
$logfile = $dir . 'logfile.txt';

unlink $logfile if -e $logfile;

for ( glob($dir . "test_*.dat") ) {
    if( open my $fh, '<', $_ ) {
        my $data = do { local $/; <$fh> };
        close $fh;
        logger($logfile, "INFO: $_ contains the required string")
            if $data =~ /$re/gsm;
    } else {
        logger($logfile, "WARN: unable to open $_");
    }
}

exit 0;

sub logger {
    my $fname = shift;
    my $text  = shift;

    open my $fh, '>>', $fname
        or die "Couldn't open $fname: $!";
    say $fh $text;
    close $fh;
}
Reference: regex modifiers, unlink, perlvar
I'm trying to capture the output my Perl code generates, both from print and similar statements and from external commands.
Due to design constraints I can't use solutions like Capture::Tiny. I need the output forwarded to the buffer variable as soon as it is generated, and I need to be able to differentiate between STDOUT and STDERR. Ideally, a solution for external commands would work just like system, except that it captures STDOUT and STDERR instead of printing them.
My code is supposed to:
Save the old STDOUT/STDERR file handles.
Create new ones for both STDERR and STDOUT.
Redirect all the output to this place.
Print a couple of things.
Restore the old filehandles.
Do something with the captured output, e.g. print it.
However, I'm unable to capture the output generated by external commands: I can't do it with either IPC::Run3 or IPC::Open3.
#!/usr/bin/perl -CSDAL
use warnings;
use strict;
use IPC::Open3;
#use IPC::Run3;
# Save old filehandles
open(my $oldout, ">&STDOUT") or die "Can't dup STDOUT: $!";
open(my $olderr, ">&STDERR") or die "Can't dup STDERR: $!";
my $buffer = "";
close(STDOUT);
close(STDERR);
open(STDOUT, '>', \$buffer) or die "Can't redirect STDOUT: $!";
*STDERR = *STDOUT; # In this example STDOUT and STDERR are printed to the same buffer.
print "1: Test\n";
#run3 ["date"], undef, \*STDOUT, \*STDERR; # This doesn't work as expected
my $pid = open3("<&STDIN", ">&STDOUT", ">&STDERR", "date");
waitpid($pid,0); # Nor does this.
print STDERR "2: Test\n";
open(STDOUT, ">&", $oldout) or die "Can't dup \$oldout: $!";
open(STDERR, ">&", $olderr) or die "Can't dup \$olderr: $!";
print "Restored!\n";
print $buffer;
Expected result:
Restored!
1: Test
Mo 25. Mär 13:44:53 CET 2019
2: Test
Actual result:
Restored!
1: Test
2: Test
I don't have a solution to offer, but I can provide some explanations as to the behavior you are seeing.
First, IPC::Open3 is not supposed to work when your filehandles are variables; see this question for more explanations.
Now, why isn't IPC::Run3 working? First, notice that if you don't redirect STDERR and run
run3 ["date"], undef, \$buffer, { append_stdout => 1 };
instead of
run3 ["date"], undef, \*STDOUT;
then it works as expected. (You need to add { append_stdout => 1 }, or your previous output in $buffer will be overwritten.)
To understand what's happening, in your program, after
open(STDOUT, '>', \$buffer) or die "Can't redirect STDOUT: $!";
Add
print STDERR ref(\$buffer), "\n";
print STDERR ref(\*STDOUT), "\n";
Which will print
SCALAR
GLOB
That's exactly what IPC::Run3::run3 does to decide what to do with the "stdout" you give it (see the source: _fh_for_child_output, which is called by run3):
If it's a scalar, a temporary file is used (the corresponding line is $fh = $fh_cache{$what} ||= tempfile, where tempfile is a function from File::Temp).
On the other hand, when stdout is a GLOB (or is tied to IO::Handle), that filehandle is used directly (that's this line of code).
This explains why calling run3 with \$buffer works, but not with \*STDOUT.
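A minimal side-by-side sketch of the two conventions (the command date is just an example):

use IPC::Run3 qw( run3 );

my $buf;
run3 [ "date" ], undef, \$buf;      # SCALAR ref: run3 spools through a temp file, then fills $buf
run3 [ "date" ], undef, \*STDOUT;   # GLOB: the child writes to this handle directly
print $buf;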
When redirecting STDERR as well, and calling
run3 ["date"], undef, \$buffer, \$buffer, { append_stdout => 1, append_stderr => 1 };
, things start to get weird. I don't understand what's happening, but I'll share what I found, and hopefully someone can make sense of it.
I modified the source of IPC::Run3 and added
open my $FP, '>', 'logs.txt' or die "Can't open: $!";
at the beginning of the sub run3. When running, I only see
Restored!
1: Test
on STDOUT (my terminal), but logs.txt contains the date (something along the lines of Mon Mar 25 17:49:44 CET 2019).
Investigating a bit reveals that fileno $FP returns 1 (which, unless I'm mistaken, is usually STDOUT; but you closed it, so I'm not so surprised that its descriptor can be reused), and fileno STDOUT returns 2 (this might depend on your Perl version and other opened filehandles, though). What seems to be happening is that system assumes STDOUT is file descriptor 1 and thus prints to $FP instead of STDOUT (I'm just guessing, though).
Please feel free to comment/edit if you understand what's happening.
I ended up with the following code:
#!/usr/bin/perl -CSDAL
use warnings;
use strict;
use IPC::Run3;
use IO::Scalar;
use Encode;
use utf8;
# Save old filehandles
open(my $oldout, ">&STDOUT") or die "Can't dup STDOUT: $!";
open(my $olderr, ">&STDERR") or die "Can't dup STDERR: $!";
open(my $FH, "+>>:utf8", undef) or die $!;
$FH->autoflush;
close(STDOUT);
close(STDERR);
open(STDOUT, '>&', $FH) or die "Can't redirect STDOUT: $!";
open(STDERR, '>&', $FH) or die "Can't redirect STDERR: $!";
print "1: Test\n";
run3 ["/bin/date"], undef, $FH, $FH, { append_stdout => 1, append_stderr => 1 };
print STDERR "2: Test\n";
open(STDOUT, ">&", $oldout) or die "Can't dup \$oldout: $!";
open(STDERR, ">&", $olderr) or die "Can't dup \$olderr: $!";
print "Restored!\n";
seek($FH, 0, 0);
while (<$FH>) {
    # No idea why this is even required
    print Encode::decode_utf8($_);
}
close($FH);
This is far from what I originally wanted, but appears to be working at least.
The issues I have with this are:
I need an anonymous file, which creates clutter on the hard disk.
For some reason I need to fix the encoding manually.
Thank you very much to the people who dedicated their time helping me out here.
Is there a reason you need to use the parent's STDOUT and STDERR? IPC::Open3 is easily capable of redirecting the child's STDOUT and STDERR to unrelated handles in the parent, which you can read from.
use strict;
use warnings;
use IPC::Open3;
my $pid = open3 undef, my $outerr, undef, 'date';
my $output = do { local $/; readline $outerr };
waitpid $pid, 0;
my $exit = $? >> 8;
This will read STDOUT and STDERR together; if you want to read them separately, you need to pass my $stderr = Symbol::gensym as the third argument (as shown in the IPC::Open3 docs) and use a non-blocking loop to avoid deadlocking when reading both handles. IO::Async::Process or similar can fully automate this for you, but IPC::Run3 provides a much simpler solution if you only need to store the output in scalar variables. IPC::Run3 and Capture::Tiny can also both easily be fatpacked for deployment in scripts.
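A rough sketch of the separate-capture variant (fine for a terse command like date; a chatty child needs the non-blocking loop mentioned above to avoid deadlock):

use IPC::Open3;
use Symbol 'gensym';

my $stderr = gensym;    # stderr needs a pre-created glob
my $pid = open3(my $stdin, my $stdout, $stderr, 'date');
close $stdin;           # we send no input
my $out = do { local $/; readline $stdout };
my $err = do { local $/; readline $stderr };
waitpid $pid, 0;
my $exit = $? >> 8;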
This is not an answer yet, but it seems like open3 requires STDOUT to be a regular tty file handle at the time you call open3, for example:
use feature qw(say);
use strict;
use warnings;
use IPC::Open3;
use Symbol 'gensym';

{
    local *STDOUT;  # <-- if you comment out this line open3 works as expected
    my ($chld_in, $chld_out);
    my $chld_err = gensym;
    my $pid;
    eval {
        $pid = open3($chld_in, $chld_out, $chld_err, "date");
    };
    if ( $@ ) {
        say "IPC::Open3::open3 failed: '$@'";
    }
    print "-> $_" for <$chld_out>;
    waitpid $pid, 0;
    # say "Cannot print to invalid handle..";
}
say "ok";
say "ok";
Output:
ma. 25. mars 16:00:01 +0100 2019
ok
Note that the arrow -> at the beginning of the line is missing,
so nothing could be read from $chld_out in this case. However, if I comment out the line:
local *STDOUT;
The output is:
-> ma. 25. mars 16:01:10 +0100 2019
ok
I am simply trying to find out how to properly use the open2 function.
See the example below. It works for a small $max but, naturally, if I write enough to $hIn, it will eventually block because nothing is reading the data on the output side.
use 5.26.0;
use IPC::Open2;

my $max = 100000;
my $pid = open2(my $hOut, my $hIn, "cat") || die "failed 'cat': $!";

{
    my $cnt = 0;
    # When $max is big (e.g. 100000) the code below will get blocked
    # when writing to $hIn
    while ($cnt < $max) { say $hIn $cnt++; }
    close($hIn) || say "can't close hIn";
}

while (<$hOut>) { print; }
close($hOut) || say "can't close hOut";

waitpid( $pid, 0 );
The only solution I can think of is launching another thread that does the writing in the background.
With the code below I can write as much data as I want to $hIn and read it in the main thread, but $hIn never seems to get closed. Because of that, the while(<$hOut>) loop never finishes; it keeps waiting for more output.
use 5.26.0;
use threads;
use IPC::Open2;

my $max = 100000;
my $pid = open2(my $hOut, my $hIn, "cat") || die "failed 'cat': $!";

my $thr = threads->create(sub {
    my $cnt = 0;
    while ($cnt < $max) { say $hIn $cnt++; }
    # The close does not have any effect here (although no error is produced)
    close($hIn) || say "can't close hIn";
});

# This outputs all the data written to $hIn but never leaves the loop...
while (<$hOut>) { print; }
close($hOut) || say "can't close hOut";

$thr->join;
waitpid( $pid, 0 );
My questions are:
Provided that my approach with threads is ok, how can I close the file handle from the background thread?
If it is not OK (use threads is actually discouraged in Perl), can someone provide a working example of open2 that can write and read a lot of data without the risk of blocking on the reader or writer?
EDIT: Following your suggestions, here is an implementation of the code above using IPC::Run:
use 5.26.0;
use IPC::Run qw/ run /;

my $max = 1000000;

run sub {
        my $cnt = 0;
        while ($cnt < $max) { say $cnt++; }
    },
    "|", "cat",
    "|", sub {
        while (<>) {
            print;
        }
    }
    or die "run sub | cat | sub failed: $?";
It runs without flaws and the code is very readable. I am very happy to have learned about this module; thanks to everyone!
Yet I consider the question unanswered: if it is not possible to write this functionality using open2 directly, why does open2 even exist and confuse people? Also, not being able to close the file handle from a different thread looks like a bug to me (certainly it is one: the close should at least report an error).
Your program stopped because the pipe to which it was writing became full.
The pipe to cat became full because cat stopped reading from it.
cat stopped because the pipe to which it was writing became full.
The pipe from cat became full because your program isn't reading from it.
So you have two programs waiting for each other to do something. This is a deadlock.
The low-level solution is to use select to monitor both ends of the pipe.
The high-level solution is to let IPC::Run or IPC::Run3 do that hard work for you.
use IPC::Run qw( run );

my $cnt_max = 100000;
my $cnt     = 0;

run [ "cat" ],
    '<', sub { $cnt < $cnt_max ? $cnt++ . "\n" : undef };
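For comparison, a minimal IPC::Run3 sketch of the same pipeline, buffering input and output in scalars (run3 does the pumping, so there is no deadlock):

use IPC::Run3 qw( run3 );

my $in = join "", map { "$_\n" } 0 .. 99_999;
my $out;
run3 [ "cat" ], \$in, \$out;
print $out;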
I am looking to run two subroutines in parallel, where both perform the same task on an Android handset using ADB commands. With help from SO and other research I have produced the code below, but I am new to multithreading and I get a 'Free to wrong pool' error during execution. I am assuming this is because I use the $_ variable in both threads; is that correct? I am running this on Windows 7, and my Perl interpreter crashes when executing the script. Any guidance would be greatly appreciated. Thanks.
use strict;
use Win32::OLE;
use EFS_Handle;
use HelperFunctions;
use threads;
#### Various ADB command sequences follow ####
#### Start of multithread to run the same function on different android handsets ####
my @jobs;

push @jobs, threads->create(
    sub {
        print "\n\t" . curTime() . " :\t DUT Time at start of MPLMN search";
        open my $fh1, '>', "output.txt" or die "Cannot open output.txt: $!";
        my $pid1 = open my $log1, "-|", "adb -s 42d8d7dd logcat";
        system('adb -s 42d8d7dd shell input keyevent KEYCODE_ENTER');
        while (<$log1>) {
            $fh1->print($_);
            last if m/Sorted scan results/;
        }
        kill "TERM", $pid1;
        close $log1;
        print "\n\t" . curTime() . " :\t DUT Time at End of MPLMN search\n";
    }
);

push @jobs, threads->create(
    sub {
        print "\n\t" . curTime() . " :\t REF Time at start of MPLMN search";
        open my $fh, '>', "output.txt" or die "Cannot open output.txt: $!";
        my $pid = open my $log, "-|", "adb -s 0123456789ABCDEF logcat";
        system('adb -s 0123456789ABCDEF shell input keyevent KEYCODE_ENTER');
        while (<$log>) {
            $fh->print($_);
            last if m/EVENT_NETWORK_SCAN_COMPLETED/;
        }
        kill "TERM", $pid;
        close $log;
        print "\n\t" . curTime() . " :\t REF Time at End of MPLMN search\n";
    }
);

$_->join for @jobs;
The Win32::OLE module is famously not thread-safe.
If you remove use Win32::OLE (you don't seem to use it anyway), your code will run fine.
I have my doubts about adb cooperating with multiple simultaneous commands, but that is a different matter.
I think the problem could be related to the fact that you are writing to the same file "output.txt" from both threads with ">". Try opening the handles with ">>" instead.
Also remember to close that file.
"I am assuming I get this as I am using the $_ variable in both threads..."
If you use strict, and $_ is used inside separate subroutines, there should be no global-access problems.
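That said, a defensive sketch is to avoid $_ in the read loops entirely and use a lexical instead (shown here for the first thread's loop):

while (my $line = <$log1>) {
    $fh1->print($line);
    last if $line =~ m/Sorted scan results/;
}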
My expectation:
I want to use the following command to pass the value of the first argument to every file that loads perl.pl:
./perl.pl 1
The 1 is read by the following file (perl.pl):
#!/usr/bin/perl
package Black;
use strict;
use warnings;

#subroutines
sub get_x();

#variables
our $XE = -1;

my ($param1, $param2, $param3) = @ARGV;
my $x = get_x();

sub get_x()
{
    $XE = $param1;
    return $XE;
}

exit;
Then I wrote another script which runs some code based on the input to perl.pl (0 or 1).
That file is ./per.pl and I invoke it from the Linux terminal like this: ./per.pl
Here is the code I wrote for it:
#!/usr/bin/perl
require "perl.pl";

my $xd = Black::get_x();
if ($xd == 1) {
    print $xd;
} else {
    print "5";
}
exit;
But this is what I get when I run these commands:
./perl.pl 1
I tried to print the value and it prints 1 (I then removed that print line from the code).
./per.pl
And now I get nothing. I would like the 1 to be printed, but it isn't.
Thanks in advance!
Before we get started: you cannot possibly get the output you say you get, because you tell the process to exit when the module is executed by require, so Black::get_x() is never reached. Change exit; to 1; (a required file must end with a true value).
Now on to your question. If I understand correctly, you want to pass a value to one process via its command line, and fetch that value by executing the same script without the parameter.
You did not even attempt to pass the variable from one process to another, so it shouldn't be a surprise that it doesn't work. Since the two processes don't even exist at the same time, you'll need to store the value somewhere, such as on the file system.
#!/usr/bin/perl
use strict;
use warnings;

my $conf_file = "$ENV{HOME}/.black";
my $default   = -1;

sub store {
    my ($val) = @_;
    open(my $fh, '>', $conf_file) or die $!;
    print $fh "$val\n";
    return $val;
}

sub retrieve {
    open(my $fh, '<', $conf_file)
        or do {
            return $default if $!{ENOENT};
            die $!;
        };
    my $val = <$fh>;
    chomp($val);
    return $val;
}

my $xd = @ARGV ? store($ARGV[0]) : retrieve();
print("$xd\n");