Providing a status update when "ENTER" is pressed while the program is working - multithreading

I have a Perl script that loops through a list of files and calls a binary with a different argument each time. I am using IPC::Run. I would like a status message to be displayed when the user presses a key such as "ENTER", for example:
"Currently working on 14 of 28 total scripts (50% complete)"
My script is as follows:
foreach my $file (@files) {
    $file =~ s/$file_dir//;
    # Run the test case, store the output in $stdout
    run [ "php", "PROGRAM.phar", "$file" ], ">", \my $stdout;
    print LOG_FILE "Return code $?\n";
    print LOG_FILE "Output: $stdout";
}
Basically how would I interrupt the binary in order to display my status message?

If I am correct, this usage of IPC::Run is not multithreaded. It will execute the commands one by one, and it is not possible to print messages because there is only one process.
Like:
use Parallel::ForkManager;
my $pm = Parallel::ForkManager->new($MAX_PROCESSES);
my $input;
foreach my $data (@all_data) {
    # Forks and returns the pid for the child:
    my $pid = $pm->start and next;
    ... do some work with $data in the child process ...
    $pm->finish; # Terminates the child process
    chomp($input = <STDIN>);
    print "Some statistics\n" if $input eq '';   # the user just pressed ENTER
}
Regards,
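If forking feels heavier than needed here, another route is to poll STDIN between run calls with IO::Select, so a pending ENTER press is noticed without blocking. A minimal sketch, assuming a hypothetical @files list; only the status-message format comes from the question:

use strict;
use warnings;
use IO::Select;
use IPC::Run qw(run);

my @files = glob("tests/*.php");          # hypothetical list of test scripts
my $total = scalar @files;
my $done  = 0;
my $sel   = IO::Select->new(\*STDIN);

foreach my $file (@files) {
    # Drain any pending keypresses (ENTER) without blocking...
    while ($sel->can_read(0)) {
        defined(my $line = <STDIN>) or last;   # stop polling on EOF
        printf "Currently working on %d of %d total scripts (%d%% complete)\n",
            $done + 1, $total, int(100 * $done / $total);
    }
    run [ "php", "PROGRAM.phar", $file ], ">", \my $stdout;
    $done++;
}

Note that this only notices the keypress between runs of the binary; if a single run takes a long time, the message will not appear until that run finishes, which is where forking a separate watcher process becomes necessary.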

Related

Need Commands to be run in parallel in perl expect

I have a script as shown below. The command mentioned as "Command which needs to be run" (in the for loop) executes for some time, but the process that was forked to run it is then killed, so only part of the work in "Command which needs to be run" gets done. Instead, I want all the commands to run in parallel, i.e. if the for loop runs for 50 iterations, 50 different processes need to be executing in parallel and none should be stopped in between. Could someone please suggest how to do this?
#!/usr/bin/perl
use Expect;

sub hostuser_expect() {
    $expect = Expect->new;
    $expect->raw_pty(1);
    $expect->spawn($cmd)
        or die "Cannot spawn $cmd: $!\n";
    $expect->expect($timeout,
        [ qr/.*\?/i, #/
          sub {
              my $self = shift;
              $self->send("yes\r\n");
              exp_continue;
          }
        ]);
    $expect->expect($timeout,
        [ qr/password:/i, #/
          sub {
              my $self = shift;
              $self->send("$password\n");
              exp_continue;
          }
        ]);
    #$expect->expect(1500,'-re', 'Mails marked as moved successfully\.$');
    #$expect->hard_close();
}

$timeout = 5;
$password = "changeme";
for ($i = 1; $i <= 50; $i++) {
    $cmd = "Command which needs to be run";
    print "Invoking script - $cmd\n";
    hostuser_expect();
    sleep(30);
}
I believe Parallel::ForkManager will do what you want. I don't use Expect, so I've simplified the sub as an example. I've also added use strict; and use warnings;, removed the prototype parens from the sub definition, and changed the C-style for() loop to a range-based one.
Just change the number in the call to Parallel::ForkManager->new() to raise or lower the maximum number of forks to be executing at any one time.
use warnings;
use strict;
use Parallel::ForkManager;

sub hostuser_expect {
    my $num = shift;
    print "in child $num\n";
}

my $pm = Parallel::ForkManager->new(5);

COMMANDS:
for (1..5){
    $pm->start and next COMMANDS;
    hostuser_expect($_);
    $pm->finish;
}

$pm->wait_all_children;
Output:
in child 1
in child 2
in child 5
in child 3
in child 4
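To plug the original Expect logic back in, the spawn/expect calls simply go inside the sub that runs in each child. A rough sketch, reusing the password-prompt handling and the placeholder command string from the question:

use warnings;
use strict;
use Expect;
use Parallel::ForkManager;

my $timeout  = 5;
my $password = "changeme";

sub hostuser_expect {
    my $cmd = shift;
    my $expect = Expect->new;
    $expect->raw_pty(1);
    $expect->spawn($cmd) or die "Cannot spawn $cmd: $!\n";
    $expect->expect($timeout,
        [ qr/password:/i, sub { my $self = shift; $self->send("$password\n"); exp_continue; } ],
    );
    $expect->soft_close();
}

my $pm = Parallel::ForkManager->new(10);   # up to 10 children at once
for my $i (1 .. 50) {
    $pm->start and next;                   # parent: move on to the next iteration
    hostuser_expect("Command which needs to be run");   # placeholder from the question
    $pm->finish;                           # child exits here
}
$pm->wait_all_children;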

Track and kill a process on timeout using Perl script

I want to write a Perl script which can monitor a running process. If the process executes for more than the expected time, then it should be killed.
I am trying to do this on a Linux machine (Linux_x8664).
I cannot achieve this using a cronjob, because I want to embed it in another Perl script which I have been using for a long time.
If you have any suggestions, please share them.
I have code to do that, but the problem is that my Perl script runs a process using the system command, and I want to track the pid of that invoked process and kill it on timeout.
=========================
#!/usr/pde/bin/perl
my $pid;
my $finish = 0;

# actions after timeout to keep SIGHANDLER short
#
sub timeout {
    print "Timed out pid $pid\n";
    # kill the process group, but not the parent process
    local $SIG{INT} = 'IGNORE';
    local $SIG{TERM} = 'IGNORE';
    kill 'INT' = -$$;
    # eventually try also with TERM and KILL if necessary
    die 'alarm';
}

eval {
    local $SIG{ALRM} = sub { $finish = 1 };
    alarm 5;
    die "Can't fork!" unless defined ($pid = fork); # check also this!
    if ($pid) { # parent
        warn "child pid: $pid\n";
        # Here's the code that checks for the timeout and does the work:
        while (1) {
            $finish and timeout() and last;
            sleep 1;
        }
        waitpid ($pid, 0);
    }
    else { # child
        exec (q[perl -e 'while (1) {print 1}' tee test.txt]);
        exit; # the child shouldn't execute code hereafter
    }
    alarm 0;
};
warn "\$@=$@\n";
die "Timeout Exit\n" if $@ and $@ =~ /alarm/;
print "Exited normally.\n";
__END__
Based on your code - there is a reason why use strict and use warnings are strongly recommended.
Specifically:
Can't modify constant item in scalar assignment at line 17, near "$$;"
You aren't doing what you think you're doing there.
If you set it to
kill ( 'INT', -$$ );
Then you will send a SIGINT to the current process group - parent and child. I'm not sure why you're doing this when you don't want to kill the parent.
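If the goal is to signal only the child and whatever it spawned, one option is to put the child into its own process group right after the fork and then signal that group from the parent. A rough sketch, not taken from the original code:

if ($pid == 0) {                 # child
    setpgrp(0, 0);               # become leader of a new process group
    exec q[perl -e 'while (1) {print 1}'];
}

# later, in the parent, on timeout:
kill 'INT', -$pid;               # note -$pid (the child's group), not -$$ (our own group)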
I'd suggest you can simplify this greatly by:
else { # child
    alarm 5;
    exec (q[perl -e 'while (1) {print 1}' tee test.txt]);
    exit; # the child shouldn't execute code hereafter
}
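Put together, a minimal sketch of the whole thing under that suggestion, using the same test one-liner as the command and a 5-second limit:

#!/usr/bin/perl
use strict;
use warnings;

my $pid = fork();
die "Can't fork: $!" unless defined $pid;

if ($pid == 0) {                    # child
    alarm 5;                        # the alarm survives exec, so the command gets SIGALRM after 5s
    exec q[perl -e 'while (1) {print 1}'];
    die "exec failed: $!";          # only reached if exec itself fails
}

# parent
warn "child pid: $pid\n";
waitpid($pid, 0);
if ($? & 127) {
    warn "child killed by signal ", ($? & 127), "\n";
}
else {
    print "child exited with status ", ($? >> 8), "\n";
}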

Perl: Run multiple system commands at once

In perl, I have some code like
my $enter = `curl -s -m 10 http://URL`;
How would I use threading to run this function 10 times at once?
I found this, but I am not sure how to use it to set a specific number of threads.
Edit: I guess I misunderstood what Threads::Queue was doing. My original question about running multiple commands at once still stands.
You can use fork(). In this example, I use the Parallel::ForkManager module. $max_forks is the number of processes to run simultaneously (set to two for an example), and you'd put your system/curl code after ### add curl logic here, and remove the print() and sleep() example statements from there as well.
#!/usr/bin/perl
use warnings;
use strict;
use Parallel::ForkManager;

my $max_forks = 2;
my $fork = Parallel::ForkManager->new($max_forks);

my @urls = (
    'http://perlmonks.org',
    'http://stackoverflow.com',
    'http://slashdot.org',
    'http://wired.com',
);

# on start callback
$fork->run_on_start(
    sub {
        my $pid = shift;
        print "Starting PID $pid\n";
    }
);

# on finish callback
$fork->run_on_finish(
    sub {
        my ($pid, $exit, $ident, $signal, $core) = @_;
        if ($core) {
            print "PID $pid core dumped.\n";
        }
        else {
            print "PID $pid exited with exit code $exit " .
                  "and signal $signal\n";
        }
    }
);

# forking code
for my $url (@urls) {
    $fork->start and next;
    ### add curl logic here
    print "$url\n";
    sleep(2);
    $fork->finish;
}

$fork->wait_all_children;
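If the parent also needs the output of each curl, Parallel::ForkManager can pass a reference from the child back through finish(); it arrives as the sixth argument of the run_on_finish callback. A small sketch (the %results hash and the two URLs are just for illustration):

use warnings;
use strict;
use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(2);
my %results;

$pm->run_on_finish(
    sub {
        my ($pid, $exit, $ident, $signal, $core, $data) = @_;
        $results{$ident} = $$data if defined $data;   # $data is the reference the child passed
    }
);

for my $url ('http://perlmonks.org', 'http://stackoverflow.com') {
    $pm->start($url) and next;         # use the URL as the child's ident
    my $page = `curl -s -m 10 $url`;
    $pm->finish(0, \$page);            # ship the fetched page back to the parent
}
$pm->wait_all_children;

print "$_: ", length($results{$_}), " bytes\n" for keys %results;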

Standalone child in backtick command

Here is a main script that executes the Perl script "fork.pl":
#!/bin/bash
OUTPUT=`./fork.pl`
echo "$OUTPUT"
And the fork.pl:
#!/usr/bin/perl
use strict;
use warnings;
use POSIX;
my $pid = fork();
if ($pid == 0) {
    sleep(5);
    print("child: $pid\n");
}
else {
    print("parent: $pid\n");
}
The backtick implies a wait, but I would like to not wait for the last child.
thanks
One of the ways not to wait for the termination is to start the script in the background while redirecting the output to a file, then read the lines with the shell's read.
For example, a hack to read the first line:
./fork.pl > temp.out &
sleep 1
read OUTPUT < temp.out
Alternatively, without sleep, but limited to a do/done block:
./fork.pl | while read OUTPUT; do
# use $OUTPUT here
break # first line only, or loop conditionally
done
It needs to detach from the parent and redirect the input/output:
if ($pid == 0) {
    my $mysid = setsid();
    open (STDIN, "</dev/null");
    open (STDOUT, ">/dev/null");
    open (STDERR, ">&STDOUT");
    sleep(5);
    print("child: $pid\n");
}
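A sketch of fork.pl with that change applied, so the backtick in the calling shell script returns as soon as the parent exits (anything the detached child prints is no longer captured):

#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {                       # child: detach, so the caller does not wait for it
    setsid() or die "setsid failed: $!";
    open STDIN,  '<', '/dev/null' or die $!;
    open STDOUT, '>', '/dev/null' or die $!;   # releases the pipe the backtick is reading
    open STDERR, '>&', \*STDOUT   or die $!;
    sleep(5);
    exit 0;                            # do the real work here instead of sleeping
}

print "parent: $pid\n";                # this line is all the backtick captures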

A way to parse terminal output / input ? (.bashrc ?)

Is there a way to parse input and output from bash commands in an interactive terminal before they reach the screen? I was thinking maybe something in .bashrc, but I'm new to using bash.
For example:
I type "ls /home/foo/bar/"
That gets passed through a script that replaces all instances of 'bar' with 'eggs'
"ls /home/foo/eggs/" gets executed
The output gets sent back to the replace script
The output of the script is sent to the screen
Yes. Here's something I wrote for my own use, to wrap old command-line Fortran programs that ask for file paths. It allows escaping back to the shell, e.g. for running ls. It only works one way, i.e. it intercepts user input and then passes it on to a program, but that gets you most of what you want. You can adapt it to your needs.
#!/usr/bin/perl
# shwrap.pl - Wrap any process for convenient escape to the shell.
# ire_and_curses, September 2006

use strict;
use warnings;

# Check args
my $executable = shift || die "Usage: shwrap.pl executable";
my @escape_chars = ('#');        # Escape to shell with these chars
my $exit = 'exit';               # Exit string for quick termination

open my $exe_fh, "|$executable @ARGV" or die "Cannot pipe to program $executable: $!";

# Set magic buffer autoflush on...
select((select($exe_fh), $| = 1)[0]);

# Accept input until the child process terminates or is terminated...
while ( 1 ) {
    chomp(my $input = <STDIN>);

    # End if we receive the special exit string...
    if ( $input =~ m/$exit/ ) {
        close $exe_fh;
        print "$0: Terminated child process...\n";
        exit;
    }

    foreach my $char ( @escape_chars ) {
        # Escape to the shell if the input starts with an escape character...
        if ( my ($command) = $input =~ m/^$char(.*)/ ) {
            system $command;
        }
        # Otherwise pass the input on to the executable...
        else {
            print $exe_fh "$input\n";
        }
    }
}
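For reference, it is run as ./shwrap.pl some_program (plus any arguments for that program): a line starting with # is executed in the shell (for example #ls), a line matching the exit string closes the pipe and ends the wrapper, and anything else is forwarded to the wrapped program's standard input.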
