Pass an indicator from Bash back to Perl over SSH via STDIN - linux

We have a Linux server which can run a diagnostic script, diag.pl, which coordinates reporting over other servers.
diag.pl iterates over the child servers, and for each of them, SSHs in and runs a bash script, which passes information back:
my $cmd = sprintf("(ssh %s sudo /usr/lib/support/report.sh -e %s | uudecode -o \"%s-outfile.tgz\") 2>&1 |", $server, $specialparam, $servername);
The line of code in report.sh that sends the data back is:
uuencode --base64 ${REPORT}.tar.gz /dev/stdout
I would like to update report.sh to send back an additional line of information, something like:
echo "special-file-found=${SFF}" > /tmp/sff.cfg
uuencode --base64 /tmp/sff.cfg /dev/stdout
Once the special file has been found, the Perl script will update so that it no longer sends the specialparam back to subsequent report.sh calls.
Is there a good way to send that input so that it will be easy for Perl to catch it?
What I have tried
Setting a user.comment attribute on the tar.gz using setfattr, but the comment does not survive the uuencoding.
Currently thinking that my best bet is to use the pseudocode above, creating a new file to encode and send along, and update the Perl script to check it with each new transmission until it finds the special file.

I take it that the objective is to modify a shell script which returns to the caller an encoded file, so that it sends yet more information, specifically a string to be used as a flag in the caller.
It is not clear how the shell script is run from the Perl script, but there are ways to do this so that the caller gets back separate "lines" that are printed, either as they are emitted or altogether after the run completes.
Then you can just add to the shell script the needed extra print to STDOUT, and in the caller check each line of shell output to see whether it conforms to some "protocol"; for example, whether it is, or starts with, the special-file-found string. Then you can set flags for further calls, write a control file for following runs, etc. Otherwise, the line is the encoded file.
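On the shell side that can be as simple as printing the flag line ahead of the encoded payload; a sketch using the variables from the question:
echo "special-file-found=${SFF}"
uuencode --base64 ${REPORT}.tar.gz /dev/stdout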
A made-up basic example using pipe-open (see toward the end of that page):
use warnings;
use strict;
use feature 'say';
my @cmd = qw(ls -l ./);
my $file_found = quotemeta 'special-file-found';
my ($flag, $binfile);
my $pid = open(my $out, '-|', @cmd) // die "Can't open @cmd: $!";
while (<$out>) {
    chomp;
    if (/^$file_found/) {
        $flag = 1;
    }
    else {
        $binfile = $_;
        # whatever else need be done, or perhaps last;
    }
}
close $out;
This example runs the command ls -l ./, but instead of it you can run any executable, like @cmd = ('report.sh', 'arg1', 'arg2', ...).
Another way is to use backticks (qx) and assign its return to an array, in which case each element receives a line of output.
Yet another, better, way is to use a module which manages external commands. For example, from simple to more capable: IPC::System::Simple, Capture::Tiny, IPC::Run3, IPC::Run.
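For instance, here is a minimal sketch using IPC::Run3; the command name and its arguments are placeholders, not the real report.sh invocation:
use warnings;
use strict;
use IPC::Run3;

my @cmd = ('./report.sh', '-e', 'param');   # hypothetical command and arguments
run3 \@cmd, \undef, \my @lines, \my $err;   # each line of STDOUT lands in @lines
die "command failed: $err" if $?;

for (@lines) {
    chomp;
    # apply the same "protocol" check as in the pipe-open example above
}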

Related

How to understand and avoid non-interactive mode errors when running ispell from script?

Background
Ispell is a basic command-line spelling program in Linux, which I want to call for a previously collected list of file names. These file names are recursively collected from a LaTeX root file, for example. This is useful when you need to spell-check all recursively included LaTeX files, and no other files. However, calling ispell from the command line turns out to be non-trivial, as ispell gives errors of the form
"Can't deal with non-interactive use yet." in some cases.
(As a side note, ideally I would like to call ispell programmatically from Java using the ProcessBuilder class, and without requiring bash. The same error seems to pester this approach, however.)
Question
Why is it that ispell gives the error "Can't deal with non-interactive use yet." in certain cases, when called in bash from a loop involving the read builtin, but not in other cases, as shown in the code example below?
The below minimal code example creates two small files
(testFileOne.txt, testFileTwo.txt) and a file containing the paths of the two created files (testFilesListTemp.txt).
Next, ispell is called for testFilesListTemp.txt in three different ways:
1. With the help of "cat"
2. By first collecting the names as a string, then looping over the substrings in the collected string, and calling ispell for each of them.
3. By looping over the contents of testFilesListTemp.txt directly, and
calling ispell for the extracted paths.
For some reason the third method does not work, and yields the error
"Can't deal with non-interactive use yet." Why exactly does this error
occur, how can it be prevented, and/or is there perhaps another variation
of the third approach that would work without errors?
#!/bin/bash
#ispell ./testFiles/ispellTestFile1.txt
# Creating two small files and a file with file paths for testing
printf "file 1 contents" > testFileOne.txt
printf "file 2 contents. With a spelling eeeeror." > testFileTwo.txt
printf "./testFileOne.txt\n./testFileTwo.txt\n" > testFilesListTemp.txt
COLLECTED_LATEX_FILE_NAMES_FILE=testFilesListTemp.txt
# Approach 1: produce list of file names with cat and
# pass as argument to ispell
# WORKS
ispell $(cat $COLLECTED_LATEX_FILE_NAMES_FILE)
# Second approach, first collecting file names as long string,
# then looping over substrings and calling ispell for each one of them
FILES=""
while read p; do
    echo "read file $p"
    FILES="$FILES $p"
done < $COLLECTED_LATEX_FILE_NAMES_FILE
printf "files list: $FILES\n"
for latexName in $FILES; do
    echo "filename: $latexName"
    ispell $latexName
done
# Third approach, not working
# ispell complains in this case about not working in non-interactive
# mode: "Can't deal with non-interactive use yet."
while read p; do
    ispell "$p"
done < $COLLECTED_LATEX_FILE_NAMES_FILE
The third example does not work because you redirect standard input. ispell needs a terminal and user interaction. When you write code like this:
while read p; do
    ispell "$p"
done < $COLLECTED_LATEX_FILE_NAMES_FILE
everything that is read from standard input by any program within the loop will be taken from the $COLLECTED_LATEX_FILE_NAMES_FILE file. ispell detects that and refuses to operate. However, you can use descriptor redirection to make read p read from the file, and ispell "$p" read from the "real" terminal. Just do:
exec 3<&0
while read p; do
    ispell "$p" 0<&3
done < $COLLECTED_LATEX_FILE_NAMES_FILE
exec 3<&0 "copies" (saves) your standard input (0, the "terminal") to descriptor 3. And later on you redirect standard input (0) to ispell from that descriptor, by typing 0<&3 (you can omit 0 if you like).

Perl set and get env in different bash script

I have created a Perl script which invokes two bash scripts. The first script sets an environment variable and the second echoes the environment variable. I have given the contents of the files below.
# perlscript.pl
print `. setnameenv.sh`;
print `. getnameenv.sh`;
# setnameenv.sh
export my_msg='hello world!'
# getnameenv.sh
echo $my_msg
Now when I run the Perl script (perl perlscript.pl) I expect 'hello world!' to be printed on the screen, but I don't see any output. Is there any way to do this without modifying the bash scripts?
You can embed Perl into a bash script:
#!/bin/bash
. setnameenv.sh
exec perl -x "$0" "$@"
#!perl
# your script below
print `. getnameenv.sh`;
From perldoc perlrun:
-x
-xdirectory
tells Perl that the program is embedded in a larger chunk of unrelated text, such as in a mail message. Leading garbage will be discarded until the first line that starts with #! and contains the string "perl". Any meaningful switches on that line will be applied.
You spawn a shell, execute some commands to change its environment, then exit the shell. You never used the environment variable you created before exiting the shell. If you want Perl to see it, you're going to have to launch Perl from that shell.
. setnameenv.sh ; perlscript.pl
If you can't change how perlscript.pl is launched, you have a couple of options, none of which are that friendly. One of the options is to bootstrap.
BEGIN {
    if (!length($ENV{my_msg})) {
        require String::ShellQuote;
        my $cmd = join(' ; ',
            '. setnameenv.sh',
            String::ShellQuote::shell_quote($^X, $0, @ARGV),
        );
        exec($cmd)
            or die $!;
    }
}
This can now be done in Perl with the Env::Modify module.
use Env::Modify qw(source);
source("setnameenv.sh");
# env settings from setnameenv.sh are now available to Perl
# and to the following system call
print `. getnameenv.sh`; # or source again, like source("getnameenv.sh")
A child process inherits the parent's environment but cannot change it. Likewise, the parent has no access to the child's environment. Hence, to catch the child's environment in the parent, the child should print the values, as shown in the code below. The code sets already-existing environment variables as well, but this can be optimized.
# perlscript.pl
my $env_str = `. setnameenv.sh; env`;
my @env_list = split "\n", $env_str;
foreach (@env_list)
{
    /([\w_]+)=(.*)/;
    $ENV{$1} = $2;
}
print `. getnameenv.sh`;
Find the actual explanation in this SO answer.
Variables are only exported to child processes.
You cannot export variables back to the parent process.
You'll need another way to transport variables back to the parent or to sibling processes.
For example, here is an example where all exported variables are saved to, and read back from, a file:
#!/bin/dash
# setnameenv.sh
export my_msg='hello world!'
export > savedVariables.sh
and
#!/bin/dash
# getnameenv.sh
. ./savedVariables.sh
echo "$my_msg"
Note: this works with dash; bash generates at least one line that it cannot read back.

Unix: What does cat by itself do?

I saw the line data=$(cat) in a bash script (seemingly just declaring an empty variable) and am mystified as to what that could possibly do.
I read the man page, but it doesn't have an example or explanation of this. Does this capture stdin or something? Any documentation on this?
EDIT: Specifically, how the heck does doing data=$(cat) allow it to run this hook script?
#!/bin/bash
# Runs all executable pre-commit-* hooks and exits after,
# if any of them was not successful.
#
# Based on
# http://osdir.com/ml/git/2009-01/msg00308.html
data=$(cat)
exitcodes=()
hookname=`basename $0`
# Run each hook, passing through STDIN and storing the exit code.
# We don't want to bail at the first failure, as the user might
# then bypass the hooks without knowing about additional issues.
for hook in $GIT_DIR/hooks/$hookname-*; do
    test -x "$hook" || continue
    echo "$data" | "$hook"
    exitcodes+=($?)
done
https://github.com/henrik/dotfiles/blob/master/git_template/hooks/pre-commit
cat will catenate its input to its output.
In the context of the variable capture you posted, the effect is to assign the statement's (or containing script's) standard input to the variable.
The command substitution $(command) will return the command's output; the assignment will assign the substituted string to the variable; and in the absence of a file name argument, cat will read and print standard input.
The Git hook script you found this in captures the commit data from standard input so that it can be repeatedly piped to each hook script separately. You only get one copy of standard input, so if you need it multiple times, you need to capture it somehow. (I would use a temporary file, and quote all file name variables properly; but keeping the data in a variable is certainly okay, especially if you only expect fairly small amounts of input.)
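For example, a minimal sketch of that temporary-file variant, adapted from the hook above (untested; mktemp and trap are standard tools):
data_file=$(mktemp) || exit 1
trap 'rm -f "$data_file"' EXIT
cat > "$data_file"                  # save standard input once

for hook in "$GIT_DIR/hooks/$hookname"-*; do
    test -x "$hook" || continue
    "$hook" < "$data_file"          # replay the saved input for each hook
done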
Doing:
t#t:~# temp=$(cat)
hello how
are you?
t#t:~# echo $temp
hello how are you?
(A single Control-D on a line by itself following "are you?" terminates the input.)
As the manual says:
cat - concatenate files and print on the standard output
Also
cat Copy standard input to standard output.
Here, cat will collect your STDIN into a single string, and the assignment stores it in the variable temp.
Say your bash script script.sh is:
#!/bin/bash
data=$(cat)
Then, the following commands will store the string STR in the variable data:
echo STR | bash script.sh
bash script.sh < <(echo STR)
bash script.sh <<< STR

Executing a bash script from a Perl program

I'm trying to write a Perl program which will execute a bash script. The Perl script looks like this
#!/usr/bin/perl
use diagnostics;
use warnings;
require 'userlib.pl';
use CGI qw(:standard);
ReadParse();
my $q = new CGI;
my $dir = $q->param('X');
my $s = $q->param('Y');
ui_print_header(undef, $text{'edit_title'}.$dir, "");
print $dir."<br>";
print $s."<br>";
print "Under Construction <br>";
use Cwd;
my $pwd = cwd();
my $directory = "/Logs/".$dir."/logmanager/".$s;
my $command = $pwd."/script ".$directory."/".$s.".tar";
print $command."<br>";
print $pwd."<br>";
chdir($directory);
my $pwd1 = cwd();
print $pwd1."<br>";
system($command, $directory) or die "Cannot open Dir: $!";
The script fails with the following error:
Can't exec "/usr/libexec/webmin/foobar/script
/path/filename.tar": No such file or directory at /usr/libexec/webmin/foobar/program.cgi line 23 (#3)
(W exec) A system(), exec(), or piped open call could not execute the
named program for the indicated reason. Typical reasons include: the
permissions were wrong on the file, the file wasn't found in
$ENV{PATH}, the executable in question was compiled for another
architecture, or the #! line in a script points to an interpreter that
can't be run for similar reasons. (Or maybe your system doesn't support #! at all.)
I've checked that the permissions are correct, that the tar file I'm passing to my bash script exists, and I have also tried running the same command from the command line (/usr/libexec/webmin/foobar/script /path/filename.tar); it works properly.
In Perl, calling system with one argument (in scalar context) and calling it with several scalar arguments (in list context) do different things.
In scalar context, calling
system($command)
will start an external shell and execute $command in it. If the string in $command has arguments, they will be passed to the call, too. So for example
$command="ls /";
system($commmand);
will evaluate to
sh -c "ls /"
where the shell is given the entire string, i.e. the command with all arguments. Also, the $command will run with all the normal environment variables set. This can be a security issue, see here and here for a few examples why.
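A made-up illustration of the risk; the value of $name stands in for untrusted input:
my $name = 'foo; echo INJECTED';   # imagine this came from user input
system("echo $name");              # shell runs: echo foo; echo INJECTED
system('echo', $name);             # list form prints the literal string instead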
On the other hand, if you call system with an array (in list context), Perl will not call a shell and give it the $command as argument, but rather try to execute the first element of the array directly and give it the other arguments as parameters. So
$command = "ls";
$directory = "/";
system($command, $directory);
will call ls directly, without spawning a shell in between.
Back to your question: your code says
my $command = $pwd."/script ".$directory."/".$s.".tar";
system($command, $directory) or die "Cannot open Dir: $!";
Note that $command here is something like /path/to/script /path/to/foo.tar, with the argument already being part of the string. If you call this in scalar context
system($command)
all will work fine, because
sh -c "/path/to/script /path/to/foo.tar"
will execute script with foo.tar as argument. But if you call it in list context, it will try to locate an executable named /path/to/script /path/to/foo.tar, and this will fail.
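To keep the safer list form, pass the script and its argument as separate elements; a sketch using the names from the question:
my $script  = $pwd . "/script";
my $tarfile = $directory . "/" . $s . ".tar";
system($script, $tarfile) == 0
    or die "script failed: $?";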
I found the problem.
I changed the system command, removing the second parameter, and now it's working:
system($command) or die "Cannot open Dir: $!";
In fairness, I did not understand what was wrong with the first example, but it works fine now; if anyone can explain, it would be interesting to understand.
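One caution worth adding here: Perl's system returns 0 on success and non-zero on failure, so system($command) or die ... actually dies when the command succeeds. The conventional check inverts the test:
system($command) == 0
    or die "Command failed: $?";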
There are multiple ways to execute bash commands/scripts in Perl (a quick sketch of each follows):
system
backquote (qx)
exec
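A quick, hypothetical illustration of each:
system('bash', 'myscript.sh');     # run it and wait; returns the exit status
my $out = `bash myscript.sh`;      # backquotes (qx) capture STDOUT as a string
exec('bash', 'myscript.sh');       # replace the Perl process; never returns on success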

Reading the path of files as string in shell script

My Aim -->
Files Listing from a command has to be read line by line and be used as part of another command.
Description -->
A command in linux returns
archive/Crow.java
archive/Kaka.java
mypmdhook.sh
which is stored in the changed_files variable. I use the following while loop to read the files line by line and use each one as part of a pmd command:
while read each_file
do
    echo "Inside Loop -- $each_file"
done<$changed_files
I am new to writing shell scripts, but my assumption was that the lines would be separated in the loop and printed in each iteration; instead I get the following error --
mypmdhook.sh: 7: mypmdhook.sh: cannot open archive/Crow.java
archive/Kaka.java
mypmdhook.sh: No such file
Can you tell me how I can just get the value as a string (and later use it inside a command), and not as a file to be opened? By the way, the file does exist, which made me feel even more confused. I'd be happy with any kind of answer that helps me understand and resolve this issue.
Since you have data stored in a variable, use a "here string" instead of file redirection:
changed_files="archive/Crow.java
archive/Kaka.java
mypmdhook.sh"
while read each_file
do
    echo "Inside Loop -- $each_file"
done <<< "$changed_files"
Inside Loop -- archive/Crow.java
Inside Loop -- archive/Kaka.java
Inside Loop -- mypmdhook.sh
It is extremely important to quote "$changed_files" in order to preserve the newlines, so the while-read loop works as you expect. A rule of thumb: always quote variables, unless you know exactly why you want to leave the quotes off.
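A quick demonstration of the difference, with a made-up two-line value:
var=$'one\ntwo'
echo $var      # word splitting collapses the newline: one two
echo "$var"    # prints the two lines intact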
What happens here is that the value of your variable $changed_files is substituted into your command, and you get something like
while read each_file
do
    echo "Inside Loop -- $each_file"
done < archive/Crow.java
archive/Kaka.java
mypmdhook.sh
then the shell tries to open the file for redirecting the input and obviously fails.
The point is that redirections (e.g. <, >, >>) in most cases accept filenames, but what you really need is to give the contents of the variable to the stdin. The most obvious way to do that is
echo "$changed_files" | while read each_file; do echo "Inside Loop -- $each_file"; done
You can also use the for loop instead of while read:
for each_file in $changed_files; do echo "inside Loop -- $each_file"; done
I prefer using while read ... if there is a chance that some filename may contain spaces, but in most cases for ... in will work for you.
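For example, with a hypothetical filename containing a space:
files=$'my file.java\nother.java'
for f in $files; do echo "for: $f"; done                 # splits "my file.java" into two words
while read -r f; do echo "read: $f"; done <<< "$files"   # keeps each line whole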
Rather than storing the command's output in a variable, use a while loop like this:
mycommand | while read -r each_file; do echo "Inside Loop -- $each_file"; done
If you're using BASH you can use process substitution:
while read -r each_file; do echo "Inside Loop -- $each_file"; done < <(mycommand)
By the way, your attempt done<$changed_files assumes that $changed_files holds the name of a file.
