Perl shell command variable error - Linux

I am trying the following code in one of my Perl scripts and getting an error. How do I execute the following shell command and store its output in a variable?
#!/usr/bin/perl -w
my $p = $( PROCS=`echo /proc/[0-9]*|wc -w|tr -d ' '`; read L1 L2 L3 DUMMY < /proc/loadavg ; echo ${L1}:${L2}:${L3}:${PROCS} );
print $p;
Error:
./foo.pl
Bareword found where operator expected at /tmp/foo.pl line 3, near "$( PROCS"
(Missing operator before PROCS?)
syntax error at /tmp/foo.pl line 3, near "$( PROCS"
Unterminated <> operator at /tmp/foo.pl line 3.
What is wrong?

This:
my $p = $( PROCS=`echo /proc/[0-9]*|wc -w|tr -d ' '`; read L1 L2 L3 DUMMY < /proc/loadavg ; echo ${L1}:${L2}:${L3}:${PROCS} );
Isn't perl. It's how you'd execute a command in bash.
To run a command in Perl you can:
use system,
put your command in backticks, or
use qx// (quote-execute): http://perldoc.perl.org/perlop.html#Quote-Like-Operators (see the sketch below).
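A minimal sketch of all three, using uptime purely as a stand-in command:
#!/usr/bin/perl
use strict;
use warnings;

# system() runs the command and returns its exit status; output goes straight to the terminal
system('uptime') == 0 or warn "uptime failed: $?";

# backticks capture STDOUT as a string (a list of lines in list context)
my $out1 = `uptime`;

# qx// is the same operator as backticks, just with a choosable delimiter
my $out2 = qx{uptime};

print "backticks: $out1";
print "qx:        $out2";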
However, you're enumerating a directory there, wordcounting, tr-ing and reading. So you don't actually need to do all that using a shell command. And indeed, I'd discourage you from doing so, because that's just a way to make a mess with no productive benefit.
Looks like what you're after as an end result is the three load average samples and a count of the number of processes. Is that right?
In which case:
my $proc_count = scalar ( () = glob ( "/proc/[0-9]*" ));
open ( my $la, "<", "/proc/loadavg" ) or warn $!;
print join ( ":", split ( /\s+/, <$la> ), $proc_count ),"\n";
Something like that, anyway.

Simply printing a shell command in your Perl script won't actually execute it. You have to tell Perl that it's an external command, which you can do with system:
use strict;
use warnings;
my $command = q{
PROCS=`echo /proc/[0-9]*|wc -w|tr -d ' '`;
read L1 L2 L3 DUMMY < /proc/loadavg;
echo ${L1}:${L2}:${L3}:${PROCS}
};
system($command);
(Note that you should put use strict; use warnings; at the top of every Perl script you write.)
However, it's generally better to use native Perl functionality instead of system. All you're doing is reading from files, which Perl is perfectly capable of doing:
use strict;
use warnings;
use 5.010;
my @procs = glob '/proc/[0-9]*';
my $file = '/proc/loadavg';
open my $fh, '<', $file or die "Failed to open '$file': $!";
my $load = <$fh>;
say(join ':', (split ' ', $load)[0..2], scalar @procs);
Even better might be to use the Proc::ProcessTable module, which provides a consistent interface to the /proc filesystem across different flavors of *nix. It got some bad reviews early on but is supposedly getting bugfixes now; I haven't used it myself but you might take a look.
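For what it's worth, a minimal sketch of what that might look like; the pid and fname accessors used here follow the module's documentation, but treat this as untested:
use strict;
use warnings;
use Proc::ProcessTable;

my $table = Proc::ProcessTable->new;
my $procs = $table->table;                    # arrayref of process objects
print "Process count: ", scalar @$procs, "\n";
for my $p (@$procs) {
    printf "%6d  %s\n", $p->pid, $p->fname;   # PID and executable name
}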

Related

Redirect STDERR in OPEN pipe command. Perl Linux

The version of Perl I'm restricted to is v5.10.0.
I have a script, "script1" that needs to call another script "script2."
In order to bypass the shell and avoid command line injection I am using "open" to do this in perl.
open(my $fh, "-|", "/path/to/script2", "-n", "$param") or say "failure $#";
I can not modify script2.
I need to get the output of the stderr from script2 into a variable in script1.
I got the "-|" syntax from https://perldoc.perl.org/functions/open.html but the description there is mostly an example of how to do it and not what it is doing so I can't figure out how to redirect err to out or even at least redirect err to some other variable.
I'm hoping for something that will look similar to one of these two options:
# option 1)
open(my $fh, "-|", "/path/to/script2", "-n", "$param") or say "failure $#";
while($line = <$fh>) { # stderr and stdout are both fed to $line
print $line;
}
# option 2)
open(STDERR, "+<", $stderr);
open(my $fh, "-|", "/path/to/script2", "-n", "$param") or say "failure $#";
while($line = <$fh>) { #stdout is fed to $line
print $line;
}
while($line = <$stderr>) { #stderr from script2 is fed to $line
print $line;
}
I would do this using IPC::Run3:
use strict;
use warnings;
use IPC::Run3;
run3 ['script2.pl', '-n', $param], undef, \my $out, \my $err;
This will give you the STDERR of script2.pl in the variable $err.
From the documentation:
run3($cmd, $stdin, $stdout, $stderr, \%options)
All parameters after $cmd are optional.
The parameters $stdin, $stdout and $stderr indicate how the child's
corresponding filehandle (STDIN, STDOUT and STDERR, resp.) will be
redirected. Because the redirects come last, this allows STDOUT and
STDERR to default to the parent's by just not specifying them -- a
common use case.
$cmd
Usually $cmd will be an ARRAY reference and the child is invoked via
system @$cmd;
Note that passing $cmd as an array reference will avoid running the command through the shell under the same conditions as for the system() command
If the objective is to capture streams, one nice module is Capture::Tiny.
use warnings;
use strict;
use Capture::Tiny qw(capture);
my $cmd = '/path/to/script2';
my @args = ('-n', $param);
my ($stdout, $stderr, $exit) = capture {
system($cmd, @args);
};
You can put the command and arguments in a list, @cmd = ('ls', '-l', './'). If the command and arguments are lumped into a scalar the shell may get used (if there are shell metacharacters).
This portable module allows you to
capture almost anything sent to STDOUT or STDERR, regardless of whether it comes from Perl, from XS code or from an external program.
There is also capture_stderr function if you only want that stream.
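For instance, a minimal sketch if only that stream is wanted (reusing the $cmd and @args from above):
use Capture::Tiny qw(capture_stderr);

# Only STDERR is captured; STDOUT still goes wherever it normally would
my $stderr = capture_stderr {
    system($cmd, @args);
};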
Note that many tools, including system, have a provision for bypassing the shell: pass command and/or arguments as a list. (However, if you need STDERR you cannot get it directly with system alone, without using shell redirection.)
If there is a possibility that @args would be empty this should be invoked as
system ( {$cmd} $cmd, @args );
and the shell is still avoided. A nice way to try it out is with $cmd = 'echo "From shell"'. Thanks to ikegami for noting all this in a comment.
See exec for this use of the indirect object notation, which
forces interpretation of the LIST as a multivalued list, even if there is only a single scalar in the list
and this ensures that the shell is never called.
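For example, a quick sketch of the 'echo' test mentioned above, showing both behaviours side by side:
use strict;
use warnings;

my $cmd  = 'echo "From shell"';
my @args = ();

# Plain scalar form: the metacharacters make Perl hand this to /bin/sh,
# so it prints: From shell
system($cmd);

# Indirect-object form: no shell is involved, so Perl tries to exec a
# program literally named 'echo "From shell"', which fails
system( {$cmd} $cmd, @args ) == 0
    or warn "no shell used, exec failed as expected: $!";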

How to use `diff` on files whose paths contain whitespace

I am trying to find the differences between files, but the filename and directory name contain white space. I am trying to execute the command in a Perl script.
diff /home/users/feroz/logs/back_up20161112/Security File/General Security.csv /home/users/feroz/logs/back_up20161113/Security File/General Security.csv
Perl
open( my $FH, '>', $logfile ) or die "Cannot open the file '$logfile' $!";
foreach $filename ( keys %filenames ) {
$old_file = $parent_directory . $previous_date . $search_directory . "$filenames{$filename}";
$new_file = $parent_directory . $current_date . $search_directory . "$filenames{$filename}";
if ( !-e $old_file ) {
#print ("\nFile does not exist in previos date backup");
print $FH "\nERROR:'$old_file' ---- does not exist in the backup directory ";
}
elsif ( !-e $new_file ) {
#print ("\n The file does not exist in current directory");
print $FH "\nERROR:'$new_file' --- does not exist in the present directory ";
}
else {
# print $FH "\nDifference between the files $filenames{$filename} of $previous_date and $current_date ";
my $cmd = 'diff $old_file $new_file| xargs -0';
open( my $OH, '|-', $cmd ) or die "Failed to read the output";
while ( <$OH> ) {
print $FH "$_";
}
close $OH;
}
}
To be absolutely safe, use String::ShellQuote:
use String::ShellQuote;
my $old_file2 = shell_quote($old_file);
my $new_file2 = shell_quote($new_file);
`diff $old_file2 $new_file2`;
Thank you for showing your Perl code.
Single quotes don't interpolate, so that will pass the literal strings $old_file and $new_file to the command instead of those variables' contents. The shell will then try to interpret them as shell variables.
I suggest that you write this instead
my $cmd = qq{diff '$old_file' '$new_file' | xargs -0};
open( my $OH, '-|', $cmd ) or die "Failed to read the output";
That will use double quotes (qq{...}) around the command string so that the variables are interpolated. The file paths have single quotes around them to indicate that the shell should treat them as individual strings
This won't work if there's a chance that your file paths could contain a single quote, but that's highly unusual
Pass arguments out-of-band to avoid the need to shell-quote them, rather than interpolating them into a string which is parsed by a shell as a script. Substituting filenames as literal text into a script generates exposure to shell injection attacks -- the shell-scripting equivalent to the family of database security bugs known as SQL injection.
Without Any Shell At All
The pipe to xargs -0 appears to be serving no purpose here. Eliminating it allows this to be run without any shell involved at all:
open(my $fh, "-|", "diff", $old_file, $new_file)
With Shell Arguments Passed Out-Of-Band From Script Text
If you really do want the shell to be invoked, the safe thing to do is to keep the script text an audited constant, and have it retrieve arguments from either the argv list passed to the shell or the environment.
# Putting $1 and $2 in double quotes ensures that the shell treats contents as literal
# the "_" is used for $0 in the shell.
my $shell_script = 'diff "$1" "$2" | xargs -0';
open(my $fh, "-|",
"sh", "-c", $shell_script,
"_", $old_file, $new_file);
You can either
Put the whitespace path segments inside quotes
diff /home/users/feroz/logs/back_up20161112/"Security File"/"General Security.csv" /home/users/feroz/logs/back_up20161113/"Security File"/"General Security.csv"
or escape the whitespace
diff /home/users/feroz/logs/back_up20161112/Security\ File/General\ Security.csv /home/users/feroz/logs/back_up20161113/Security\ File/General\ Security.csv

Why can't I print a very long string? [closed]

I'm writing a Perl script that searches a kml file and I need to print a very long line of latitude/longitude coordinates. The following script successfully finds the string I'm looking for, but just prints a blank line instead of the value of the string:
#!/usr/bin/perl
# Strips unsupported tags out of a QGIS-generated kml and writes a new one
$file = $ARGV[0];
# read existing kml file
open( INFO, $file ); # Open the file
@lines = <INFO>; # Read it into an array
close(INFO); # Close the file
#print @lines; # Print the array
$x = 0;
$coord_string = "<coordinates>";
# go through each line looking for above string
foreach $line (@lines) {
$x++;
if ( $x > 12 ) {
if ( $line =~ $coord_string ) {
$thisCooordString = $line;
$var_startX = $x;
print "Found coord string: $thisCoordString\n";
print " on line: $var_startX\n";
}
}
}
The file that it's reading is here
and this is the output I get:
-bash-4.3$ perl writekml.pl HUC8short.kml
Found coord string:
on line: 25
Found coord string:
on line: 38
Is there some cap on the maximum length that a string can be in Perl? The longest line in this file is ~151,000 characters long. I've verified that all the lines in the file are read successfully.
You've misspelled the variable name (two os vs three os):
$thisCooordString = $line;
...
print "Found coord string: $thisCoordString\n";
Add use strict and use warnings to your script to prevent these sorts of errors.
Always include use strict and use warnings in EVERY perl script.
If you had done this, you would've gotten the following error message to clue you into your bug:
Global symbol "$thisCoordString" requires explicit package name
Adding these pragmas and simplifying your code results in the following:
#!/usr/bin/env perl
# Strips unsupported tags out of a QGIS-generated kml and writes a new one
use strict;
use warnings;
local @ARGV = 'HUC8short.kml';
while (<>) {
if ( $. > 12 && /<coordinates>/ ) {
print "Found coord string: $_\n";
print " on line: $.\n";
}
}
You can even try with perl one liners as shown below:
Perl One liner on windows command prompt:
perl -lne "if($_ =~ /<coordinates>/is && $. > 12) { print \"Found coord string : $_ \n"; print \" on line : $. \n\";}" HUC8short.kml
Perl One liner on unix prompt:
perl -lne 'if($_ =~ /<coordinates>/is && $. > 12) { print "Found coord string : $_ \n"; print " on line : $. \n";}' HUC8short.kml
As others have pointed out, you need to (no, you MUST) always use use strict; and use warnings;.
If you used strict, you would have gotten an error message telling you that your variable $thisCoordString or $thisCooordString was not declared with my. Using warnings would have warned you that you're printing an undefined string.
Your whole program is written in a very old (and obsolete) Perl programming style. This is the type of program writing I would have done back in Perl 3.0 days about two decades ago. Perl has changed quite a bit since then, and using the newer syntax will allow you to write easier to read and maintain programs.
Here's your basic program written in a more modern syntax:
#! /usr/bin/env perl
#
use strict; # Lets you know when you misspell variable names
use warnings; # Warns of issues (e.g. using undefined variables)
use feature qw(say); # Lets you use 'say' instead of 'print' (no \n needed)
use autodie; # Program automatically dies on bad file operations
use IO::File; # Lots of nice file activity.
# Make Constants constant
use constant {
COORD_STRING => qr/<coordinates>/, # qr is a regular expression quoted string
};
my $file = shift;
# read existing kml file
open my $fh, '<', $file; # Three part open with scalar filehandle
while ( my $line = <$fh> ) {
chomp $line; # Always "chomp" on read
next unless $line =~ COORD_STRING; #Skip non-coord lines
say "Found coord string: $line";
say " on line: " . $fh->input_line_number;
}
close $fh;
Many Perl developers are self taught. There is nothing wrong with that, but many people learn Perl from looking at other people's obsolete code, or from reading old Perl manuals, or from developers who learned Perl from someone else back in the 1990s.
So, get some books on Modern Perl and learn the new syntax. You might also want to learn about things like references which can lead you to learn Object Oriented Perl. References and OO Perl will allow you to write longer and more complex programs.

How to get Perl to loop over all files in a directory?

I have a Perl script with contains
open (FILE, '<', "$ARGV[0]") || die "Unable to open $ARGV[0]\n";
while (defined (my $line = <FILE>)) {
# do stuff
}
close FILE;
and I would like to run this script on all .pp files in a directory, so I have written a wrapper script in Bash
#!/bin/bash
for f in /etc/puppet/nodes/*.pp; do
/etc/puppet/nodes/brackets.pl $f
done
Question
Is it possible to avoid the wrapper script and have the Perl script do it instead?
Yes.
The for f in ...; translates to the Perl
for my $f (...) { ... } (in the case of lists) or
while (my $f = ...) { ... } (in the case of iterators).
The glob expression that you use (/etc/puppet/nodes/*.pp) can be evaluated inside Perl via the glob function: glob '/etc/puppet/nodes/*.pp'.
Together with some style improvements:
use strict; use warnings;
use autodie; # automatic error handling
while (defined(my $file = glob '/etc/puppet/nodes/*.pp')) {
open my $fh, "<", $file; # lexical file handles, automatic error handling
while (defined( my $line = <$fh> )) {
# do stuff
}
close $fh;
}
Then:
$ /etc/puppet/nodes/brackets.pl
This isn’t quite what you asked, but another possibility is to use <>:
while (<>) {
my $line = $_;
# do stuff
}
Then you would put the filenames on the command line, like this:
/etc/puppet/nodes/brackets.pl /etc/puppet/nodes/*.pp
Perl opens and closes each file for you. (Inside the loop, the current filename and line number are $ARGV and $. respectively.)
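For example, a small sketch that tags each matching line with its source file and line number (the pattern is just a placeholder):
while (<>) {
    my $line = $_;
    # $ARGV is the file currently being read, $. its current line number
    print "$ARGV:$.: $line" if $line =~ /\[/;
}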
Jason Orendorff has the right answer:
From Perlop (I/O Operators)
The null filehandle <> is special: it can be used to emulate the behavior of sed and awk, and any other Unix filter program that takes a list of filenames, doing the same to each line of input from all of them. Input from <> comes either from standard input, or from each file listed on the command line.
This doesn't require opendir. It doesn't require using globs or hard coding stuff in your program. This is the natural way to read in all files that are found on the command line, or piped from STDIN into the program.
With this, you could do:
$ myprog.pl /etc/puppet/nodes/*.pp
or
$ myprog.pl /etc/puppet/nodes/*.pp.backup
or even:
$ cat /etc/puppet/nodes/*.pp | myprog.pl
Take a look at this documentation; it explains all you need to know:
#!/usr/bin/perl
use strict;
use warnings;
my $dir = '/tmp';
opendir(DIR, $dir) or die $!;
while (my $file = readdir(DIR)) {
# We only want files
next unless (-f "$dir/$file");
# Use a regular expression to find files ending in .pp
next unless ($file =~ m/\.pp$/);
open (FILE, '<', "$dir/$file") || die "Unable to open $dir/$file\n";
while (defined (my $line = <FILE>)) {
# do stuff
}
}
closedir(DIR);
exit 0;
I would suggest putting all the filenames into an array and then using that array as the parameter list for your Perl method or script. Please see the following code:
use Data::Dumper;
my $dirname = "/etc/puppet/nodes";
opendir ( DIR, $dirname ) || die "Error in opening dir $dirname\n";
my @files = grep {/\.pp$/} readdir(DIR);
print Dumper(@files);
closedir(DIR);
Now you can pass \@files as a parameter to any Perl method.
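For instance, a sketch of handing that list to a (hypothetical) processing routine; readdir returns bare names, so the directory is prefixed back on:
# process_pp_file() is a hypothetical stand-in for the real script's logic
sub process_pp_file {
    my ($path) = @_;
    open my $fh, '<', $path or die "Unable to open $path: $!";
    while ( my $line = <$fh> ) {
        # do stuff
    }
    close $fh;
}

process_pp_file("$dirname/$_") for @files;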
my @x = <*>;
foreach ( @x ) {
chomp;
if ( -f "$_" ) {
print "process $_\n";
# do stuff
next;
};
};
Perl can shell out to execute system commands in various ways; the most straightforward is using backticks (``):
use strict;
use warnings FATAL => 'all';
my @ls = `ls /etc/puppet/nodes/*.pp`;
chomp @ls; # strip the trailing newlines that backticks leave on each filename
for my $f ( @ls ) {
open (my $FILE, '<', $f) || die "Unable to open $f\n";
while (defined (my $line = <$FILE>)) {
# do stuff
}
close $FILE;
}
(Note: you should always use strict; and use warnings;)

How can I consolidate several Perl one-liners into a single script?

I would like to move several one liners into a single script.
For example:
perl -i.bak -pE "s/String_ABC/String_XYZ/g" Cities.Txt
perl -i.bak -pE "s/Manhattan/New_England/g" Cities.Txt
Above works well for me but at the expense of two disk I/O operations.
I would like to move the aforementioned logic into a single script so that all substitutions are effectuated with the file opened and edited only once.
EDIT1: Based on your recommendations, I wrote this snippet in a script, which when invoked from a Windows batch file simply hangs:
#!/usr/bin/perl -i.bak -p Cities.Txt
use strict;
use warnings;
while( <> ){
s/String_ABC/String_XYZ/g;
s/Manhattan/New_England/g;
print;
}
EDIT2: OK, so here is how I implemented your recommendation. Works like a charm!
Batch file:
perl -i.bak MyScript.pl Cities.Txt
MyScript.pl
#!/usr/bin/perl
use strict;
use warnings;
while( <> ){
s/String_ABC/String_XYZ/g;
s/Manhattan/New_England/g;
print;
}
Thanks a lot to everyone that contributed.
The -p wraps the argument to -E with:
while( <> ) {
# argument to -E
print;
}
So, take all the arguments to -E and put them in the while:
while( <> ) {
s/String_ABC/String_XYZ/g;
s/Manhattan/New_England/g;
print;
}
The -i sets the $^I variable, which turns on some special magic handling ARGV:
$^I = "bak";
The -E switch turns on the new features for that version of Perl. You can do that by just specifying the version:
use v5.10;
However, you don't use anything loaded with that, at least in what you've shown us.
If you want to see everything a one-liner does, put a -MO=Deparse in there:
% perl -MO=Deparse -i.bak -pE "s/Manhattan/New_England/g" Cities.Txt
BEGIN { $^I = ".bak"; }
BEGIN {
$^H{'feature_unicode'} = q(1);
$^H{'feature_say'} = q(1);
$^H{'feature_state'} = q(1);
$^H{'feature_switch'} = q(1);
}
LINE: while (defined($_ = <ARGV>)) {
s/Manhattan/New_England/g;
}
continue {
die "-p destination: $!\n" unless print $_;
}
-e syntax OK
You can put arguments on the #! line. Perl will read them, even on Windows.
#!/usr/bin/perl -i.bak -p
s/String_ABC/String_XYZ/g;
s/Manhattan/New_England/g;
or you can keep it a one-liner as @ephemient said in the comments.
perl -i.bak -pE "s/String_ABC/String_XYZ/g; s/Manhattan/New_England/g" Cities.Txt
-i + -p basically puts a while loop around your program. Each line comes in as $_, your code runs, and $_ is printed out at the end. Repeat. So you can have as many statements as you want.
