Warnings from perl threads::shared - multithreading

My pseudo code looks like this:
#!/usr/local/bin/perl5.8.8
use warnings;
use strict;
use threads;
use threads::shared;
sub tasker;
my @allThreads = ();
my @array = ('alpha','beta','gamma');
push @allThreads, threads->new(\&tasker, @array);
$_->join foreach @allThreads;
sub tasker{
    my @localArray = @_;
    ...call some other modules/functions...
}
While the threads are running, I get these messages after a few seconds on my STDOUT:
Still here!
Still here!
Still here!
After that the threads join (complete) successfully. I am not sure where these are coming from and why they show up only for some @array. A point to mention is that the number of these messages is equal to the number of elements in @array.
Will appreciate any help from experts.

Your code (or one of the modules you are using) appears to have some leftover debugging code. To locate it, add
INIT { print "$0\n"; print "$_\n" for values %INC; exit }
to your script. Pipe the output to
xargs grep 'Still here!'
Then remove the debugging code.
PS - If you use warn without a trailing newline, your debugging messages will have a file name and line number attached. This can be useful :)
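For example (the file name and line number in the second message are just illustrative):
warn "Still here!\n";   # prints just:  Still here!
warn "Still here!";     # prints:       Still here! at script.pl line 2.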

Related

Perl on Linux: change locale for subprocesses

What is the correct way to change the locale for a subprocess (in Linux)?
For example, when running
perl -e 'use POSIX qw(setlocale); setlocale(POSIX::LC_ALL, "C"); open F, "locale|"; while (<F>) { print if /LC_MESS/ }; close F'
I get the answer LC_MESSAGES="ca_ES.UTF-8", but I would like to obtain LC_MESSAGES="C". Whatever I've tried, I can't seem to change it.
Note: I know about doing LC_ALL=C perl ....., but this is not what I want to do; I need to change the locale inside the Perl script.
I'm picking up on Ted Lyngmo's comment, so credit goes to him.
You can set the environment for your code as well as subsequent sub-processes with %ENV. As with all global variables, it makes sense to only change these locally, temporarily, for your scope and smaller scopes. That's what local does.
I've also changed your open to use the three-arg form as that's more secure (even though you're not using a variable for the filename/command), and used a lexical filehandle. The lexical handle will go out of scope at the end of the block and close implicitly.
use strict;
use warnings;
use POSIX qw(setlocale);

{
    setlocale(POSIX::LC_ALL, "C");
    local $ENV{LC_ALL} = 'C';

    open my $fh, '-|', 'locale' or die $!;
    while (<$fh>) {
        print if /LC_MESS/;
    }
}

Convert an excel file to txt and open in perl

I have an Excel file with my data. I saved it as a tab-delimited txt file.
But if I run a simple Perl script:
open(IN, '<', 'myfile.txt') or die;
while (defined(my $line = <IN>)) {
    print "$line\n";
}
close IN;
it only prints out one line, but that line contains all the data.
If I use another data file, there are no problems, so I think there is a problem converting the Excel file to a txt file.
Can anybody help me?
Try while (<IN>) instead. Your condition beats the while magic.
I'd change the loop to:
while(my $line = <IN>) { ... }
There's no need to use defined().
I am not sure if you have this answered yet, but first make sure you have the following in your code:
use strict;
use warnings;
This will give you debugging help that you would not receive otherwise; using the above will give you more messages that can help.
When I put your open command in a current program I am working on, I received this debugging message:
Name "main::IN" used only once: possible typo at ./test.pl line 37
You also may want to use a lexical filehandle so Perl can remember where to go. This is the "new" way to open files in Perl and is explained in the online perldoc; just search for "perl file handle open." I learned to do my opens this way:
open my $in, '<', 'myfile.txt' or die;
Then, you can just run the following:
while ( my $line = <$in> ) { ... }
There is a better way to do this if you have ever been introduced to Perl's default variable, but I don't think that you have, so the above solution may be the best.
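For reference, a version using the default variable $_ would look something like this (a sketch reading the same hypothetical myfile.txt):
use strict;
use warnings;

open my $in, '<', 'myfile.txt' or die "Can't open myfile.txt: $!";

while (<$in>) {    # each line is placed in $_ automatically
    print;         # print with no arguments also defaults to $_
}

close $in;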

Perl string replacements of file paths in text file

I'm trying to match file paths in a text file and replace them with their share file path, e.g. I want to replace the string "X:\Group_14\Project_Security" with "\\Project_Security$".
I'm having a problem getting my head around the syntax, as I have used a backslash (\) to escape another backslash (\\), but this does not seem to work for matching a path in a text file.
open INPUT, '< C:\searchfile.txt';
open OUTPUT, '> C:\logsearchfiletest.txt';

@lines = <INPUT>;

%replacements = (
    "X:\\Group_14\\Project_Security" => "\\\\Project_Security\$",
    ...
    (More Paths as above)
    ...
);

$pattern = join '|', keys %replacements;

for (@lines) {
    s/($pattern)/@{[$replacements{$1}]}/g;
    print OUTPUT;
}
Not totally sure what's happening, as "\\\\Project_Security\$" appears as "\\Project_Security$" correctly.
So I think the issue lies with "X:\\Group_14\\Project_Security" not evaluating to
"X:\Group_14\Project_Security" correctly, and therefore not matching within the text file?
Any advice on this would be appreciated. Cheers.
If all the file paths and replacements are in a similar format to your example, you should just be able to do the following rather than using a hash for looking up replacements:
for my $line (@lines) {
    $line =~ s/.+\\(.+)$/\\\\$1\$/;
    print OUTPUT $line;
}
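For example, run against the sample path from the question, the substitution keeps only the last path component and wraps it in the share syntax:
my $line = 'X:\Group_14\Project_Security';
$line =~ s/.+\\(.+)$/\\\\$1\$/;
print "$line\n";    # prints: \\Project_Security$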
Some notes:
Always use the 3-argument open
Always check for errors on open, print, or close
Sometimes it is easier to use a loop than clever coding
Try:
#!/usr/bin/env perl
use strict;
use warnings;
# --------------------------------------
use charnames qw( :full :short );
use English qw( -no_match_vars ); # Avoids regex performance penalty
use Data::Dumper;
# Make Data::Dumper pretty
$Data::Dumper::Sortkeys = 1;
$Data::Dumper::Indent = 1;
# Set maximum depth for Data::Dumper, zero means unlimited
local $Data::Dumper::Maxdepth = 0;
# conditional compile DEBUGging statements
# See http://lookatperl.blogspot.ca/2013/07/a-look-at-conditional-compiling-of.html
use constant DEBUG => $ENV{DEBUG};
# --------------------------------------
# place file names in variables so they are easily changed
my $search_file = 'C:\\searchfile.txt';
my $log_search_file = 'C:\\logsearchfiletest.txt';

my %replacements = (
    "X:\\Group_14\\Project_Security" => "\\\\Project_Security\$",
    # etc
);
# use the 3-argument open as a security precaution
open my $search_fh, '<', $search_file or die "could not open $search_file: $OS_ERROR\n";
open my $log_search_fh, '>', $log_search_file or die "could not open $log_search_file: $OS_ERROR\n";
while ( my $line = <$search_fh> ) {
    # scan for replacements
    while ( my ( $pattern, $replacement ) = each %replacements ) {
        $line =~ s/\Q$pattern\E/$replacement/g;
    }
    print {$log_search_fh} $line or die "could not print to $log_search_file: $OS_ERROR\n";
}
# always close the file handles and always check for errors
close $search_fh or die "could not close $search_file: $OS_ERROR\n";
close $log_search_fh or die "could not close $log_search_file: $OS_ERROR\n";
I see you've posted my rusty Perl code here, how embarrassing. ;) I made an update earlier today to my answer in the original PowerShell thread that gives a more general solution that also handles regex metacharacters and doesn't require you to manually escape each of 600 hash elements: PowerShell multiple string replacement efficiency. I added the perl and regex tags to your original question, but my edit hasn't been approved yet.
[As I mentioned, since I've been using PowerShell for everything in recent times (heck, these days I prepare breakfast with PowerShell...), my Perl has gotten a tad dusty, which I see hasn't gone unnoticed here. :P I fixed several things that I noticed could be coded better when I looked at it a second time, which are noted at the bottom. I don't bother with error messages and declarations and other verbosity for limited-use quick-and-dirty scripts like this, and I don't particularly recommend it. As the Perl motto goes, "making easy things easy and hard things possible". Well, this is a case of making easy things easy, and one of Perl's main advantages is that it doesn't force you to be "proper" when you're trying to do something quick and simple. But I did close the filehandles. ;)]

How can I run stuff in the background and redirect their output in Perl?

my $pm = new Parallel::ForkManager(4);
foreach my $array (@lines) {
    $pm->start and next;
    $cmd = 'command';
    print "\n$cmd\n\n";
    exec($cmd);
    $pm->finish;
}
$pm->wait_all_children;
As you can see, my code runs 4 things at once. It's ffmpeg piping video to x264. Its output is messy and jumps around on a single line between the 4 outputs. Is there a way to run these entirely in the background and redirect their output so I can cleanly print and update the 4 separate outputs? It would be nice so I could tell how far along each process is.
If it absolutely can't be done in Perl, I would gladly accept any help pointing me to another language that would make this easier. This is under Linux, by the way. Thank you.
Open2 is way beyond me. How would I use this? I can't grasp how I will be able to print the progress of each thing without making new lines. I want to print the STDERR and STDOUT of whatever each process is doing, and when it ends, keep that as a line that doesn't update. It's not a good explanation, but I don't know how else to explain what I want. Basically, the first 4 jobs will have 4 lines constantly refreshing. Then when one of those jobs is done, add a new line for the new job, and maybe somehow indicate that the done job is done.
I tried a quick test with "open" and it still outputs to the shell. This is in Windows, but it should still behave the same. Is this going to even be possible with Perl, or even in a shell?
Hello? I still need help with this...
If you want to capture each process's STDOUT, you can use open instead of exec to run your subprocesses.
foreach my $array (@lines) {
    $pm->start and next;
    my $cmd = 'command';
    open my $cmd_out, '-|', $cmd or die "Can't start process: $!";
    # read from command output
    while ( my $line = <$cmd_out> ) {
        # do something with output
    }
    $pm->finish;
}
If you need to write to the process's STDIN as well, see IPC::Open2. There's also IPC::Open3 if you additionally need a handle for the process's STDERR.
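A minimal IPC::Open3 sketch, assuming the progress you care about (as with ffmpeg) is written to STDERR; 'command' is just a placeholder for your real pipeline:
#!/usr/bin/perl
use strict;
use warnings;
use IPC::Open3;
use Symbol qw(gensym);

my $cmd = 'command';    # placeholder for your real ffmpeg/x264 command

my $err = gensym;       # open3 needs an existing glob for the STDERR handle
my $pid = open3( my $to_child, my $from_child, $err, $cmd );

close $to_child;        # nothing to send to the child's STDIN

# ffmpeg-style tools write their progress to STDERR, so follow that handle.
# (Reading STDOUT and STDERR at the same time needs IO::Select or similar
# to avoid blocking; this sketch only follows STDERR.)
while ( my $line = <$err> ) {
    chomp $line;
    print "progress: $line\n";    # update your per-job status line here
}

close $from_child;
waitpid $pid, 0;        # reap the child; its exit status ends up in $?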

End open file with Perl for frequently updated reports

I have a daemon which needs to report a small hash of statistics to a file in a /dev/loop0 filesystem. I am using FileHandle to store the reference to the filehandle in Perl. A really small version of the problem looks like this:
#!/usr/bin/perl
use strict;
use warnings;
use FileHandle;
my $report = FileHandle->new("> /devfs/test");
print $report "Hello";
seek($report,0,0);
print $report "Hi";
$report->close();
The result from this will be "Hillo", which is what I'd expect. What I'd like to do is be able to indicate after "Hi" (and really "Hello") that the file is now finished.
Question: When reading from a file, you can just search for the end of file (EOF), but how can I indicate the end of a file on write without closing it? If it makes a difference, the solution needs to apply to Linux specifically.
You want the truncate function.
truncate($report, tell($report));
...will truncate the file to wherever the file pointer currently is (as reported by tell).
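For example, applied to the script from the question (just a sketch, keeping the same /devfs/test path; the flush makes sure buffered output has reached the file before it is truncated):
#!/usr/bin/perl
use strict;
use warnings;
use FileHandle;

my $report = FileHandle->new("> /devfs/test");

print $report "Hello";
seek($report, 0, 0);

print $report "Hi";
$report->flush;                       # push buffered output to the file

# Cut the file off at the current write position, discarding the
# leftover "llo" from the earlier, longer report.
truncate($report, tell($report));

$report->close();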
