I've been trying to extend the first answer at Perl Monks (http://www.perlmonks.org/?node_id=735923) to a threaded model, to no avail: I keep getting errors about not being able to pass a coderef.
In my superclass I define the thread pool as a package variable so it can be shared amongst the subclasses:
package Things::Generic;

use threads;
use Thread::Queue;

my $MAX_THREADS = 4;    # assumed value; not shown in the original post

my $Qwork    = Thread::Queue->new();
my $Qresults = Thread::Queue->new();

my @pool = map { threads->create(\&worker, $Qwork, $Qresults) } 1 .. $MAX_THREADS;

sub worker {
    my $tid = threads->tid;
    my ($Qwork, $Qresults) = @_;
    while (my $work = $Qwork->dequeue) {
        my $result = $work->process_thing();
        $Qresults->enqueue($result);
    }
    $Qresults->enqueue(undef);    ## Signal this thread is finished
}

sub enqueue {
    my $self = shift;
    $Qwork->enqueue($self);
}

sub new {
    # Blessing and stuff
}
.
.
Now for the subclasses. It is guaranteed that they have a process_thing() method.
package Things::SpecificN;

use base qw(Things::Generic);

sub new {
    # instantiate
}

sub do_things {
    my $self = shift;
    # enqueue self into the shared worker pool so that "process_thing" is called
    $self->enqueue();
}

sub process_thing {
    # Do some work here
    return RESULT;
}
# Main
my @things;
push @things, Things::Specific1->new();
push @things, Things::Specific2->new();
.
.
push @things, Things::SpecificN->new();

# Asynchronously kick off "work"
foreach my $thing (@things) {
    $thing->do_things();
}
My goal is to put a list of "work" on the queue. Each thread will pull work from the queue and execute it, no matter what it is. Each Thing has its own unique work, but the method that does the work is guaranteed to be called "process_thing". I just want the thread pool to grab an entry from the queue and do the "something". I think I am describing functionality similar to Android's AsyncTask.
My Perl version is not high enough for Thread::Queue::Any.
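For what it's worth, Thread::Queue::Any essentially just serializes each item with Storable before putting it on a plain queue, and you can do the same by hand on an older Perl. A minimal sketch (note that Storable also refuses CODE refs unless $Storable::Deparse/$Storable::Eval are enabled, so an object holding callbacks still needs refactoring):
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use Thread::Queue;
use Storable qw(freeze thaw);

my $q = Thread::Queue->new();

my $worker = threads->create(sub {
    while (defined(my $frozen = $q->dequeue())) {
        my $item = thaw($frozen);              # back to a real structure
        print "got: $item->{name}\n";
    }
});

$q->enqueue(freeze({ name => 'example job' }));  # any nested data
$q->enqueue(undef);                              # end-of-work marker
$worker->join();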
$Qwork->enqueue($self); instead of $self->enqueue();
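Along those lines, here is a minimal, self-contained sketch of the worker-pool pattern described above, under the assumption that only plain, shareable data (a class name plus constructor arguments) goes onto the queue and the worker rebuilds the object on its side before calling process_thing. Demo::Thing is a hypothetical stand-in for the Things::SpecificN classes:
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use Thread::Queue;

# Hypothetical stand-in for a Things::SpecificN class.
package Demo::Thing;
sub new           { my ($class, $arg) = @_; bless { arg => $arg }, $class }
sub process_thing { my $self = shift; "processed $self->{arg}" }

package main;

my $MAX_THREADS = 4;
my $Qwork    = Thread::Queue->new();
my $Qresults = Thread::Queue->new();

my @pool = map { threads->create(\&worker) } 1 .. $MAX_THREADS;

sub worker {
    while (defined(my $work = $Qwork->dequeue())) {
        # rebuild the object inside the worker, then run its work method
        my $thing = $work->{class}->new($work->{args});
        $Qresults->enqueue($thing->process_thing());
    }
}

# producers enqueue only plain data (no blessed objects, no code refs)
$Qwork->enqueue({ class => 'Demo::Thing', args => $_ }) for 1 .. 8;
$Qwork->enqueue(undef) for @pool;    # one end-of-work marker per worker

$_->join() for @pool;
print $Qresults->dequeue_nb(), "\n" for 1 .. 8;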
Related
I have a script that creates a queue and some workers that read their jobs from the queue. My problem is that the script does not terminate and never calls printData(), because the threads keep idling; this is because I never put undef on the queue to stop them.
I have tried many different ways, but all lead to various problems:
Either the queue was terminated although there were still jobs in it,
or there were no jobs in the queue at that moment even though a thread was still working and trying to push new work onto the queue.
I use the following code:
# -------------------------
# Main
# -------------------------
my @threads = map threads->create(\&doOperation), 1 .. $maxNumberOfParallelJobs;

pullDataFromDbWithDirectory($directory);

#$worker->enqueue((undef) x $maxNumberOfParallelJobs);
$_->join for @threads;
sub pullDataFromDbWithDirectory {
    my $_dir = $_[0];
    if ($itemCount <= $maxNumberOfItems) {
        my @retval = grep { /^Dir|^File/ } qx($omnidb -filesystem $filesystem '$label' -listdir '$_dir');
        foreach my $item (@retval) {
            $itemCount++;
            (my $filename = $item) =~ s/^File\s+|^Dir\s+|\n//g;
            my $file = "$_dir/$filename";
            push(@data, $file);
            if ($item =~ /^Dir/) {
                $worker->enqueue($file);
                print "Add $file to queue\n" if $debug;
            }
        }
    }
}

sub doOperation {
    my $ithread = threads->tid();
    do {
        my $folder = $worker->dequeue();
        print "Read $folder from queue with thread $ithread\n" if $debug;
        pullDataFromDbWithDirectory($folder);
    } while ($worker->pending());
    push(@IDLE_THREADS, $ithread);
}
EDIT:
I found an ugly solution. Maybe there are better ones? I add each worker's thread id to an IDLE array and sleep until all the workers are in it:
# note: the core sleep() truncates 0.01 to 0; a fractional sleep needs "use Time::HiRes qw(sleep);"
sleep 0.01 while (scalar @IDLE_THREADS < $maxNumberOfParallelJobs);
$worker->enqueue((undef) x $maxNumberOfParallelJobs);
$_->join for @threads;
You can't use ->pending() that way without having threads die off prematurely. A first attempt at a fix:
my $busy :shared = 0;

sub doOperation {
    my $tid = threads->tid();
    while (defined( my $folder = $q->dequeue() )) {
        { lock $busy; ++$busy; }
        print "Worker thread $tid processing folder $folder.\n" if $debug;
        pullDataFromDbWithDirectory($folder);
        { lock $busy; --$busy; }
    }
    print "Worker thread $tid exiting.\n" if $debug;
}

sleep 0.01 while $q->pending || $busy;
$q->end();
$_->join for @threads;
But that introduces a race condition.
A worker thread dequeues the last item currently in the queue.
The main thread checks pending (false).
The main thread checks the number of busy threads (none).
The main thread signals the workers to end.
All other worker threads exit.
The worker that dequeued the item above marks itself busy.
The worker starts processing the last item, tries to add a bunch of items to the queue, and fails.
The dequeuing plus the busy incrementing need to be atomic, and the pending check plus the busy check need to be atomic.
That's not possible without changing Thread::Queue. You can't just throw a lock around those two pieces of code, because that would prevent the master from checking whether all of the threads are idle while one of them is idle (blocked waiting to dequeue).
We need to split ->dequeue into its waiting component and its dequeuing component. We have the latter (->dequeue_nb), so we just need the former.
use threads::shared;
use Thread::Queue 3.01;

sub T_Q_wait {
    my $self = shift;
    lock(%$self);
    my $queue = $$self{'queue'};
    my $count = @_ ? $self->_validate_count(shift) : 1;

    # Wait for requisite number of items
    cond_wait(%$self) while ((@$queue < $count) && !$$self{'ENDED'});
    cond_signal(%$self) if (@$queue);
    return !$$self{'ENDED'};
}
Now we can write the solution:
my $busy :shared = 0;

sub doOperation {
    my $tid = threads->tid();
    WORKER_LOOP:
    while (T_Q_wait($q)) {
        my $folder;
        {
            lock $busy;
            $folder = $q->dequeue_nb();
            next WORKER_LOOP if !defined($folder);
            ++$busy;
        }
        print "Worker thread $tid processing folder $folder.\n" if $debug;
        pullDataFromDbWithDirectory($folder);
        {
            lock $busy;
            --$busy;
            cond_signal($busy) if !$busy;
        }
    }
}

{
    lock $busy;
    cond_wait($busy) while $busy;
    $q->end();
    $_->join() for threads->list();
}
The next is there in case another thread snagged the work between wait and dequeue_nb.
I originally experimented with trying to send a hash object through Thread::Queue, but according to this link, my versions of Thread::Queue and threads::shared are too old. Unfortunately, since the system I'm testing on isn't mine, I can't upgrade.
I then tried to use a common array to store my hashes. Here is the code so far:
#!/usr/bin/perl
use strict;
use warnings;

use threads;
use Thread::Queue;

use constant NUM_WORKERS => 10;

my @out_array;

test1();

sub test1
{
    my $in_queue = Thread::Queue->new();
    foreach (1..NUM_WORKERS) {
        async {
            while (my $job = $in_queue->dequeue()) {
                test2($job);
            }
        };
    }

    my @sentiments = ("Axe Murderer", "Mauler", "Babyface", "Dragon");
    $in_queue->enqueue(@sentiments);
    $in_queue->enqueue(undef) for 1..NUM_WORKERS;

    $_->join() for threads->list();

    foreach my $element (@out_array) {
        print "element: $element\n";
    }
}

sub test2
{
    my $string = $_[0];
    my %hash = (Skeleton => $string);
    push @out_array, \%hash;
}
However, at the end of the procedure, @out_array is always empty. If I remove the threading parts of the script, then @out_array is correctly populated. I suspect I'm implementing threading incorrectly here.
How would I correctly populate @out_array in this instance?
You need to make it shared:
use threads::shared;
my @out_array :shared;
I don't think you need to lock it if all you do is push onto it, but if you did, you'd use
lock @out_array;
You need to share any array or hash referenced by a value you push onto it, using the tools in threads::shared.
push @out_array, share(%hash);
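As a self-contained illustration of the shared-array approach, here is a minimal sketch using shared_clone (which copies a nested structure into shared memory) as an alternative to the share(%hash) call above; the item strings are made up:
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use threads::shared;

my @out_array :shared;

sub test2 {
    my ($string) = @_;
    # shared_clone() returns a shared copy of the nested hash, so the
    # reference can safely be pushed onto the shared array
    push @out_array, shared_clone({ Skeleton => $string });
}

my @workers = map { threads->create(\&test2, "item $_") } 1 .. 4;
$_->join() for @workers;

print "element: $_->{Skeleton}\n" for @out_array;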
Though as I mentioned earlier, I'd use a Thread::Queue.
sub test2 {
    my ($string) = @_;
    my %hash = ( Skeleton => $string );
    return \%hash;
}

...

my $response_q = Thread::Queue->new();
my $running :shared = NUM_WORKERS;

...

async {
    while (my $job = $request_q->dequeue()) {
        $response_q->enqueue(test2($job));
    }
    { lock $running; $response_q->enqueue(undef) if !--$running; }
};

...

$request_q->enqueue(@sentiments);
$request_q->enqueue(undef) for 1..NUM_WORKERS;

while (my $response = $response_q->dequeue()) {
    print "Skeleton: $response->{Skeleton}\n";
}

$_->join() for threads->list();
Note the lack of anything thread-specific in test2. This is good. You should always strive for separation of concerns.
You need to return your data from the thread:
....
async {
    my $data;
    while (my $job = $in_queue->dequeue()) {
        $data = test2($job);
    }
    return $data;
};
...
for ( threads->list() ) {
    my $data = $_->join();
    # now you have this thread's return value in $data
}
sub test2
{
    my $string = $_[0];
    my %hash = (Skeleton => $string);
    return \%hash;
}
I found my answer in the example here.
I had to change 3 things:
share the @out_array outside both subs
share the %hash in test2
add return; to the end of test2
Code outside both subs:
my @out_array :shared = ();
test2 sub:
sub test2
{
    my $string = $_[0];
    my %hash :shared;
    $hash{Skeleton} = $string;
    push @out_array, \%hash;
    return;
}
I'm having issues with trying to put $self into the thread queue. Perl complains about CODE refs. Is it possible to put an object instance onto the thread queue?
generic.pm (Superclass)
package Things::Generic;

use threads;
use Thread::Queue;

my $MAX_THREADS = 4;    # assumed value; not shown in the original post

our $work_queue   = Thread::Queue->new();
our $result_queue = Thread::Queue->new();

my @worker_pool = map { threads->create(\&delegate_task, $work_queue, $result_queue) } 1 .. $MAX_THREADS;

sub delegate_task {
    my ($Qwork, $Qresults) = @_;
    while (my $work = $Qwork->dequeue) {
        # The item on the queue contains the "self" that was passed in,
        # so call its do_work method
        $work->do_work();
        $Qresults->enqueue("lol");
    }
    $Qresults->enqueue(undef);    ## Signal this thread is finished
}

sub new {
    my $class = shift;
    my $self = {
        _options => shift,
    };
    bless $self, $class;
    return $self;
}
.
.
.
# other instance methods
object.pm (Subclass)
package Things::Specific;

use base qw( Things::Generic );

sub new {
    my $class = shift;
    my $self = $class->SUPER::new(@_);
    return $self;
}

sub do_stuff {
    my $self = shift;
    $Things::Generic::work_queue->enqueue($self);
}

sub do_work {
    print "DOING WORK\n";
}
It's not objects it has a problem with; it's a code ref inside one. That's not unreasonable. Why are you trying to share objects with code refs? You should be sharing data between threads, not code.
While I'm not certain of this, the likely root cause is not that you're passing an object, but that the object in question stores an anonymous code ref (a callback, iterator, or the like). You may be able to refactor the object to eliminate this, or perform some sort of serialization that allows it to recreate the code ref in the other thread.
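For example, here is a hedged sketch of one such refactoring, with hypothetical names (Demo::Task, handler_name, greet) rather than the actual Things::* classes: store the name of a method instead of an anonymous code ref, and resolve it with ->can() when the work runs, so the object itself carries no CODE refs and can be cloned across the queue:
#!/usr/bin/perl
use strict;
use warnings;

package Demo::Task;

sub new {
    my ($class, %opts) = @_;
    # store the handler's name (a plain string), not a code ref
    return bless { handler_name => $opts{handler_name} || 'greet',
                   payload      => $opts{payload} }, $class;
}

sub greet { my $self = shift; print "DOING WORK on $self->{payload}\n" }

sub do_work {
    my $self = shift;
    my $code = $self->can($self->{handler_name})
        or die "no such handler: $self->{handler_name}";
    $self->$code();    # recreate the "callback" from its name
}

package main;

Demo::Task->new(payload => 42)->do_work();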
I have an array which contains a list of files: @arr = ('a.txt', 'b.txt', 'c.txt');
I am iterating over the array and processing the files with a foreach loop; each line of a file generates an SQL statement that is run on the DB server.
I want to create one thread for each line of the file to query the DB. I also want to control the maximum number of threads running simultaneously.
You can use a Thread::Pool based system, or any boss/worker model based system.
That's just a simple worker model, an ideal scenario. No problem.
use threads;
use Thread::Queue qw( );

use constant NUM_WORKERS => 5;

sub work {
    my ($dbh, $job) = @_;
    ...
}

{
    my $q = Thread::Queue->new();

    my @threads;
    for (1..NUM_WORKERS) {
        push @threads, async {
            my $dbh = ...;
            while (my $job = $q->dequeue()) {
                work($dbh, $job);
            }
        };
    }

    while (<>) {
        chomp;
        $q->enqueue($_);
    }

    $q->enqueue(undef) for 1..@threads;
    $_->join() for @threads;
}
Pass the file names to the script as arguments, or assign them to @ARGV within the script.
local @ARGV = qw( a.txt b.txt c.txt );
Interesting. I manually control how many threads to run; I use a hash keyed by thread id.
[code snip]
my %thr;    # my hash of threads
$count = 1;
$maxthreads = 5;

while (shift(@data)) {
    $syncthr = threads->create(sub { callfunction here }, {pass variables});
    $tid = $syncthr->tid;    # get the thread ID
    $thr{$tid} = $syncthr;
    if ($count >= $maxthreads) {
        threads->yield();
        while (1) {    # loop until threads are completed
            $num_run_threads = keys(%thr);
            foreach $one_thread (keys %thr) {
                if ($thr{$one_thread}->is_running()) {    # if thread is running, check for error state
                    if ($err = $thr{$one_thread}->error()) {
                        [ do stuff here ]
                    }
                    # otherwise move on to the next thread check
                } else {    # thread is either completed or has an error
                    if ($err = $thr{$one_thread}->error()) {
                        [ check for error again; can't hurt to double check ]
                    }
                    if ($err = $thr{$one_thread}->join()) {
                        print "Thread completed id: [$one_thread]\n";
                    }
                    delete $thr{$one_thread};    # delete the hash entry since the thread is no more
                    $num_run_threads = $num_run_threads - 1;    # reduce the number of running threads
                }
            }    # close foreach loop
            @threads = threads->list(threads::running);    # get running threads
            if ($num_run_threads < $maxthreads) {
                $count = $num_run_threads;    # reset the counter to the number of threads running
                if ($#data != -1) {    # check to make sure we still have data
                    last;    # exit the infinite while loop
                } else {
                    if (@threads) {
                        next;    # we still have threads; continue with processing
                    } else {
                        [ no more threads to process; exit program or do something else ]
                    }
                }    # end else
            }    # end threads running
        }    # end the while statement

        # Check the threads to see if they are joinable
        undef @threads;
        @threads = threads->list(threads::joinable);
        if (@threads) {
            foreach $mthread (@threads) {
                if ($mthread != 0) {
                    $mthread->join();
                }
            }    # end foreach
        }    # end @threads
    }    # end the if statement
    $count++;    # Increment the counter to get to the max number of threads to spawn
}
This is by no means a complete program; furthermore, I have changed it to be very bland. However, I've been using this approach for a while with success, especially in OO Perl. It works for me and has quite a lot of uses. I may be missing a few more error checks, especially around timeouts, but I handle that in the thread itself, which, by the way, is actually a subroutine that I am calling.
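As a point of comparison (not part of the answer above), Thread::Semaphore gives a much shorter way to cap the number of simultaneously running threads; a rough sketch, where the job list and the print are placeholders for the real work:
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use Thread::Semaphore;

my $maxthreads = 5;
my $sem  = Thread::Semaphore->new($maxthreads);
my @data = map { "job $_" } 1 .. 20;    # hypothetical work items

my @threads;
for my $item (@data) {
    $sem->down();                       # blocks once $maxthreads threads are running
    push @threads, threads->create(sub {
        my ($job) = @_;
        print "processing $job\n";      # call the real worker sub here
        $sem->up();                     # free a slot for the next thread
    }, $item);
}
$_->join() for @threads;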
I wrote some code and I need to make it multithreaded. Everything works, but every loop repeats 4 times:
use LWP::UserAgent;
use HTTP::Cookies;
use threads;
use threads::shared;

$| = 1;

$threads = 4;

my @groups :shared = loadf('groups.txt');
my @thread_list = ();

$thread_list[$_] = threads->create(\&thread) for 0 .. $threads - 1;
$_->join for @thread_list;

thread();

sub thread
{
    my $url = 'http://www.site.ru/';
    my $response = $web->post($url, Content =>
        ['st.redirect' => ''
        ]);
    foreach $i (@groups)
    {
        my $response = $web->get($i);
        if (!($response->header('Location')))
        {
            ---------;
        }
        else
        {
            ----------;
        }
    }
}

sub loadf {
    open (F, "<".$_[0]) or erroropen($_[0]);
    chomp(my @data = <F>);
    close F;
    return @data;
}
groups.txt :
http://www.odnoklassniki.ru/group/47357692739634
http://www.odnoklassniki.ru/group/56099517562922
I understand that I need to use threads::shared, but I can't understand how to use it.
The problem is that you never remove anything from @groups, so all threads do all the jobs in @groups.
Here's one solution.
use threads;
use Thread::Queue 3.01 qw( );

my $NUM_WORKERS = 4;

sub worker {
    my ($url) = @_;
    ... download the page ...
}

my $q = Thread::Queue->new();

for (1..$NUM_WORKERS) {
    async {
        while (my $url = $q->dequeue()) {
            worker($url);
        }
    };
}

$q->enqueue($_) for loadf('groups.txt');
$q->end();

$_->join() for threads->list;
Why do you need to make it threaded? Perl does much better using forks in most cases.
That said, your code starts 4 threads, each of which processes everything in @groups. It sounds like that's not what you want. If you want @groups to be a queue of work to do, take a look at Thread::Queue (or Parallel::ForkManager); a rough fork-based sketch follows.
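For completeness, a rough sketch of the Parallel::ForkManager route, assuming each entry in groups.txt is an independent URL to fetch; the per-URL work here is just a status-code print:
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use Parallel::ForkManager;

my @groups = do { open my $fh, '<', 'groups.txt' or die $!; chomp(my @g = <$fh>); @g };
my $pm     = Parallel::ForkManager->new(4);    # at most 4 children at a time

for my $url (@groups) {
    $pm->start and next;                       # parent: spawn a child, move on
    # child process: do the per-URL work here
    my $response = LWP::UserAgent->new->get($url);
    print "$url => ", $response->code, "\n";
    $pm->finish;                               # child exits
}
$pm->wait_all_children;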