Synchronize multiple threads - multithreading

I'm trying to make a multithreaded application (up to, let's say, 100 threads) in Delphi XE5. All the threads are going to use/change a listbox on the main form. Here is my code:
Main Unit:
private
//list to hold the dynamic created threads
SendThreads : TObjectList;
public
mutex : boolean;
...
procedure TMainForm.FormCreate(Sender: TObject);
begin
...
mutex := false;
SendThreads := TObjectList.Create;
end;
...
//Button to create and start the threads
procedure TMainForm.btnWorkClick(Sender: TObject);
var
Thread : TSendThread;
i, nbr : integer;
begin
if SendThreads.Count < spnThreadCount.Value then
begin
for I := 1 to spnThreadCount.Value - SendThreads.Count do
begin
nbr := SendThreads.Add(TSendThread.Create(true));
Thread := TSendThread(SendThreads.Items[nbr]);
Thread.FreeOnTerminate := true;
Thread.Start;
end;
end;
end;
Thread Unit:
uses MainUnit;
procedure TSendThread.Execute;
begin
QueryList;
end;
//Basically, this procedure checks whether the item in the listbox contains ' - Done.', which means that this
//item has already been handled by another thread; if not, the current thread deals with this item.
procedure TSendThread.QueryList;
var i : integer;
S : String;
begin
i := 0;
while i < MainForm.lstURL.Count do
begin
while MainForm.mutex do;
MainForm.mutex := true;
if pos(' - Done.', MainForm.lstURL.Items[i]) = 0 then
begin
S := MainForm.lstURL.Items[i];
Delete(S, 1, pos('?txt=', S) + length('?txt=') - 1);
MainForm.Memo1.Lines.Add(MainForm.lstURL.Items[i]);
MainForm.lstURL.Items[i] := MainForm.lstURL.Items[i] + ' - Done.';
MainForm.mutex := false;
SendAd(URL, S);
end
else
begin
Inc(i);
MainForm.mutex := false;
end;
end;
end;
This method works if the number of threads is less than 4, but with more than that I get redundant results (2 or more threads process the same item).
Now, I'm fairly new to threads and multithreading, and I was wondering if this is the right way to do it.

In addition to what Ken said about UI safety, you are getting threads processing the same item because you have a race condition on your mutex variable. Multiple threads may see mutex=false at the same time, as they are not actually synchronizing with each other. You need to use a true mutex; look at the TMutex class. Or just wrap the majority of your code in TThread.Synchronize() and let the VCL handle the synchronization for you, but then that defeats the purpose of using threads.
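For illustration, here is a minimal sketch of what the TMutex route could look like, assuming a TMutex field (here called FListLock, an invented name) is created once by the main form and shared with the threads:
uses
  System.SyncObjs;

// Assumed field on the main form, created once (e.g. in FormCreate):
//   FListLock := TMutex.Create;

procedure TSendThread.QueryList;
begin
  MainForm.FListLock.Acquire; // blocks until no other thread holds the mutex
  try
    // ... find the next item without ' - Done.', mark it, extract the text ...
  finally
    MainForm.FListLock.Release; // always released, even if an exception occurs
  end;
end;
Note that this only removes the race on the flag; the worker threads would still be touching VCL controls, which is exactly the design problem described next.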
You are using the completely wrong design for your requirement. You need to separate your worker thread logic from your UI logic. There are a few different ways you can do that.
Put your work jobs into a thread-safe queue, like TThreadList<T> or TThreadedQueue<T>. Each thread can check the queue periodically and pull the next available job if there is one (a sketch of this approach follows below).
A variation is to use an I/O Completion Port as the queue. Post jobs to the IOCP using PostQueuedCompletionStatus(), and have each thread use GetQueuedCompletionStatus() to receive jobs. This allows the OS to do all the queuing and pulling for you, while allowing threads to sleep when there are no jobs available.
Alternatively, put your threads into a pool in a sleeping state. When a new job is ready, check the pool. If a thread is available, pull it from the pool, pass the job to it, and wake it up. Otherwise put the job in a thread-safe queue. When a job is done, have that thread check the queue. If a job is available, pull it from the queue; otherwise put the thread back in the pool and put it back to sleep.
In any case, when a job is done being processed, that thread can notify the main thread of the job result, using TThread.Synchronize(), TThread.Queue(), or any other inter-thread communication of your choosing. The main thread can then update the UI as needed.
At no point should the threads touch the UI to discover new jobs.
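Here is a minimal sketch of the first (queue-based) option, assuming the jobs are plain URL strings, that the queue is a TThreadedQueue<string> created and owned by the main form, and that the form exposes a JobDone method for marking items; those names are illustrative, not taken from the original code:
uses
  System.Classes, System.SyncObjs, System.Generics.Collections;

type
  TSendThread = class(TThread)
  private
    FQueue: TThreadedQueue<string>; // shared queue, created and owned by the main thread
    procedure NotifyDone(const AURL: string);
  protected
    procedure Execute; override;
  public
    constructor Create(AQueue: TThreadedQueue<string>);
  end;

constructor TSendThread.Create(AQueue: TThreadedQueue<string>);
begin
  inherited Create(False);
  FQueue := AQueue;
end;

procedure TSendThread.NotifyDone(const AURL: string);
begin
  // AURL is a parameter, so each queued closure captures this call's own value
  TThread.Queue(nil,
    procedure
    begin
      MainForm.JobDone(AURL); // assumed main-form method: appends ' - Done.' and logs to the memo
    end);
end;

procedure TSendThread.Execute;
var
  URL, S: string;
begin
  // PopItem blocks until a job arrives; DoShutDown on the queue ends the loop
  while FQueue.PopItem(URL) = wrSignaled do
  begin
    S := Copy(URL, Pos('?txt=', URL) + Length('?txt='), MaxInt);
    SendAd(URL, S);
    NotifyDone(URL);
  end;
end;
The main thread fills the queue with PushItem for each pending item and calls DoShutDown when closing so the workers wake up and exit; at no point do they read the listbox themselves.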

Related

How to safely access and modify an array while multithreading?

I'm trying to manipulate a variable of type array of record from multiple threads, and I'm not sure whether what I did is the right way, or if there is a better and safer method.
I declared a boolean variable as a lock; when a thread wants to access the array it waits until the lock is off, then activates the lock, and when done, unlocks it so that others can have access.
The code for this is declared in the implementation section
...
implementation
var Data : array of TData;
var Data_Lock:Boolean=false;
procedure Lock_Data();
begin
while Data_Lock = True do
sleep(1);
Data_Lock := True;
end;
procedure UnLock_Data();
begin
Data_Lock := False;
end;
procedure ClearAll();
begin
Lock_Data();
SetLength( Data, 0 );
UnLock_Data();
end;
....
The entire project is still not complete. For now this seems to work, but I don't have any knowledge of how these things work at the core, and I wonder whether there would be a problem if two threads start at the exact same time.
Your locking approach is not thread safe and it will not protect your data.
With multiple threads you have to consider that any particular thread execution can be interrupted at any time and that another thread can "jump" in and access some variable in between.
That means the following scenario is possible (simplified):
Data_Lock is False
Thread A enters Lock_Data()
Thread A checks Data_Lock -> False and skips the loop
Thread B enters Lock_Data()
Thread B checks Data_Lock -> False and skips the loop (Thread A didn't have the chance to set it to True yet)
Thread A continues -> sets Data_Lock to True and gains access to protected data
Thread B continues -> sets Data_Lock to True and gains access to the protected data while Thread A is still using that data
You can use TCriticalSection from System.SyncObjs instead.
var DataLock: TCriticalSection;
procedure ClearAll();
begin
DataLock.Enter;
try
SetLength(Data, 0);
finally
DataLock.Leave;
end;
end;
Since TCriticalSection is a class, you need to create the DataLock instance before you can use it and free it when you no longer need it. For instance, you can do that in the initialization/finalization sections of a unit.
initialization
DataLock := TCriticalSection.Create;
finalization
DataLock.Free;
end.
However, the better approach would be to wrap your data and critical section together in a class.
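A minimal sketch of that idea; the class and method names (TDataStore, Add, Clear) are illustrative, not from the original code:
uses
  System.SyncObjs;

type
  TDataStore = class
  private
    FLock: TCriticalSection;
    FData: array of TData;
  public
    constructor Create;
    destructor Destroy; override;
    procedure Add(const AItem: TData);
    procedure Clear;
  end;

constructor TDataStore.Create;
begin
  inherited Create;
  FLock := TCriticalSection.Create;
end;

destructor TDataStore.Destroy;
begin
  FLock.Free;
  inherited;
end;

procedure TDataStore.Add(const AItem: TData);
begin
  FLock.Enter;
  try
    SetLength(FData, Length(FData) + 1);
    FData[High(FData)] := AItem;
  finally
    FLock.Leave;
  end;
end;

procedure TDataStore.Clear;
begin
  FLock.Enter;
  try
    SetLength(FData, 0);
  finally
    FLock.Leave;
  end;
end;
Every access to the array now goes through a method that takes the lock, so callers cannot forget to lock or unlock it.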

How to make multiple threads do work on a single TStringList

I want to be able to make multiple threads do work on a single TStringList. This is my current thread code for using a single thread to do the work:
type
TFilterThread = class(TThread)
protected
procedure Execute; override;
public
lCombos : TStringList;
//Public Vars
end;
procedure TFilterThread.Execute;
var
I: integer;
HTML: String;
frm1 : TForm1;
splitLogin: TStringList;
validCount: Integer;
begin
validCount := 0;
for I := 0 to lCombos.Count - 1 do
begin
if Terminated then Exit();
Unit1.Form1.ListBox1.ItemIndex := i;
try
HTML := Unit1.Form1.IdHTTP1.Get(EmailCheckURL + lCombos[i]);
if AnsiPos('You indicated you', HTML) > 0 then
begin
//Do stuff
end;
except
Continue;
end;
end;
Showmessage('Finished!');
Unit1.Form1.Button1.Caption := 'Start';
end;
To start the thread I use:
lComboLsit := TStringList.Create;
for I := 0 to listBox1.Items.Count -1 do
lComboLsit.Add(listBox1.Items[i]);
iTFilterThread := TFilterThread.Create(True);
iTFilterThread.FreeOnTerminate := True;
iTFilterThread.lCombos := lComboLsit;
iTFilterThread.Start;
How would I introduce another thread to also do work on the lCombos list so that the operation would complete quicker?
The answer is that it depends. If the string list is not modified, then there's nothing to do. If the string list is modified then what is best depends critically on your specific usage.
Looking at your code, your program may not be CPU bound, so adding more threads may not help. Your program's bottleneck will be the HTTP communication. However, despite not being CPU bound, it is plausible that running multiple threads will reduce the impact of HTTP latency. You can benchmark to find out. You don't appear to be modifying the string list, so there are no race problems to be concerned with there.
However, here's a problem:
Unit1.Form1.IdHTTP1.Get(EmailCheckURL + lCombos[i]);
That will work, and is thread safe so long as TIdHTTP is thread safe. But it's pretty ugly to allow a thread to access a component on a form like that. And I don't see any real sense or need to share the TIdHTTP instance between threads. It would be far cleaner to let each thread instantiate and use their own TIdHTTP component.
Of course, you will need to decide on a policy for dividing the work between all your threads. You could have a shared index that keeps track of the next item to process. Have each thread increment it atomically each time they take an item. A parallel for loop would be a good fit here. That's available in the latest version of Delphi, or in any decent third party parallel library.
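Here is a minimal sketch of the shared-index idea combined with a per-thread TIdHTTP instance; the class-level NextIndex counter and the exact structure are assumptions layered on top of the original TFilterThread, not the original code:
uses
  System.SysUtils, System.Classes, System.SyncObjs, IdHTTP;

type
  TFilterThread = class(TThread)
  public
    class var NextIndex: Integer; // shared "next item" counter; reset to 0 before starting the threads
    var lCombos: TStringList;     // shared list, treated as read-only while the threads run
  protected
    procedure Execute; override;
  end;

procedure TFilterThread.Execute;
var
  i: Integer;
  HTML: string;
  HTTP: TIdHTTP;
begin
  HTTP := TIdHTTP.Create(nil); // each thread owns its own HTTP client
  try
    while not Terminated do
    begin
      // atomically claim the next unprocessed index
      i := TInterlocked.Increment(NextIndex) - 1;
      if i >= lCombos.Count then
        Break;
      try
        HTML := HTTP.Get(EmailCheckURL + lCombos[i]);
        if AnsiPos('You indicated you', HTML) > 0 then
        begin
          // Do stuff
        end;
      except
        // ignore this item and move on to the next one
      end;
    end;
  finally
    HTTP.Free;
  end;
end;
Start as many TFilterThread instances as you like; because each index is claimed atomically, no two threads process the same item, and because the list itself is never modified, no further locking is needed.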
You do have some problems with your code. In the thread procedure you do this:
Unit1.Form1.ListBox1.ItemIndex := i;
....
Unit1.Form1.Button1.Caption := 'Start';
You cannot access VCL components from a thread.
And then you do this
ShowMessage('Finished!');
Don't show UI from a thread.
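A minimal sketch of how those two UI touches could be marshalled back to the main thread instead; wrapping them in a single Synchronize call at the end of Execute is just one possible arrangement:
procedure TFilterThread.Execute;
begin
  // ... do the work without touching any VCL component ...

  // hand the final UI updates back to the main thread
  TThread.Synchronize(nil,
    procedure
    begin
      Unit1.Form1.Button1.Caption := 'Start';
      ShowMessage('Finished!'); // now displayed from the main thread
    end);
end;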
A point of idiom. Instead of looping over the items in your list box you can simply do this:
lComboLsit.Assign(listBox1.Items);

Using the Delphi XE7 Parallel Library

I have a time consuming routine which I'd like to process in parallel using Delphi XE7's new parallel library.
Here is the single threaded version:
procedure TTerritoryList.SetUpdating(const Value: boolean);
var
i, n: Integer;
begin
if (fUpdating <> Value) or not Value then
begin
fUpdating := Value;
for i := 0 to Count - 1 do
begin
Territory[i].Updating := Value; // <<<<<< Time consuming routine
if assigned(fOnCreateShapesProgress) then
fOnCreateShapesProgress(Self, 'Reconfiguring ' + Territory[i].Name, i / (Count - 1));
end;
end;
end;
There is really nothing complex going on. If the updating flag changes, or is set to false, the routine loops over all the sales territories and recreates each territory border (which is the time-consuming task).
So here is my attempt to make it parallel:
procedure TTerritoryList.SetUpdating(const Value: boolean);
var
i, n: Integer;
begin
if (fUpdating <> Value) or not Value then
begin
fUpdating := Value;
n := Count;
i := 0;
TParallel.For(0, Count - 1,
procedure(Index: integer)
begin
Territory[Index].Updating := fUpdating; // <<<<<< Time consuming routine
TInterlocked.Increment(i);
TThread.Queue(TThread.CurrentThread,
procedure
begin
if assigned(fOnCreateShapesProgress) then
fOnCreateShapesProgress(nil, 'Reconfiguring ', i / n);
end);
end
);
end;
end;
I've replaced the for-loop with a parallel for-loop. The counter 'i' is incremented with an interlocked operation to show progress. I then wrap the OnCreateShapesProgress event in a TThread.Queue, which will be handled by the main thread. The OnCreateShapesProgress event is handled by a routine which updates the progress bar and a label describing the task.
The routine works if I exclude the call to the OnCreateShapesProgress event. With the call included, it crashes with an EArgumentOutOfRange error.
So my question is simple:
Am I doing anything stupid?
How do you call an event handler from within a TParallel.For loop or TTask?
The most obvious problem that I can see is that you queue to the worker thread.
Your call to TThread.Queue passes TThread.CurrentThread. That is the very thread on which you are calling TThread.Queue. I think it is safe to say that you should never pass TThread.CurrentThread to TThread.Queue.
Instead, remove that parameter. Use the one parameter overload that just accepts a thread procedure.
Otherwise I'd note that the incrementing of the progress counter i is not really handled correctly. Well, the incrementing itself is fine, but you then read the counter later and that's a race. You can report progress out of order if thread 1 increments before thread 2, but thread 2 queues its progress before thread 1. Solve that by moving the counter increment to the main thread: simply increment it inside the queued anonymous method. An added bonus is that you no longer need an atomic increment, since all modifications then happen on the main thread.
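A minimal sketch with both changes applied; passing nil as the thread parameter (rather than TThread.CurrentThread) is one way of not associating the queued call with the worker, and the increment now happens only on the main thread:
TParallel.For(0, Count - 1,
  procedure(Index: Integer)
  begin
    Territory[Index].Updating := fUpdating; // the time-consuming work stays on the worker thread
    TThread.Queue(nil,                      // nil: do not tie the queued call to this worker
      procedure
      begin
        Inc(i);                             // runs only on the main thread, so no atomic increment is needed
        if Assigned(fOnCreateShapesProgress) then
          fOnCreateShapesProgress(nil, 'Reconfiguring', i / n);
      end);
  end);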
Beyond that, this QC report seems rather similar to what you report: http://qc.embarcadero.com/wc/qcmain.aspx?d=128392
Finally, AtomicIncrement is the idiomatic way to perform lock free incrementing in the latest versions of Delphi.

Updating each thread's timer in a TListView without CPU overload

Delphi used: 2007
Hello everyone,
I have a TListView with ViewStyle set to vsReport. When I click on a button, I launch about 50 threads. Each thread has a TListItem component. Each TListItem has a SubItem that is a timer. It starts at 250 and goes all the way down to 0. The user is able to see each timer decreasing in the TListView.
I have written the following code:
procedure TThreadWorker.DeleteTickets;
begin
ListItem.Delete;
end;
procedure TThreadWorker.UpdateTimer;
begin
ListItem.SubItems[1] := IntToStr(Timer);
end;
procedure TThreadWorker.TimerCounter;
begin
Timer := 300;
repeat
Sleep(1000);
Dec(Timer);
Synchronize(UpdateTimer);
until (Timer = 0);
Synchronize(DeleteTickets);
end;
And... it works! But here's the thing: all these synchronizations seem to unnecessarily overload the CPU. Obviously, it's a bigger problem when I launch more threads (100, 200 or 300) or when I use a weaker computer. At first, I wasn't sure it was the synchronizations, but if I deactivate them, the CPU is no longer overloaded.
To be frank, it's not that much of an issue. However, I have the feeling that decrementing timers shouldn't cause any sort of CPU overload: my code is probably not right. I tried calling UpdateTimer less often, and while it softens the CPU overload, it doesn't fix it in the end. Furthermore, I'd like the user to see the timer updated each second. The timer also needs to be as precise as possible.
Thank you.
I think you have put the cart before the horse here. Having all your threads synchronize into the main thread, each with their own timer and message queue, will place a heavy burden on the system. What's more, oftentimes you don't want to burden your threads with running a message loop.
A better approach in my view is to place a single timer in the main thread. When it ticks, have it retrieve progress from each thread or task that it needs to report on. You'll need to serialize access to that progress, but that's not expensive.
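A minimal sketch of that layout; TWorker, RemainingSeconds, ListView1 and the form's FWorkers list are illustrative names, not from the question's code:
uses
  System.Classes, System.SyncObjs;

type
  TWorker = class(TThread)
  private
    FLock: TCriticalSection;
    FRemaining: Integer;
    function GetRemaining: Integer;
  protected
    procedure Execute; override;
  public
    constructor Create;
    destructor Destroy; override;
    property RemainingSeconds: Integer read GetRemaining;
  end;

constructor TWorker.Create;
begin
  inherited Create(False);
  FLock := TCriticalSection.Create;
  FRemaining := 250; // initial countdown value from the question
end;

destructor TWorker.Destroy;
begin
  FLock.Free;
  inherited;
end;

function TWorker.GetRemaining: Integer;
begin
  FLock.Enter;
  try
    Result := FRemaining;
  finally
    FLock.Leave;
  end;
end;

procedure TWorker.Execute;
begin
  while not Terminated do
  begin
    Sleep(1000); // the real per-thread work would go here
    FLock.Enter;
    try
      Dec(FRemaining);
      if FRemaining = 0 then
        Break;
    finally
      FLock.Leave;
    end;
  end;
end;

// A single TTimer on the main form (Interval = 1000) polls every worker:
procedure TMainForm.TimerTick(Sender: TObject);
var
  i: Integer;
begin
  // FWorkers is assumed to be a TList<TWorker> owned by the form,
  // with one list view row (and its sub-items) created per worker.
  for i := 0 to FWorkers.Count - 1 do
    ListView1.Items[i].SubItems[1] := IntToStr(FWorkers[i].RemainingSeconds);
end;
The threads never call Synchronize; the only cross-thread traffic is one short critical-section hold per read or write of the counter, and the UI is updated at most once per second regardless of how many threads exist.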
I think threads are more expensive to the CPU than you might think. From what I remember, the CPU has overhead to swap each thread in and out of the cache. With the CPU swapping out 50 different threads, I'm not surprised it's overloaded.
One solution might be to extend the TTimer component and then dynamically create 50 of those rather than 50 threads. TTimer uses the Windows API instead of threads. (The code below is untested, but should at least give you the idea.)
type
  TMyTimer = class(TTimer)
  public
    Timer: Integer;
    ListItem: TListItem;
  end;
...
procedure TForm1.ButtonClick(Sender: TObject);
var
  i: Integer;
  ltimer: TMyTimer;
begin
  for i := 0 to 49 do // 50 timers
  begin
    ltimer := TMyTimer.Create(Self);
    ltimer.Timer := 300;
    ltimer.ListItem := ListView1.Items.Add; // one row per timer in the list view
    //initialize list item here
    ltimer.OnTimer := DecTimer; // TTimer's default Interval is 1000 ms and Enabled is True
  end;
end;
procedure TForm1.DecTimer(Sender: TObject);
begin
  Dec(TMyTimer(Sender).Timer);
  TMyTimer(Sender).ListItem.SubItems[1] := IntToStr(TMyTimer(Sender).Timer);
end;
If the threads are all starting at the same time, try doing something like having one thread control up to 25 timers, i.e. for 50 timers you only have two threads. The thread would then just loop through its 25 counters and decrement them. You'd still need to use Synchronize for this.
The answer to this question might be of interest:
How expensive are threads?
Here is an example using TThread.Queue and a TSimpleEvent for timing the counter instead of a Sleep().
Type
TThreadWorker = Class(TThread)
private
FTimer : Integer;
FListItem : TListItem;
procedure Execute; override;
procedure UpdateTimer;
procedure DeleteTicket;
public
constructor Create( aListItem : TListItem);
End;
constructor TThreadWorker.Create(aListItem : TListItem);
begin
Inherited Create(false);
FListItem := aListItem;
Self.FreeOnTerminate := true;
end;
procedure TThreadWorker.Execute;
var
anEvent : TSimpleEvent;
begin
anEvent := TSimpleEvent.Create(nil,true,false,'');
try
FTimer := 300;
repeat
anEvent.WaitFor(1000);
Queue(UpdateTimer);
Dec(FTimer);
until (FTimer = 0);
Self.Synchronize( DeleteTicket); // <-- Do not Queue this !
finally
anEvent.Free;
end;
end;
procedure TThreadWorker.UpdateTimer;
begin
FListItem.SubItems[1] := IntToStr(FTimer);
end;
procedure TThreadWorker.DeleteTicket;
begin
FListItem.Delete;
end;
Just a note: DeleteTicket must be synchronized. The thread is terminated when Execute is done, and anything still in the queue would be left dangling.

Multithreaded file upload synchronization

Currently I am working on a Delphi XE3 client/server application to transfer files (with the Indy FTP components). The client part monitors a folder, gets a list of the files inside, uploads them to the server and deletes the originals. The uploading is done by a separate thread, which processes files one by one. The files can range from 0 to a few thousand and their sizes also vary a lot.
It is a FireMonkey app compiled for both OSX and Windows, so I had to use TThread instead of OmniThreadLibrary, which I would have preferred. My customer reports that the application randomly freezes. I could not duplicate it, but since I don't have much experience with TThread, I might have introduced a deadlock condition somewhere. I have read quite a lot of examples, but I'm still not sure about some of the multithreading specifics.
The app structure is simple:
A timer in the main thread checks the folder and gets information about each file into a record, which goes into a generic TList. This list keeps information about the names of the files, size, the progress, whether the file is completely uploaded or has to be retried. All that is displayed in a grid with progress bars, etc. This list is accessed only by the main thread.
After that the items from the list are sent to the thread by calling the AddFile method (code below). The thread stores all files in a thread-safe queue like this one http://delphihaven.wordpress.com/2011/05/06/using-tmonitor-2/
When the file is uploaded the uploader thread notifies the main thread with a call to Synchronize.
The main thread periodically calls the Uploader.GetProgress method to check the current file's progress and display it. This function is not actually thread-safe, but could it cause a deadlock, or only return wrong data?
What would be a safe and efficient way to do the progress check?
So is this approach OK or I have missed something? How would you do this?
For example, I thought of making a new thread just to read the folder contents. This means that the TList I use has to be made thread-safe, but it has to be accessed all the time to refresh the displayed info in the GUI grid. Wouldn't all the synchronization just slow down the GUI?
I have posted the simplified code below in case someone wants to look at it. If not, I would be happy to hear some opinions on what I should use in general. The main goals are to work on both OSX and Windows; to be able to display information about all the files and the progress of the current one; and to be responsive regardless of the number and size of the files.
That's the code of the uploader thread. I have removed some of it for easier reading:
type
TFileStatus = (fsToBeQueued, fsUploaded, fsQueued);
TFileInfo = record
ID: Integer;
Path: String;
Size: Int64;
UploadedSize: Int64;
Status: TFileStatus;
end;
TUploader = class(TThread)
private
FTP: TIdFTP;
fQueue: TThreadedQueue<TFileInfo>;
fCurrentFile: TFileInfo;
FUploading: Boolean;
procedure ConnectFTP;
function UploadFile(aFileInfo: TFileInfo): String;
procedure OnFTPWork(ASender: TObject; AWorkMode: TWorkMode; AWorkCount: Int64);
procedure SignalComplete;
procedure SignalError(aError: String);
protected
procedure Execute; override;
public
property Uploading: Boolean read FUploading;
constructor Create;
destructor Destroy; override;
procedure Terminate;
procedure AddFile(const aFileInfo: TFileInfo);
function GetProgress: TFileInfo;
end;
procedure TUploader.AddFile(const aFileInfo: TFileInfo);
begin
fQueue.Enqueue(aFileInfo);
end;
procedure TUploader.ConnectFTP;
begin
...
FTP.Connect;
end;
constructor TUploader.Create;
begin
inherited Create(false);
FreeOnTerminate := false;
fQueue := TThreadedQueue<TFileInfo>.Create;
// Create the TIdFTP and set ports and other params
...
end;
destructor TUploader.Destroy;
begin
fQueue.Close;
fQueue.Free;
FTP.Free;
inherited;
end;
// Process the whole queue and inform the main thread of the progress
procedure TUploader.Execute;
var
Temp: TFileInfo;
begin
try
ConnectFTP;
except
on E: Exception do
SignalError(E.Message);
end;
// Use Peek instead of Dequeue, because the item should not be removed from the queue if it fails
while fQueue.Peek(fCurrentFile) = wrSignaled do
try
if UploadFile(fCurrentFile) = '' then
begin
fQueue.Dequeue(Temp); // Delete the item from the queue if successful
SignalComplete;
end;
except
on E: Exception do
SignalError(E.Message);
end;
end;
// Return the current file's info to the main thread. Used to update the progress indicators
function TUploader.GetProgress: TFileInfo;
begin
Result := fCurrentFile;
end;
// Update the uploaded size for the current file. This information is retrieved by a timer from the main thread to update the progress bar
procedure TUploader.OnFTPWork(ASender: TObject; AWorkMode: TWorkMode; AWorkCount: Int64);
begin
fCurrentFile.UploadedSize := AWorkCount;
end;
procedure TUploader.SignalComplete;
begin
Synchronize(
procedure
begin
frmClientMain.OnCompleteFile(fCurrentFile);
end);
end;
procedure TUploader.SignalError(aError: String);
begin
try
FTP.Disconnect;
except
end;
if fQueue.Closed then
Exit;
Synchronize(
procedure
begin
frmClientMain.OnUploadError(aError);
end);
end;
// Clear the queue and terminate the thread
procedure TUploader.Terminate;
begin
fQueue.Close;
inherited;
end;
function TUploader.UploadFile(aFileInfo: TFileInfo): String;
begin
Result := 'Error';
try
if not FTP.Connected then
ConnectFTP;
FUploading := true;
FTP.Put(aFileInfo.Path, ExtractFileName(aFileInfo.Path));
Result := '';
finally
FUploading := false;
end;
end;
And parts of the main thread that interact with the uploader:
......
// Main form
fUniqueID: Integer; // This is a unique number given to each file, because there might be several with the same names(after one is uploaded and deleted)
fUploader: TUploader; // The uploader thread
fFiles: TList<TFileInfo>;
fCurrentFileName: String; // Used to display the progress
function IndexOfFile(aID: Integer): Integer; //Return the index of the record inside the fFiles given the file ID
public
procedure OnCompleteFile(aFileInfo: TFileInfo);
procedure OnUploadError(aError: String);
end;
// This is called by the uploader with Synchronize
procedure TfrmClientMain.OnUploadError(aError: String);
begin
// show and log the error
end;
// This is called by the uploader with Synchronize
procedure TfrmClientMain.OnCompleteFile(aFileInfo: TFileInfo);
var
I: Integer;
begin
I := IndexOfFile(aFileInfo.ID);
if (I >= 0) and (I < fFiles.Count) then
begin
aFileInfo.Status := fsUploaded;
aFileInfo.UploadedSize := aFileInfo.Size;
FFiles.Items[I] := aFileInfo;
Inc(FFilesUploaded);
TFile.Delete(aFileInfo.Path);
colProgressImg.UpdateCell(I);
end;
end;
procedure TfrmClientMain.ProcessFolder;
var
NewFiles: TStringDynArray;
I, J: Integer;
FileInfo: TFileInfo;
begin
// Remove completed files from the list if it contains more than XX files
while FFiles.Count > 1000 do
if FFiles[0].Status = fsUploaded then
begin
Dec(FFilesUploaded);
FFiles.Delete(0);
end else
Break;
NewFiles := TDirectory.GetFiles(WatchFolder, '*.*',TSearchOption.soAllDirectories);
for I := 0 to Length(NewFiles) - 1 do
begin
FileInfo.ID := FUniqueID;
Inc(FUniqueID);
FileInfo.Path := NewFiles[I];
FileInfo.Size := GetFileSizeByName(NewFiles[I]);
FileInfo.UploadedSize := 0;
FileInfo.Status := fsToBeQueued;
FFiles.Add(FileInfo);
if (I mod 100) = 0 then
begin
UpdateStatusLabel;
grFiles.RowCount := FFiles.Count;
Application.ProcessMessages;
if fUploader = nil then
break;
end;
end;
// Send the new files and resend failed to the uploader thread
for I := 0 to FFiles.Count - 1 do
if (FFiles[I].Status = fsToBeQueued) then
begin
if fUploader = nil then
Break;
FileInfo := FFiles[I];
FileInfo.Status := fsQueued;
FFiles[I] := FileInfo;
SaveDebug(1, 'Add: ' + ExtractFileName(FFiles[I].Path));
FUploader.AddFile(FFiles[I]);
end;
end;
procedure TfrmClientMain.tmrGUITimer(Sender: TObject);
var
FileInfo: TFileInfo;
I: Integer;
begin
if (fUploader = nil) or not fUploader.Uploading then
Exit;
FileInfo := fUploader.GetProgress;
I := IndexOfFile(FileInfo.ID);
if (I >= 0) and (I < fFiles.Count) then
begin
fFiles.Items[I] := FileInfo;
fCurrentFileName := ExtractFileName(FileInfo.Path);
colProgressImg.UpdateCell(I);
end;
end;
function TfrmClientMain.IndexOfFile(aID: Integer): Integer;
var
I: Integer;
begin
Result := -1;
for I := 0 to FFiles.Count - 1 do
if FFiles[I].ID = aID then
Exit(I);
end;
This may not be the problem, but TFileInfo is a record.
This means that when passed as a (non const/var) parameter, it gets copied. This can result in issues with things like strings in the record which don't get reference counts updated when the record is copied.
One thing to try would be to make it a class and pass an instance as the parameter (i.e. a Pointer to the data on the heap).
Something else to watch out for is shared Int64s (e.g. your size values) in threaded 32-bit builds.
Updating/reading these is not done atomically, and you don't have any specific protection, so it is possible for a read of the value to get mismatched upper and lower 32 bits due to threading (e.g. read upper 32 bits, write upper 32 bits, write lower 32 bits, read lower 32 bits, with the reads and writes in different threads). This is probably not causing the problems you are seeing and, unless you are working with file transfers of more than 4 GB, it is unlikely to ever cause you any issues.
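If you do want to guard such a field, here is a minimal sketch using the Int64 helpers in System.SyncObjs, applied to the UploadedSize field from the question; the GetUploadedSize helper is an invented name, and this assumes the TInterlocked Int64 overloads available in XE2 and later:
uses
  System.SyncObjs, IdComponent;

// Writer side: the uploader thread's OnWork handler
procedure TUploader.OnFTPWork(ASender: TObject; AWorkMode: TWorkMode; AWorkCount: Int64);
begin
  TInterlocked.Exchange(fCurrentFile.UploadedSize, AWorkCount); // atomic 64-bit write
end;

// Reader side: called from the main thread's GUI timer
function TUploader.GetUploadedSize: Int64;
begin
  Result := TInterlocked.Read(fCurrentFile.UploadedSize); // atomic 64-bit read
end;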
Deadlocks are definitely hard to spot, but this may be the problem.
In your code, I didn't see that you added any timeout to the enqueue, peek or dequeue - which means it will take the default of Infinite.
The enqueue has this line in it - meaning, like any synchronization object, it will block until either the Enter completes (it locks the monitor) or the Timeout occurs (since you don't have a timeout, it will wait forever)
TSimpleThreadedQueue.Enqueue(const Item: T; Timeout: LongWord): TWaitResult;
...
if not TMonitor.Enter(FQueue, Timeout)
I'm also going to make the assumption that you implemented PEEK yourself based on the Dequeue - only you don't actually remove the item.
That appears to implement its own timeout - however, you still have the following:
function TSimpleThreadedQueue.Peek/Dequeue(var Item: T; Timeout: LongWord): TWaitResult;
...
if not TMonitor.Enter(FQueue, Timeout)
Where timeout is Infinite - so, if you are in the peek method waiting for it to be signaled with an infinite timeout, then you can't Enqueue something from a second thread without blocking that thread waiting for the peek method to become complete on an infinite timeout.
Here is a snippet of the comment from TMonitor
Enter locks the monitor object with an optional timeout (in ms) value.
Enter without a timeout will wait until the lock is obtained.
If the procedure returns it can be assumed that the lock was acquired.
Enter with a timeout will return a boolean status indicating whether or
not the lock was obtained (True) or the attempt timed out prior to
acquire the lock (False). Calling Enter with an INFINITE timeout
is the same as calling Enter without a timeout.
Since the implementation uses Infinite by default, and a TMonitor.Spinlock value is not provided, that will block the thread until it can acquire the FQueue object.
My suggestion would be to change your code as follows:
// Use Peek instead of Dequeue, because the item should not be removed from the queue if it fails
while true do
case fQueue.Peek(fCurrentFile, 10) of
wrSignaled:
try
if UploadFile(fCurrentFile) = '' then
begin
fQueue.Dequeue(Temp); // Delete the item from the queue if successful
SignalComplete;
end;
except
on E: Exception do
SignalError(E.Message);
end;
wrTimeout: sleep(10);
wrIOCompletion,
wrAbandoned,
wrError: break;
end; //case
This way, peek won't hold the lock on FQueue indefinitely, leaving a window for the Enqueue to acquire it and add the file from the main (UI) thread.
This might be a long shot, but here is another possibility [the former answer may be more likely] (something I just ran across, but had known about before): the use of Synchronize may be causing the deadlock. Here is a blog post about why this happens:
Delphi-Workaround-for-TThread-SynchronizeWaitFor-.aspx
The pertinent point from the article:
Thread A calls Synchronize(MethodA).
Thread B calls Synchronize(MethodB).
Then, inside the context of the main thread:
The main thread calls CheckSynchronize() while processing messages.
CheckSynchronize is implemented to batch-process all waiting calls(*). So it picks up the queue of waiting calls (containing MethodA and MethodB) and loops through them one by one.
MethodA executes in the main thread's context. Assume MethodA calls ThreadB.WaitFor.
WaitFor calls CheckSynchronize to process any waiting calls to Synchronize.
In theory, this should then process Thread B's Synchronize(MethodB), allowing Thread B to complete. However, MethodB is already in the possession of the first CheckSynchronize call, so it never gets called.
DEADLOCK!
Embarcadero QC article describing the problem in more detail.
While I don't see any ProcessMessages calls in the above code, or, for that matter, a WaitFor that would be called during a Synchronize, it could still be a problem if, at the point one Synchronize is called, another thread calls Synchronize as well while the main thread is already blocked.
This didn't click with me at first, because I tend to avoid Synchronize calls like the plague and usually design UI updates from threads using other methods, like message passing and thread-safe lists with message notification, instead of Synchronize calls.

Resources