I'm developing a multithreaded download application. Downloads work fine except for zip files: the content is downloaded, but when I try to extract it (with WinRAR) I get "unexpected end of archive". The problem lies in the app itself, more precisely in the multithreaded download (when I use one thread I don't get this error).
Here is the relevant part of the code:
// Main thread: creates the worker threads and starts them
for i := 1 to ThreadCount do
begin
  workerThreads[i] := TWorkerThread.Create(URL, mapFile, PosBegin, size);
  PosBegin := PosBegin + size;
end;
// Code of the worker thread:
HTTP.Request.Range := Format('%d-%d', [posBegin, posBegin + size - 1]);
HTTP.Get(URL, ms);
data := MapViewOfFile(mapFile, FILE_MAP_WRITE, 0, 0, size);
CopyMemory(data, ms.Memory, ms.Size);
What's the problem in my code?
Thanks for your replies.
Your call to MapViewOfFile() sets the dwFileOffsetHigh and dwFileOffsetLow parameters to 0, so every thread writes its data to file offset 0, overwriting the others. You need to set the view's file offset to posBegin instead. Note that the offset passed to MapViewOfFile() must be a multiple of the system allocation granularity (64 KB on most Windows systems; query it with GetSystemInfo()).
Also, if you are not already doing so, make sure you have pre-allocated the file to the total file size being downloaded by all threads, and have created a file mapping of that same size.
We have successfully used C/AL code to run multiple Rapid Start packages from a Processing Only report. Now we need to reverse the process and upload the Excel files back into Dynamics NAV. The codeunit exposes 3 methods: ImportExcel, ImportExcelFromConfig and ImportExcelFromPackage. Can anyone explain which method is best to use if we have a RapidStart.xlsx file to upload? Sample code would also be appreciated, as we have not been successful in finding any documentation on this.
Thanks
I found this thread, which after a lot of trial and error led to a workable, albeit somewhat kludgy, solution: Error on Importing rapid start Configuration Package file in business central
We used a non-processing report to download an Excel file, which users then updated. The file was so large that we had to split it into smaller pieces, which would take about 10 minutes to import. We created the files from Excel VBA, but no matter what format we used with the Workbooks.SaveAs function, we would get fatal errors pertaining to XML parsing when we tried to upload them into NAV using Rapid Start. The solution was to create an Excel template from an empty Excel file exported from Dynamics NAV; this way the file is set up internally to be usable by Rapid Start.
The non-processing report has a NumberOfFiles variable on the request page, which drives a FOR loop with a CONFIRM inside so the user can cancel if they so desire. The code prompts for a file each time through the loop. It uses the following tables and codeunits:
Tables
Config. Package Table 8623
Config. Setup 8627 (Temporary)
Codeunits
Config. Excel Exchange 8618
Config. Package - Import 8620
Config. Validate Package 8621
LOCAL DoImportPriceChanges()
FOR Count := 1 TO NumberOfFiles DO BEGIN
  IF Count > 1 THEN BEGIN
    IF NOT CONFIRM('Import next file %1 of %2?', TRUE, Count, NumberOfFiles) THEN BEGIN
      BREAK;
    END;
  END;
  ConfigPackageTable.SETRANGE("Package Code", RapidStartSalesPrices);
  IF ConfigExcelExchange.ImportExcelFromPackage THEN BEGIN
    UpdatedCount += 1;
    ConfigPackageTable."Package Code" := RapidStartSalesPrices;
    ConfigValidatePackage.RUN(ConfigPackageTable);
    TempConfigSetup."Package Code" := RapidStartSalesPrices;
    ConfigPackageImport.ApplyRapidStartPackage(TempConfigSetup);
  END ELSE BEGIN
    MESSAGE('Error importing Rapid Start package.');
  END;
END;
IF UpdatedCount = NumberOfFiles THEN BEGIN
  MESSAGE('%1 files were imported into Dynamics NAV.', NumberOfFiles);
END ELSE BEGIN
  MESSAGE('Processing was incomplete. %1 out of %2 files were processed.', UpdatedCount, NumberOfFiles);
END;
I need to force OS to purge the pages used for a mapped file. I don't have the file descriptor, so posix_fadvise cannot be used.
Our application caches a lot of files by mapping them into memory. After the file has been mapped (i.e. we've got the pointer from mmap()), we close the file. When at some later point we have to clean the cache, we want to purge the pages in OS cache as well. That is, we want to unmap the file, and do something like posix_fadvise(POSIX_FADV_DONTNEED), but the file descriptor is not available at this point.
The flow looks like this:
//caching stage
fd = open("file");
data = mmap(fd, <mmap flags>);
close(fd);
//clean-up stage
munmap(data);
// posix_fadvise(???, POSIX_FADV_DONTNEED);
Is there a way to clear the cached pages without file descriptor?
I have thought about the following two workarounds:
Keeping the files open, so that I have valid descriptors at cleanup time. However, there may be tens of thousands of files, and keeping them all open may affect OS performance.
Keep the file path, and reopen it just to get a descriptor and call posix_fadvise(). But the question is: will the old mapped area be associated with the same file? And will fadvise() purge the cached pages in this scenario?
The second option worked. When the file is reopened later, the mapped area is associated with it, and calling posix_fadvise with new file descriptor unloads the mapped pages:
//caching stage
fd = open("file");
data = mmap(fd, <mmap flags>);
close(fd);
//clean-up stage
fd = open("file");
munmap(data);
posix_fadvise(fd, POSIX_FADV_DONTNEED);
close(fd);
How can I read an .ini file in a thread without consuming so much CPU time? The thread runs continuously at runtime, i.e. in a while-true loop without any delay.
The code to search the value on .ini is:
var
  Leitura: TIniFile;
begin
  Result := False;
  Leitura := TIniFile.Create('File.ini');
  try
    Result := Leitura.ValueExists('KEY', ValueToSearch);
  finally
    Leitura.Free; // free the INI object even if an exception occurs
  end;
but since this function runs in an infinite loop, it consumes CPU time, and I need to solve this.
Instead of continuously polling the .INI file for changes, you could monitor the file and only re-read the value when it has actually changed. Several options:
The FindFirstChangeNotification Windows API function. Earlier Delphi versions shipped a component TShellChangeNotifier in the unit ShellCtrls.pas that wrapped this API.
The ReadDirectoryChangesW Windows API function, which "retrieves information that describes the changes within the specified directory"; the article A Directory Monitor Class For Delphi shows how to use it.
The Delphi JEDI libraries contain a component TJvChangeNotify to monitor file and directory changes, too.
On Torry you can find a component ATFileNotification that watches files/directories for changes and fires an event when a change occurs.
I have an MFC app which fires up a separate thread to download some files via cURL. At startup it downloads a text file with file sizes and last-write times, then checks the files on disk and queues a file for download if its values differ. The problem is that the CreateFile call in the thread arbitrarily returns INVALID_HANDLE_VALUE. I always call CloseHandle() after a successful CreateFile(). The failing files are random: sometimes a file in the root directory, another time a file in a nested directory. The problem is not related to localization or directory/file names, since sometimes all checks pass and sometimes they don't. GetLastError() occasionally returns 2 or 3, which are "File not found" and "Path not found" respectively.
When I put the function that checks the file write times and sizes straight into the OnInitDialog() function, everything works. This smells like a multithreading issue, but I have double-checked everything from memory allocations to file handles.
The same code works in a console application, also in a separate thread.
The platform is Win7 64-bit, linking statically to the runtime and MFC.
In my case, GetCurrentDirectory() started returning the system32 path after some time, so my code failed due to insufficient permissions. I fixed the issue by building file paths manually (getting the exe path at startup and using it from then on). Make sure you are not trying to write to or read from a privileged location on disk. Check your paths.
I am working on a VC++ project in which my application processes a file from an input path and generates 3 output "*.DAT" files in the destination path. I then FTP these DAT files to the destination server. After the FTP transfer, I need to delete only two of the output .DAT files from the folder. I am unable to delete those files, because there is an asynchronous thread running behind the process; while the thread is running, the delete fails with "Cannot delete, the file is used by another process".
I need to stop that thread and delete the files. Multiple files can also be taken from the input path to process.
Please help me resolve this issue; it is high priority for me.
I don't think this is a threading issue. Instead I think your problem is that Windows won't let you delete a file that still has open handles referencing it. Make sure that you call CloseHandle on handles to the file that you want to delete first. Also ensure that whatever mechanism you are using to perform the FTP transfer doesn't have any handles open to the file you want to delete.
I don't think that forcing the background thread down will solve your problem. You can't delete the files because you're holding an open handle to those files. You must close the handle first. Create an event object and share it between your main thread and the background thread. When the background thread is done sending the files through FTP, it should set this event. Have your main thread wait on the event before deleting the files.
Background Thread:
SendFiles();
ReleaseResources();            // might be necessary, depending on your design
SetEvent( hFilesSentEvent );   // signal the main thread that the files are done
Main Thread:
// the event is created beforehand, e.g.:
// hFilesSentEvent = CreateEvent( NULL, TRUE, FALSE, NULL ); // manual-reset, initially nonsignaled
WaitForSingleObject( hFilesSentEvent, INFINITE );
DeleteFiles();