Is there any way to detect whether a file write is complete before we start processing the file? We have a Java program written with Spring Integration that polls for .zip files; as soon as it sees a file, it picks it up and starts processing it. With large .zip files, the program picks the file up before it has been completely written, which causes errors.
Does anyone know how to detect whether a file write is still in progress?
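The most robust fix is on the writing side: have the producer write to a temporary name (e.g. `file.zip.writing`) and rename it to `file.zip` only when it is finished, so the poller's filename pattern never matches a half-written file. When you can't change the producer, a common workaround is to wait until the file size stops changing before processing. A minimal sketch of that heuristic in Python (the same idea can be implemented in a custom Spring Integration `FileListFilter`); the interval and check count here are assumptions to tune:

```python
import os
import time

def wait_until_stable(path, interval=2.0, checks=3):
    """Wait until the file size is unchanged for `checks`
    consecutive polls, then assume the write is complete.
    Returns the final size."""
    last = -1
    stable = 0
    size = 0
    while stable < checks:
        size = os.path.getsize(path)
        if size == last:
            stable += 1
        else:
            stable = 0      # size still growing, start over
            last = size
        time.sleep(interval)
    return size
```

Note this is only a heuristic: a writer that pauses longer than `interval * checks` seconds will fool it, which is why the temp-name-plus-rename convention is preferable when you control the producer.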
I am implementing a process that needs to start when a file has been copied (on Linux). The file is uploaded by an external system. I have noticed that the file becomes visible as soon as the copy operation starts, but the copy takes a few minutes.
How can I recognize that the copy operation has finished?
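If you cannot change the uploader, one Linux-specific check is whether any process still has the file open, for example via lsof (an inotify CLOSE_WRITE watch, e.g. with inotifywait, is another option). A sketch in Python, assuming lsof is installed on the machine:

```python
import subprocess

def is_file_open(path):
    """Return True if some process still has `path` open.
    Relies on lsof, which exits 0 when it finds the file
    open and non-zero otherwise (Linux only)."""
    result = subprocess.run(
        ["lsof", path],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0
```

Poll until `is_file_open()` returns False, then start your process; the copy is finished once the writer has closed the file.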
Suppose I have a console app, written in C, that prints "Hello world!" in the terminal.
The program is called hello.exe.
I upload hello.exe to static server.
Now I can download the file by typing the following address into Chrome:
http://localhost:8080/hello.exe
Or I can fetch it as a Blob object over HTTP in Node.js.
Is there a way to run this Blob object right away, without writing it to a file, and get the string "Hello world!"?
Do I need to create and run the file and erase it right away?
What I want is for the files to run and not remain on my PC.
I'm not aware of any way to run an .exe file without first putting it on disk. You would essentially need to write your own exe loader that worked from memory instead of disk and that would be no small effort.
Keep in mind that a client that automatically runs any executable it gets from a URL like http://somedomain.com/hello.exe, without any user intervention, could be very dangerous: rogue web servers could send it arbitrary executables that do all sorts of harm (viruses, ransomware, etc.).
Do I need to create and run the file and erase it right away?
Yes, erase it after the program is done running.
What I want is for the files to run and not remain on my PC.
You will just have to clean it up at some point after it has run. If you have a programmatic client, it should be no big deal to put the file in an application-level temporary directory that your app can regularly clean up. If this is from a browser, then the user controls where the file goes on disk and when it gets deleted - you can't manage that yourself from within a webpage.
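That "download to a temp file, run it, delete it" flow is straightforward to script. A sketch in Python for illustration (the same pattern applies in Node.js with `fs` and `child_process`); `run_remote_exe` is a hypothetical name, and again, only do this with URLs you fully trust:

```python
import os
import subprocess
import tempfile
import urllib.request

def run_remote_exe(url):
    """Download an executable to a temporary file, run it,
    capture its output, and delete the file afterwards."""
    fd, path = tempfile.mkstemp(suffix=".exe")
    try:
        with urllib.request.urlopen(url) as resp, os.fdopen(fd, "wb") as f:
            f.write(resp.read())
        os.chmod(path, 0o700)  # mark executable (POSIX; no-op on Windows)
        result = subprocess.run([path], capture_output=True, text=True)
        return result.stdout
    finally:
        os.remove(path)        # clean up even if the run fails
```

The file still touches disk briefly, but it never outlives the call.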
Or I can get a Blob object using the http method in Nodejs.
You can download a binary. Not sure exactly what you're asking here.
I am working on the Azure platform and use Python 3.x for data-integration (ETL) activities with Azure Data Factory v2. I have a requirement to parse message files in .txt format in real time, as and when they are downloaded from blob storage to a Windows virtual machine under the path D:/MessageFiles/.
I wrote a Python script to parse the message files (they are fixed-width); it parses all the files in the directory and generates the output. Once a file is successfully parsed, it is moved to an archive directory. This script runs well against the local disk in ad-hoc mode whenever I need it.
Now I would like to make this script run continuously in Azure, so that it watches the directory D:/MessageFiles/ all the time and processes new files as soon as they appear.
Can someone please let me know how to do this? Should I use a stream analytics application to achieve this?
Note: I don't want to use a timer in the Python script. Instead, I am looking for an option in Azure, keeping the Python logic only for file parsing.
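For an Azure-native trigger (no timer in your script), two options worth evaluating are an Azure Function with a blob trigger, which runs your Python parser whenever a new blob lands instead of watching the VM's disk, and an event-based (storage events) trigger in Data Factory. If watching D:/MessageFiles/ on the VM turns out to be acceptable after all, a small polling loop run as a Windows service is the simplest fallback; a sketch, where `parse_file` stands in for your existing fixed-width parser:

```python
import os
import shutil
import time

def watch(incoming, archive, parse_file, poll_seconds=5, once=False):
    """Poll `incoming` for .txt files, parse each one with
    `parse_file`, then move it to `archive`."""
    os.makedirs(archive, exist_ok=True)
    while True:
        for name in sorted(os.listdir(incoming)):
            path = os.path.join(incoming, name)
            if os.path.isfile(path) and name.endswith(".txt"):
                parse_file(path)   # your existing parser
                shutil.move(path, os.path.join(archive, name))
        if once:                   # single pass, useful for testing
            return
        time.sleep(poll_seconds)
```

For example, `watch(r"D:/MessageFiles", r"D:/MessageFiles/Archive", parse_file)` would reproduce the ad-hoc script's parse-then-archive behaviour continuously.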
Usually an upload goes to a temp directory first and is then moved to the desired directory. But I'm working with big data, e.g. uploading thousands of files at once, so I need to upload those files directly to the desired location, and as each one arrives in that directory, the user must see the change on the dashboard in real time.
Also I need to show the user:
- Any exception that occurs while uploading, e.g. a file causing a problem in the upload process.
- An option to skip that file or retry the upload.
- A report showing the list of files uploaded successfully vs. files that failed to upload.
- If there is a network outage, the upload manager should keep retrying until the network is restored.
- The user can pause the upload and restart it on next login (if feasible).
This is about full control of the upload process, to give the user the best experience while uploading large sets of data.
You can use ng2-file-upload; it has most of the features you require.
You can also find a demo here.
For the rest of the features you require, you can implement them on top of this library (it's better than writing your own code from scratch).
I am running a web server with several CMS sites.
To be aware of hacks on my web server, I am looking for a mechanism with which I can detect changed files.
I am thinking of a tool or script that traverses the directory structure, computes a checksum for each file, and writes out a list of files with file size, last-modified date, and checksum.
At the next execution, I would then be able to compare this list with the previous one and detect new or modified files.
Does anyone know a script or tool which can accomplish this?
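Dedicated file-integrity tools such as AIDE and Tripwire do exactly this. If you prefer a small script you control, the core of the idea fits in a few lines of Python; a sketch, where the manifest would be saved between runs (e.g. as JSON) and compared on the next execution:

```python
import hashlib
import os

def snapshot(root):
    """Walk `root` and record size, mtime and SHA-256 for every file."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            st = os.stat(path)
            manifest[os.path.relpath(path, root)] = {
                "size": st.st_size,
                "mtime": st.st_mtime,
                "sha256": h.hexdigest(),
            }
    return manifest

def diff(old, new):
    """Report files added, removed or changed between two snapshots."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(p for p in set(old) & set(new)
                     if old[p]["sha256"] != new[p]["sha256"])
    return added, removed, changed
```

Store the previous snapshot outside the web root (an attacker who can modify your files can also modify the manifest), and alert on any non-empty diff.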