Node.js: ensure files are closed

With my JS script I am watching a directory for new files and copying them to another location. Another app creates these files.
The problem is that my watch method is triggered as soon as the external app starts creating the file, while the file is still incomplete, so I end up copying a truncated file.
How can I ensure the file has been fully written by the external app?
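One common approach is to delay the copy until the file's size has stopped changing. Below is a minimal sketch of that idea using chokidar, a third-party watcher (not necessarily what you are using); the paths and thresholds are illustrative, and awaitWriteFinish does exactly this kind of stability check.

    // Sketch: wait for writes to settle before copying (paths/thresholds are illustrative)
    const chokidar = require('chokidar');
    const fs = require('fs');
    const path = require('path');

    const SRC = '/path/to/watched/dir';
    const DEST = '/path/to/destination';

    const watcher = chokidar.watch(SRC, {
      ignoreInitial: true,
      // Only emit 'add' once the file size has been stable for 2s (polled every 100ms)
      awaitWriteFinish: { stabilityThreshold: 2000, pollInterval: 100 },
    });

    watcher.on('add', (file) => {
      const target = path.join(DEST, path.basename(file));
      fs.copyFile(file, target, (err) => {
        if (err) console.error('copy failed:', err);
      });
    });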

Related

Ignore folder or file for Node Azure Function App

I want my Node.js Azure Function App to ignore a test data file.
When I run my app, after adding the file, I see it trying to parse the file and showing the following error:
Worker was unable to load function ...
How do I tell the runtime to ignore this data file or the folder containing it?
As far as I know,
Worker was unable to load function ...
this error occurs when packages are installed outside the root node_modules folder and are not included in the runtime or deployment.
How do I tell the runtime to ignore this data file or the folder containing it?
The .gitignore file plays a major role here: in Azure Functions it is used to ignore the specified files, even when they have changes.
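For illustration only (the folder and file names below are hypothetical, not from the original question), the corresponding ignore entries could look like this:

    # test data the Functions runtime / deployment should skip
    testdata/
    sample-payload.json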

How to recognize that file was uploaded

I am implementing a process that needs to be started when a file has been copied (it runs on Linux). The file is uploaded by an external system. I have noticed that the file is visible as soon as the copy operation starts, but the copying takes a few minutes.
How can I recognize that the copy operation has finished?
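A minimal sketch of one way to do this, assuming you can poll the file from Node.js: keep checking the file's size and mtime, and only start processing once they have stayed unchanged across a couple of polls. The path and intervals are illustrative.

    // Sketch: poll until the uploaded file stops growing, then start processing.
    const fs = require('fs/promises');

    async function waitUntilStable(file, { interval = 5000, stablePolls = 2 } = {}) {
      let last = null;
      let stable = 0;
      while (stable < stablePolls) {
        await new Promise((r) => setTimeout(r, interval));
        const { size, mtimeMs } = await fs.stat(file);
        if (last && size === last.size && mtimeMs === last.mtimeMs) {
          stable += 1;           // unchanged since the previous poll
        } else {
          stable = 0;            // still being written, reset the counter
        }
        last = { size, mtimeMs };
      }
    }

    // Usage (path is illustrative):
    waitUntilStable('/data/incoming/upload.csv')
      .then(() => console.log('copy finished, start processing'))
      .catch(console.error);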

How to trigger Azure Logic App when dropping file in Sharepoint folder

I created a Logic App that uses the SharePoint trigger "When a file is created or modified in a folder". It works perfectly when I upload a file in SharePoint Online (in a SharePoint browser tab), but it doesn't work when I drop a file into my synced Windows Explorer folder.
I read that someone faced the same problem: https://learn.microsoft.com/en-us/answers/questions/41215/logic-app-why-does-sharepoint-file-properties-trig.html. Here it says:
Move files and flow runs: When you move one or more files from one document library to another, the original file is moved from the source library to the destination library. Moving the file does not alter any custom metadata, including when the file was created and modified. Hence, this action does not trigger any flows for those file updates associated in the library where it was moved.
Syncing files to your OneDrive for business and SharePoint document libraries: When users sync one or more files from one document library to another, the original file is moved (synced) from your client to the destination library. Syncing the file will not alter any custom metadata including when the file was created and modified. Hence, this action will not trigger any flows for those file syncs in that library or in your OneDrive for business.
The thing is that I NEED this Logic App to run just by dropping the file into a Windows Explorer folder (which is a SharePoint folder shared with a certain person). Do you know how I can achieve this?
It started working for me when I used the OneDrive "When a file is created" connector, because we use OneDrive for the Windows Explorer folder. You need to specify the folder where the trigger should be invoked, and you must set Include subfolder to true if you want the trigger to fire when adding a file to any of the subfolders.
(Screenshots of the Logic App working: runs triggered when adding a file in subfolders and when adding a file in the root folder.)

Azure Windows App Service files not available across nodes

Our application can generate a file on request, which is then downloaded by the client. We are seeing issues where, when our App Service has more than one node, the generated files are not available on the other nodes.
E.g.:
A POST request to generate a file and save it to d:\home\site\wwwroot\app_data is sent by the user, handled by machine-1, and succeeds.
A GET request from the user to download this file is received by machine-2; this fails because the file cannot be found.
My reading of the Microsoft docs is that anything in d:\home is backed by Azure Storage and is not local to the machine: https://learn.microsoft.com/en-us/azure/app-service/operating-system-functionality#file-access
File access across multiple instances: The home directory contains an app's content, and application code can write to it. If an app runs on multiple instances, the home directory is shared among all instances so that all instances see the same directory. So, for example, if an app saves uploaded files to the home directory, those files are immediately available to all instances.
But this doesn't seem to be happening. Is there something else that needs configuring?
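To make the setup concrete, here is a hedged sketch of the pattern the question describes, assuming the HOME environment variable resolves to D:\home on a Windows App Service instance (the App_Data subfolder comes from the question). Per the quoted docs, a file written this way by one instance should be readable from any other instance.

    // Sketch: write to the shared home directory so any instance can serve the file.
    const fs = require('fs');
    const path = require('path');

    // Assumption: on Azure App Service (Windows), HOME points at D:\home (shared storage).
    const appData = path.join(process.env.HOME || 'D:\\home', 'site', 'wwwroot', 'App_Data');

    function saveGeneratedFile(name, contents) {
      fs.mkdirSync(appData, { recursive: true });
      fs.writeFileSync(path.join(appData, name), contents);   // instance A writes...
    }

    function readGeneratedFile(name) {
      return fs.readFileSync(path.join(appData, name));        // ...instance B should see it
    }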

Store a file in a local drive folder using Electron or Node.js

I created a piece of software in Electron and I need to add image upload functionality to it. Uploaded images should be stored in a folder on a local drive (the D:// drive), and at preview time I need to read from that same folder. I have no idea whether this is possible in Electron and, if so, how to do it.
As an extra point: in the backend I use Node.js, so if you know whether this is possible with plain Node.js, please tell me so I can integrate it that way as well. I just need to store the images in a particular folder on my local drive and also access them from that local drive for the preview.
For selecting the file to upload, trigger an event (e.g. upload-start) through IPC from the renderer process.
In the main process, in the handler for this event, use the dialog module and its .showOpenDialog() method, which will return the path to the file.
Then, in the main process, you can use the fs module to work with the file: read it, copy, move, rename and write. A sketch of this is below.
And check How do I handle local file uploads in electron?
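A minimal sketch of the main-process side described above; the channel name (upload-start) and the destination folder are illustrative, not fixed by Electron.

    // main process: pick an image, copy it into a local drive folder, return the stored path.
    const { ipcMain, dialog } = require('electron');
    const fs = require('fs');
    const path = require('path');

    const IMAGE_DIR = 'D:\\app-images';   // illustrative destination folder on the local drive

    ipcMain.handle('upload-start', async () => {
      const { canceled, filePaths } = await dialog.showOpenDialog({
        properties: ['openFile'],
        filters: [{ name: 'Images', extensions: ['png', 'jpg', 'jpeg', 'gif'] }],
      });
      if (canceled || filePaths.length === 0) return null;

      fs.mkdirSync(IMAGE_DIR, { recursive: true });
      const dest = path.join(IMAGE_DIR, path.basename(filePaths[0]));
      fs.copyFileSync(filePaths[0], dest);   // copy the chosen image into the folder
      return dest;                           // renderer can use this path for the preview
    });

    // renderer side (via ipcRenderer or a preload bridge): ipcRenderer.invoke('upload-start')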
