How to develop a real-time file upload with Angular 2 and Node.js?

Usually an upload writes files to a temp directory first and then moves them to the desired directory. But I'm working with big data, e.g. uploading thousands of files at once, so I need to upload those files directly to the desired location, and as each one of them lands in that directory, the user must see the change on the dashboard in real time.
I also need to show the user:
If any exception occurred while uploading, e.g. if a file causes a problem during the upload process.
There should be an option to skip that file or retry the upload.
A report showing the list of files uploaded successfully vs. files that failed to upload.
If there is a network outage, the upload manager should keep retrying until the network is restored.
The user can pause the upload and resume it on the next login (if that is feasible).
In short, this is about full control of the upload process, to give the user the best experience while uploading large sets of data.

You can use ng2-file-upload; it has most of the features you require.
You can also find a demo here.
For the rest of the features you need, you can implement them on top of this library (better than writing your own code from scratch).
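As a minimal sketch of how the retry/skip options and the success-vs-failure report could be wired up on top of ng2-file-upload (the endpoint URL, the report arrays, and the helper functions are my own illustration, not part of the answer or the library's demo):

    import { FileUploader, FileItem } from 'ng2-file-upload';

    // Assumed endpoint; replace with your Node.js upload route.
    const uploader = new FileUploader({ url: '/api/upload' });

    const succeeded: string[] = [];   // for the "uploaded successfully" report
    const failed: FileItem[] = [];    // for the "failed to upload" report

    uploader.onSuccessItem = (item: FileItem) => {
      succeeded.push(item.file.name);
    };

    uploader.onErrorItem = (item: FileItem) => {
      failed.push(item);              // offer the user "skip" or "retry" for these
    };

    // Retry a single failed file, e.g. from a "Retry" button on the dashboard.
    function retryUpload(item: FileItem): void {
      item.upload();
    }

    // Skipping is just removing the item from the queue.
    function skipUpload(item: FileItem): void {
      item.remove();
    }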


I wonder if it's possible to store uploaded images outside the jar on the server, and how to reference their location

I created a Spring Boot web project and have already uploaded it to the server (CentOS 7).
Currently the images uploaded to the app are stored inside the static package in the jar file.
This makes the jar file very large and hard to edit.
Can someone give me an idea of how to store the images somewhere else on the server, and how to point to a picture outside the jar from inside the HTML?
First of all, you have to decide in which directory you are going to store your files, and create it:
mkdir /path/to/your/dir
Then assign the newly created directory to your application user:
chown <your user>:<users group> /path/to/your/dir
Then don't forget to give read/write permissions to the user under which you run your app. Note that a directory needs the execute bit in order to be traversed, so use 700 rather than 600 here:
chmod 700 /path/to/your/dir - this allows your app user (and only that user) to read, write and enter the directory.
Then just replace the path you already have with the new one (pointing to the newly created directory).
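To cover the "how to find the picture from the HTML" part of the question: assuming the app is Spring Boot (as in the question), one common approach, which the steps above do not spell out, is to register the external directory as a static resource location in application.properties:

    # Serve files from the external directory alongside the jar's own static resources.
    # Property name as in Spring Boot 1.x/2.x; adjust if your version differs.
    spring.resources.static-locations=classpath:/static/,file:/path/to/your/dir/

With that in place, a file saved as /path/to/your/dir/photo.jpg can be referenced in the HTML simply as <img src="/photo.jpg">.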
Please be aware that there is a lot of security stuff to consider when you store files on your server.
By the way, please consider reading about the different storage options, like AWS S3, Google Cloud Storage and Ceph.
Take into account that if you store files on your own server, you have to take care of them yourself (for example: keep an eye on free space, make sure you have mirroring across disks, and so on and so forth). With AWS S3, for example, you don't need to worry about any of that, and it's very cheap.

Upload of large company snapshot results in error "the file exceeds the maximal allowed size (1048576 KB)"

I am trying to upload a large Acumatica company snapshot file (1.3 GB) and I get an error as soon as I hit the upload button.
What setting (if any) can I change in my local Acumatica site or web.config to allow the large file import?
As a workaround I am requesting a snapshot file without file attachments, since the attachment data makes up about 95% of the snapshot file size.
My file upload preferences are currently set to 25000 KB, if that helps any. (I assume this setting is not used for snapshot imports.)
To clarify, the error occurs after I select the file and click OK, before I am even able to click the upload button. I am using 2017R2 Update 4.
Modifying your web.config might work, but I think Sergey Marenich's alternative is better. He wrote an excellent post on his blog on how to do this:
http://asiablog.acumatica.com/2017/12/restore-large-snapshot.html
The idea is:
Get a snapshot of your site in XML.
Extract it and put the folder in C:\Program Files (x86)\Acumatica ERP\Database\Data.
Use the Configuration Wizard to deploy a site and select your snapshot data, just as you would when choosing demo data.
If you're on SaaS, you can request a copy of the database and restore it to an offsite instance.
If you're on PCS/PCP, you have a couple of options: you can modify the web.config to allow bigger files to be processed, as detailed in this blog: https://acumaticaclouderp.blogspot.com/2017/12/acumatica-snapshots-uploading-and.html (see the sketch of those settings below).
For even larger files you can't do that because of an IIS constraint; in that case you can certainly use Sergey's method, though it only applies when creating a new instance, or take the simpler approach of restoring a SQL .bak file to a new database.
I think Acumatica should provide a mechanism to split these large files and process them as multiple uploads, but then again, very few customers are likely to face this issue.
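For reference, the web.config changes that blog post describes are the standard ASP.NET/IIS request-size limits. A sketch (the numbers are examples sized for a roughly 2 GB upload, not values from the original answers; maxRequestLength is in KB, maxAllowedContentLength in bytes):

    <system.web>
      <!-- ~2 GB, expressed in KB -->
      <httpRuntime maxRequestLength="2097151" executionTimeout="3600" />
    </system.web>
    <system.webServer>
      <security>
        <requestFiltering>
          <!-- ~2 GB, expressed in bytes -->
          <requestLimits maxAllowedContentLength="2147483648" />
        </requestFiltering>
      </security>
    </system.webServer>

IIS caps maxAllowedContentLength at 4294967295 bytes (about 4 GB), which is the constraint mentioned above.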
I had this same problem. I tried to modify the web.config, but that gave me an error saying the file didn't exist or I didn't have permissions when I tried to import the snapshot file into Acumatica again.
It turns out I had a table with image blobs stored inside it, so the snapshot wouldn't compress. Watch out for that one.

Setting up a trigger to watch new folders Azure Logic Apps

I am trying to create a logic app that will transfer files, as they are created, from my FTP server to my Azure file share. The folder my trigger is watching is structured by date (see below). Each day a file is added, a new folder is created, so I need the trigger to check new subfolders, but I don't want to go into the app every day to change which folder the trigger looks at. Is this possible?
Here's what my folder (called DATA) structure looks like; each day a file is added, a new folder is created:
-DATA-
2016-10-01
2016-10-02
2016-10-03
...
The FTP Connector uses configurable polling, where you set how often it should look for a file. The trigger currently does not support dynamic folders. However, what you could try is the following:
Trigger your logic app by recurrence (same principle as the FTP trigger, in fact).
Action: Create a variable to store the date-time, in the format used in your folder naming (see the expression sketch below).
Action: Do a "List files in folder" (here you should be able to set the folder name dynamically using the variable you created).
For each file in the folder:
Action: Get file content.
Do whatever you need to do with the file (calling a nested logic app is smart in case you need to run multiple processing actions on each file, or need to handle resubmits of the flow per file).
To avoid picking up every file on each run, you will need a way to exclude files that were processed in an earlier run. So either rename each file after it's processed to an extension you can exclude in the next run, or move it to a subfolder "Processed\datetime" under the root.
This solution requires more actions and thus will be more expensive. I haven't tried it out, but I think it should work; at least it's the approach I would try to set up.
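As a sketch of that date variable: assuming the folders follow the yyyy-MM-dd pattern shown in the question, the folder path can be built with the standard Logic Apps workflow expression functions:

    concat('/DATA/', formatDateTime(utcNow(), 'yyyy-MM-dd'))

Store that in the variable, then pass the variable as the folder path to the "List files in folder" action.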
Unfortunately, what you're asking is not possible with the current FTP Connector, and there isn't any really great solution right now... :(
As an aside, I've seen this pattern several times and, as you are seeing, it just causes more problems than it solves, which realistically is zero. :)
If you own the FTP server, the best thing to do is put the files in one folder.
If you do not own the FTP server, politely mention to the owner that this pattern causes problems and doesn't help you in any way, so please, put the files in one folder ;)

Google App Engine file system returns stale files when requested

I have a web app deployed in Google App Engine.
In one directory, called "csv", there are CSV files named with unique IDs. But when I try to download these files, they're sometimes unavailable (on about 50% of attempts).
I checked this "csv" folder and watched how it changes after each refresh. Roughly 50% of refreshes show all existing files, and the other 50% show all files except the most recently added one.
Why does this happen? What should I do to make it always display all existing files?

Transfer zip file to web server

I would like to develop an app that targets everything from Gingerbread (version 2.3, API 9) to Jelly Bean (version 4.3, API 18).
The problem:
I need to transfer large images (40 to 50 at a time), either independently or in a zip file, without the user having to click on each file being transferred. As far as I can tell, I need to use the HttpClient (org.apache) that was deprecated after Jelly Bean.
Right now the application takes the images and zips them into a zip file prior to uploading. I can create additional zip files; for example, if I have 50 MB to transfer, I can make each zip file about 10 MB and have 5 files to transfer if I have to. I need to transfer these files to a web server.
I can't seem to find anything about transferring files after Jelly Bean. All the searching I've done turns up the deprecated commands, and the posts are 2-5 years old. I have installed AndFTP and transferred a 16 MB zip file last night that was created by my app, but I really don't want to use that, as it requires additional steps from the user. I will try AndFTP today with an intent to transfer the files and see how that works out. Supposedly AndFTP works up to Lollipop (5.0).
If there is an easier way, please let me know; hopefully I've missed something about transferring files. Is there another way to do this after Jelly Bean?
