uploading and processing file progress - web

I am working with the Yii framework. My goal is to upload an xls file, parse it, and load the information into a database.
The problem is that the file is very big, and it is required to have a progress bar reflecting the process. I am not a very experienced developer and the task is tricky.
How can I accomplish that?

Weird that nobody advised anything...
I have done the following:
save the uploaded file and find the maximum row;
redirect to another page with a progress bar;
send the starting row and a chunk size from the front end;
load the chunk into the database (PHPExcel allows reading a chunk from the file);
get the response on the front end and increase the progress bar.
Not the fastest or most informative method, but at least I didn't have to deal with sockets. A rough sketch of the front-end loop is below.
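For illustration only, a minimal sketch of that front-end loop, assuming a hypothetical /import/chunk action that loads one chunk per request, and that the total row count is already known from the first step (the URL, parameter names, response shape and the #progress element are placeholders, not Yii specifics):

    // Request one chunk at a time and grow the progress bar after each response.
    // Endpoint and parameter names are assumptions for illustration.
    async function importInChunks(totalRows, chunkSize) {
        const bar = document.getElementById('progress'); // e.g. a <progress max="100"> element
        for (let startRow = 1; startRow <= totalRows; startRow += chunkSize) {
            // Server side loads rows [startRow, startRow + chunkSize) with PHPExcel
            const res = await fetch('/import/chunk?startRow=' + startRow + '&chunkSize=' + chunkSize);
            if (!res.ok) throw new Error('Chunk failed at row ' + startRow);
            bar.value = Math.min(100, Math.round(((startRow + chunkSize - 1) / totalRows) * 100));
        }
    }

    importInChunks(50000, 500); // total rows and chunk size come from the first step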

Related

Node JS - How to view progress when using Archiver (zip)

I already searched the web and found solutions using "entries".
But this solution does not fit when you have, for example, a very big file in a folder containing many files.
The progress seems to stop while the big file is being processed.
I really want the user to see the current progress.
So I thought of another solution:
display the size of the compressed file during processing.
I know it is possible to use "createReadStream" to follow a file upload.
But how can I follow the size of the compressed file?
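One possible approach, sketched under the assumption that the archive is piped into a file: poll the write stream's bytesWritten while Archiver is working, so the number keeps growing even while a single large entry is being compressed (file name, folder and polling interval are placeholders):

    const fs = require('fs');
    const archiver = require('archiver');

    const output = fs.createWriteStream('backup.zip');   // placeholder file name
    const archive = archiver('zip', { zlib: { level: 9 } });

    // Report how many compressed bytes have been written so far, every 500 ms.
    const timer = setInterval(() => {
        console.log('compressed so far:', output.bytesWritten, 'bytes');
    }, 500);

    output.on('close', () => {
        clearInterval(timer);
        console.log('done, total size:', archive.pointer(), 'bytes');
    });

    archive.on('error', (err) => { clearInterval(timer); throw err; });

    archive.pipe(output);
    archive.directory('folder-to-zip/', false);           // placeholder directory
    archive.finalize();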

p:remotecommand to upload files

I would like to upload files programmatically to my JSF application. The user should select a directory on his system, and a JS script should loop over every file in the directory and send each one to the server-side listener.
I cannot use FileUpload, because it cannot select a whole directory with thousands of files, so I was thinking of using jQuery and sending each file to a remoteCommand, but I have no clue how to send the file itself (normally I pass just strings).
so I was thinking of using jQuery and sending each file to a remoteCommand, but I have no clue how to send the file itself
Don't go there. It is a bad attempt at a workaround for a bad design choice. You would most likely run into similar problems, and what about the user selecting a lot of files a second time if it fails halfway? It might become slow, and it might run into browser limits (search for uploading multiple files in plain HTML)...
If you still want to do it via a web browser (which, according to one of your other questions, you do not want to), maybe try something like https://webdeltasync.github.io/ (disclaimer: I did not use it myself, and there might be similar ones (https://www.google.com/search?q=browser+based+rsync); it is just a hint in which direction to find a real solution).

loading AND saving to txt/csv file?

I am trying to set up Tabulator, with all its data validation goodness and simple-to-use UI, to help a colleague with the CRUD operations on a .txt file that he has to do on a daily basis.
I found that Tabulator can load data using AJAX, but my question is: is it possible to load the data from a .csv/.txt file and then save it back to the same file?
I know you can export to .csv, but since that does not overwrite the loaded file, all his work would be lost the next time.
If you are referring to a file on a user's local computer, then I'm afraid there is no import-from-file functionality built into Tabulator, but there is nothing to stop you implementing that bit yourself.
The link below is to an article that explains how to load a CSV file from an input element in JavaScript. In the example it loads the data into an HTML table, but you could easily alter that to dump it into an array of objects to pass into Tabulator's setData function.
http://codeanalyze.com/Articles/Details/20174/Read-CSV-file-at-client-side-and-display-on-html-table-using-jquery-and-html5
In terms of saving the data back to the user's computer, you would need to use the built-in download function; there is no way to save it back to the user's computer without the file popup, due to browser security constraints.
But I will add that the above approach is a bit unorthodox. The usual way to handle data persistence would be to save the data back to your server into a database, and then load it back to the client with an AJAX request, giving the user the option to download the data when they want a final copy. A rough sketch of the load-and-save approach is below.
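To tie the two halves together, here is a minimal sketch, assuming a simple comma-separated file with a header row and no quoted fields; the element IDs are placeholders:

    // Read a CSV chosen via <input type="file" id="csv-input"> into Tabulator,
    // then let the user download the edited data back out as CSV.
    const table = new Tabulator("#example-table", { autoColumns: true });

    document.getElementById("csv-input").addEventListener("change", (e) => {
        const reader = new FileReader();
        reader.onload = () => {
            // Naive CSV parse: fine for simple files, not for quoted/escaped fields.
            const [header, ...rows] = reader.result.trim().split("\n");
            const fields = header.split(",");
            const data = rows.map((row) => {
                const values = row.split(",");
                return Object.fromEntries(fields.map((f, i) => [f.trim(), values[i]]));
            });
            table.setData(data);
        };
        reader.readAsText(e.target.files[0]);
    });

    // Saving: triggers the browser's download prompt; overwriting the original file is up to the user.
    document.getElementById("save-btn").addEventListener("click", () => {
        table.download("csv", "data.csv");
    });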

Download multiple PNGs with one request/response (nodejs)

I have one specific problem:
I have a single request, sent to a Node.js Express application, which should return multiple PNGs. The result is displayed directly in an image in the browser.
image.src = "http://www....."
The PNGs are created server-side, but I cannot manage to send them back in one response and display them client-side in the browser as images.
I piped the PNG content into a stream, which I sent back. I closed the stream at the end, when all files had been sent. I did not call res.end() in between.
What happens is that I receive a PNG file, really big (so I know it contains the data of all the images I requested), but my image in the browser only displays the first PNG, and I don't know how to split the response client-side to view all my images.
I can't even view the PNGs if I download them directly on my PC (by calling the request manually): the returned PNG is really big, but only displays the first PNG when I open it, for example in Gwenview.
Sending the PNGs in a zip file and extracting it client-side is not an option, because not all browsers support unzipping. (I had a hard time learning this.) I really don't know how to solve this problem. Do I need to set a specific header type I do not know yet? Do I need to send some delimiter in between the PNGs? How can I split the returned PNG client-side?
Some help would be really appreciated!
Thank you!
There's no MIME type for multiple images. Your best bet is to generate a single HTML page and then embed the images in it. If you don't want to keep any of the images on your server, you can embed them using data URLs: https://github.com/heldr/datauri. A rough sketch of this approach is below.
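A minimal sketch of the data-URL idea, assuming the PNGs already exist on disk; the route and file list are placeholders, not anything from the original code:

    const express = require('express');
    const fs = require('fs');
    const app = express();

    // One request, one response: an HTML page with every PNG inlined as a data URL.
    app.get('/images', (req, res) => {
        const files = ['a.png', 'b.png', 'c.png'];   // placeholder file list
        const imgs = files.map((f) => {
            const base64 = fs.readFileSync(f).toString('base64');
            return '<img src="data:image/png;base64,' + base64 + '">';
        });
        res.type('html').send('<!doctype html><body>' + imgs.join('') + '</body>');
    });

    app.listen(3000);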
An alternative option is to use node-canvas to combine multiple images into one large one.
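A rough sketch of that alternative with node-canvas, stacking the images vertically; file names are placeholders:

    const { createCanvas, loadImage } = require('canvas');
    const fs = require('fs');

    // Stack several PNGs vertically into one image and write it out.
    async function combine(files, outFile) {
        const images = await Promise.all(files.map(loadImage));
        const width = Math.max(...images.map((i) => i.width));
        const height = images.reduce((sum, i) => sum + i.height, 0);
        const canvas = createCanvas(width, height);
        const ctx = canvas.getContext('2d');
        let y = 0;
        for (const img of images) {
            ctx.drawImage(img, 0, y);
            y += img.height;
        }
        fs.writeFileSync(outFile, canvas.toBuffer('image/png'));
    }

    combine(['a.png', 'b.png', 'c.png'], 'combined.png').catch(console.error);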

node.js read images from PDF

I need to use PDF in a way similar to ZIP/RAR: to hold many images (ancient Tibetan Buddhist literature), ideally 60,000, but splitting into 10-100 volumes is OK.
Anything can be used for packing, but for unpacking we need Node.js, because the same PDF file must be served on the web, and some users will need to use the whole PDF.
So the question is: which Node module can I use to read any single arbitrary image from a huge PDF? An example would really help.
Every image is a single page. (Or, in other words, every page is a single image.)
We have been using https://github.com/mirkokiefer/Node-Magick for this...
But the PNGs we get out are sometimes fairly low quality.
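A different option, not what the answer above used: shell out to pdftoppm from poppler-utils (assuming it is installed on the server) to rasterise a single arbitrary page on demand; the paths, page number and resolution below are placeholders:

    const { execFile } = require('child_process');

    // Render a single page of a large PDF to PNG with pdftoppm (poppler-utils).
    function extractPage(pdfPath, pageNumber, outPrefix, callback) {
        const args = [
            '-png',
            '-r', '150',                       // resolution in DPI
            '-f', String(pageNumber),          // first page to render
            '-l', String(pageNumber),          // last page to render (same page)
            '-singlefile',                     // write outPrefix.png without a page suffix
            pdfPath,
            outPrefix,
        ];
        execFile('pdftoppm', args, (err) => callback(err, err ? null : outPrefix + '.png'));
    }

    extractPage('volume-01.pdf', 1234, '/tmp/page-1234', (err, file) => {
        if (err) throw err;
        console.log('wrote', file);
    });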
