I would like to know if there is a JavaScript API for Excel that can tell the size of an Excel file.
Could anyone help?
No. The browser (including JavaScript) cannot access files on the client's computer, such as an Excel file. Upload the file to the back-end, allow the server to analyse the file, and send the result back to the client/browser (JavaScript).
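If you go down that route, a minimal sketch of the server side could look like the following (assuming a Node/Express backend with the multer middleware; the route name and form-field name are just placeholders for illustration):

    // Sketch: the client POSTs the Excel file as multipart/form-data under the field "file"
    // and the server reports its size back.
    const express = require('express');
    const multer = require('multer');

    const app = express();
    const upload = multer({ storage: multer.memoryStorage() });

    app.post('/api/excel-size', upload.single('file'), (req, res) => {
      if (!req.file) {
        return res.status(400).json({ error: 'No file uploaded' });
      }
      // multer exposes the uploaded file's size in bytes.
      res.json({ name: req.file.originalname, sizeInBytes: req.file.size });
    });

    app.listen(3000);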
To render with three.js, we need some images (jpg/png) and JSONs (UV data). All these files are stored in their respective folders and are visible for clients to look at.
I use Django/Python to start a local server; the Python code is compiled to .pyc and the JS code is obfuscated, but the folder structure is accessible to casual users. In three.js, we use tex_loader and json_loader functions to which the file paths are given as inputs. I was looking at ways of securing the behind-the-scenes work.
Happened to read about custom binary formats, but that felt like a lot of work.
Or is there a way of giving access to the files only to certain processes started through Django/the web browser?
Are there any easy-to-deploy solutions available to protect our IP?
An option would be to only serve the files to authenticated users. This could be achieved by having an endpoint on your backend like:
api/assets/data.json
and the controller in the backend would receive the file name (data.json); the code could check whether the user requesting the endpoint is authenticated and, if so, read the file from the file system (my-private-folder/assets/data.json) and return it as a file with the correct MIME type to the browser.
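As a rough illustration of that controller (using Express here purely as an example; the same idea maps onto a Django view, and requireAuth, the route, and the folder path are made-up names):

    // Sketch: serve private assets only to authenticated users.
    const express = require('express');
    const path = require('path');

    const app = express();
    const PRIVATE_DIR = path.join(__dirname, 'my-private-folder', 'assets');

    // Hypothetical middleware standing in for a real session/token check.
    function requireAuth(req, res, next) {
      if (req.headers.authorization) return next();
      res.sendStatus(401);
    }

    app.get('/api/assets/:fileName', requireAuth, (req, res) => {
      // path.basename strips any "../" so callers cannot escape the private folder.
      const safeName = path.basename(req.params.fileName);
      // res.sendFile picks the correct MIME type from the file extension.
      res.sendFile(path.join(PRIVATE_DIR, safeName), err => {
        if (err && !res.headersSent) res.sendStatus(404);
      });
    });

    app.listen(3000);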
I have a Node.js backend and I want to send a file download link to the client such that the file is directly accessible by the client. The file types are JPEG and PNG. Currently I am serving these files as data URIs, but due to a change in requirements I must send a download link in response to the file request, and the client can download the file later using that link.
The current workflow exposes a path /getAvatar. This path should send a response back to the client with the file link. The file is stored in /assets/avatars relative to the server root. I know I can use the express.static middleware to send back static resources. However, the methods I have seen so far, res.send() and res.download(), both try to send the file as an attachment rather than as a link that can be used later to download it.
Basically, the behavior is like a regular file-sharing site where, once a file is clicked, a link to it is generated, which is then used for downloading the file. How can I do this?
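One possible shape of this setup (a sketch under the assumptions above, not a drop-in answer; the host name and file name are placeholders) would be to expose the avatar folder through express.static and have /getAvatar return only a link:

    const express = require('express');
    const path = require('path');

    const app = express();

    // Anything in /assets/avatars becomes reachable as /files/<name>.
    app.use('/files', express.static(path.join(__dirname, 'assets', 'avatars')));

    app.get('/getAvatar', (req, res) => {
      const fileName = 'avatar-123.png'; // in reality, looked up for the current user
      // Send back a link the client can use later, instead of the file contents.
      res.json({ downloadUrl: `https://example.com/files/${fileName}` });
    });

    app.listen(3000);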
I have written a shell script which generates an Excel file of more than 100 MB. Now I want to transfer (upload) the file to a URL, which is an online storage server. After the upload, this server will generate a URL pointing to the file.
The question is: if we are able to upload the file using cURL to any given URL, how do we get the generated URL from that web page?
The URL generated after uploading the file is dynamic (Dropbox-style storage).
If it is not possible to get that URL, then how can such a big file be transferred?
Note: It is kind of automation, so please answer keeping automation in mind.
Thank you in advance.
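Most storage services return the generated link in the body of the upload response (usually JSON), so the usual approach is to capture and parse that response rather than scraping a web page. As a rough, hypothetical illustration in Node (the endpoint, form field, and "url" property all depend on the provider's API; the same idea applies to a curl call whose output you pipe into a JSON parser):

    // Sketch: upload a file and read the generated URL from the service's JSON response.
    // Requires Node 18+ for the built-in fetch/FormData/Blob.
    const fs = require('fs');

    async function uploadAndGetUrl(filePath) {
      const form = new FormData();
      form.append('file', new Blob([fs.readFileSync(filePath)]), 'report.xlsx');

      const response = await fetch('https://storage.example.com/api/upload', {
        method: 'POST',
        body: form,
      });
      const result = await response.json();
      return result.url; // whatever field the provider documents for the share link
    }

    uploadAndGetUrl('./report.xlsx').then(url => console.log(url));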
I want to export a table to an Excel file. I need to export a report.
ORA_EXCEL.new_document;
ORA_EXCEL.add_sheet('Sheet name');
ORA_EXCEL.query_to_sheet('select * from mytable');
ORA_EXCEL.save_to_blob(myblob);
I saved my table to blob.
How do I export/respond to the user (client)?
I need something simple that allows a user to download an Excel file to their own computer. I tried this procedure in an Oracle workflow:
ORA_EXCEL.save_to_file('EXPORT_DIR', 'example.xlsx');
But this did not help, because it saves the file to a directory on the server, and I need the file to end up on the user's machine.
The way I have handled similar issues in the past was to work with the systems people to mount a directory from either a web server or file server on the Database server.
Then create a directory object so that the procedure can save to a location that is accessible to the user.
If the files are not sensitive and there are a limited number of users then a file server makes sense as it is then just a matter of giving the user access to the file share.
If the files are sensitive, or there is a large number of users or the users are unknown, we used the web server and sent an email with a link, enabling the user to download their file. Naturally, there needs to be security built into this to stop people being able to download other users' files.
We didn't just email the files as an attachment because...
1) Emails with attachments tend to get blocked
2) We always advise people not to open email attachments. (Yes, I know we also advise not to click on links, but nothing is perfect.)
Who or what is invoking the production of the document?
If it's done by an application the user is working in, this application can fetch the BLOB, store it in e.g. a TEMP directory, and call
System.Diagnostics.Process.Start("..."); to open it with the associated application. (see Open file with associated application)
If it's a website, it could stream the BLOB back with the Excel MIME type (see Setting mime type for excel document).
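For the website case, a rough sketch of streaming the BLOB out with the Excel MIME type could look like this (Node with the node-oracledb driver is used purely as an example; the table, column, and credentials are invented):

    const express = require('express');
    const oracledb = require('oracledb');

    const app = express();

    app.get('/reports/:id/excel', async (req, res) => {
      const conn = await oracledb.getConnection({
        user: 'app', password: 'secret', connectString: 'db-host/orclpdb',
      });
      const result = await conn.execute(
        'SELECT report_blob FROM report_files WHERE id = :id',
        [req.params.id]
      );
      if (result.rows.length === 0) {
        await conn.close();
        return res.sendStatus(404);
      }
      const lob = result.rows[0][0]; // node-oracledb returns BLOBs as readable streams by default
      res.setHeader('Content-Type',
        'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
      res.setHeader('Content-Disposition', 'attachment; filename="report.xlsx"');
      lob.pipe(res);
      lob.on('close', () => conn.close()); // release the connection once the stream is done
    });

    app.listen(3000);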
You could also store it in an Oracle DIRECTORY, but that has to be on the server and would need to be a network share to be accessible to clients (which is rarely accepted in a production environment!).
If mail isn't the solution, then maybe FTP can be a way to store the files on a common share. See the UTL_TCP package; with it an FTP transfer can be implemented (a bit hard to code, but there are solutions to be found on the web), and I guess professional tools that generate Office documents out of an Oracle DB and distribute them do it like this.
In my Lotus Notes web application, I have file-upload functionality. I want to validate the attachment file size before uploading, which I did through WebQuerySave. My problem is that whenever the attached file size exceeds the limit configured in the server document, it throws a server error page like "HTTP: 500 Invalid POST Request Exception".
I tried some methods to resolve this, but they’re not working:
In domcfg.nsf, I mapped the target form called "CustomGeneralErrorForm".
I created "$$ReturnGeneralError" from to show error page.
In Notes.ini, I added "HTTPMultiErrorPage=/error.html"
How can I resolve this issue?
I suppose there's no way. I've tried several times to catch that error, but I think the only way is to test the file size with JavaScript. Obviously this works only with HTML5 browsers, as you can see in this post:
Using jQuery, Restricting File Size Before Uploading
So... you have to write code to detect browser features, use JavaScript with HTML5 browsers, and find alternative ways for older browsers.
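A minimal sketch of that HTML5 check (the element ID and the 10 MB limit are just examples, not values from your configuration):

    // Client-side size check using the HTML5 File API.
    // Assumes the form contains <input type="file" id="attachment">.
    var MAX_BYTES = 10 * 1024 * 1024; // 10 MB

    document.getElementById('attachment').addEventListener('change', function () {
      var file = this.files[0];
      if (file && file.size > MAX_BYTES) {
        alert('The attachment is too large (' + file.size + ' bytes). Please choose a smaller file.');
        this.value = ''; // clear the selection so the oversized file is never posted
      }
    });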
For older browsers you can, for example, use a Flash plugin and post to server-side code, depending on your backend.
Uploadify (http://www.uploadify.com/) is a good option worth trying, but do an internet search and choose the one that works best for you.
In this way you can stop users from making large posts, but if you need to upload large files (>10 MB by default) you must set up a secondary Internet Site server document with a greater post-size limit.