I have searched for a definitive answer online and on Stack Overflow, but I have not found a clear, step-by-step way to handle uploading large files (50 MB+) to a Wagtail CMS website.
My setup is nginx, Gunicorn, and PostgreSQL on an Ubuntu server.
When I try to upload a large file from the "documents" section of the admin (e.g. /admin/documents/multiple/add/), the progress bar advances as it would for a normal upload, but then the admin shows an error: "Sorry, upload failed."
I am basically having the same problem as this question, only without that specific error message.
I have set client_max_body_size to 100000M (nginx) and the MAX_UPLOAD_SIZE setting (wagtail/django) to a large amount as well.
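For reference, this is roughly what that nginx change looks like; the config file path is an assumption, and the size is the one quoted above:

    # /etc/nginx/sites-enabled/mysite (illustrative path)
    server {
        # must be at least as large as the biggest expected upload
        client_max_body_size 100000M;
    }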
How can I resolve the issue and successfully upload my large files (.zip and .xyz) to the wagtail website? Any help and/or suggestions are appreciated. Thanks.
Related
I am currently working on a web application where I need to store large files (MP4 videos that are sometimes larger than 100 MB). But when I try to upload them from a static Angular website hosted in an S3 bucket to an API hosted on AWS Elastic Beanstalk, I get an error that I don't understand.
Click here to see the error
What I tried:
There is no problem when uploading PDF. It works perfectly.
There is no problem when uploading a very short MP4 (3 s, 453 KB). It works cleanly; a little slower than a PDF, but still really fast (3 seconds). This is why I think the problem could come from the file size.
I read on the Internet that there is a setting called client_max_body_size when using Nginx (as AWS does). I tried to increase this default limit by adding this to my project:
myrootproject/.ebextensions/nginx.conf
Into nginx.conf:
files:
  "/etc/nginx/conf.d/proxy.conf":
    content: |
      client_max_body_size 4G;
but nothing changed... or at least it didn't have the desired effect; it still isn't working.
Additional information
When I do this locally, it works fine.
When I do this from the hosted website (S3 bucket) to a locally hosted API, it works fine.
It takes a really long time to get a response from the server (only when this error occurs). I have the feeling that the request doesn't even reach my NodeJS code, because if an error were emitted there, I would have handled it.
Here are screenshots of my request, if it helps:
Request (first part)
Request (second part)
I really need help on this one, hoping you can give it to me!
P.S.: I wrote this post with the help of a translator. If some parts are strangely written, my apologies.
The option myrootproject/.ebextensions/nginx.conf does not take effect, probably because you are using Amazon Linux 2 (AL2) - I assume that you are. That config file location only works on AL1.
For AL2, the nginx settings should be provided in .platform/nginx/conf.d/, not .ebextensions. For example, you could create the following config file:
.platform/nginx/conf.d/mynginx.conf
with the content of:
client_max_body_size 4G;
I have built a Node/Express server behind Nginx and a JavaScript web app to upload files. The web app uses multer for the uploads. I have adjusted Nginx to accept files up to 10 GB.
When I attempt to send a 4 GB file, Chrome craps itself with a "stalled" message in the Timing tab. There is no message about the error in /var/log/nginx/error.log.
Is there some practical limit to file upload using multer (multer-s3)? What strategy can I use for uploading large files like this? I have read about 'chunking' in answers from 2012, but surely this is a thoroughly solved problem.
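For reference, multer's built-in size cap lives in its limits option; below is a minimal sketch, assuming an Express app with plain disk storage rather than multer-s3, and the 10 GB figure from above:

    const express = require('express');
    const multer = require('multer');

    const app = express();

    // Cap each file at 10 GB to match the nginx limit; with `dest` set,
    // multer streams uploads to disk instead of buffering them in memory.
    const upload = multer({
      dest: '/tmp/uploads', // illustrative path
      limits: { fileSize: 10 * 1024 * 1024 * 1024 },
    });

    // 'file' is a hypothetical form field name.
    app.post('/upload', upload.single('file'), (req, res) => {
      res.json({ stored: req.file.path, bytes: req.file.size });
    });

    app.listen(3000);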
Thank you
I have a Node.js app with Express running on Heroku, linked to a GitHub repository; it serves a website which also contains a "gallery" section.
The pictures in the gallery are uploaded in very high resolution by other, non-tech-savvy admins. To prevent huge data usage for mobile users,
I would like the Express server to downscale and compress the images coming from a certain path before sending them as the reply to a normal GET request.
Could you help me understand how I can "intercept" those requests, or at least point me in a direction?
Sorry to ask it here like this, but I tried looking through many wikis and some questions here on Stack Overflow, and none seems to cover what I'm looking for
(at least from my understanding).
Thank you for your time!
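One common way to "intercept" such requests is an Express route handler registered on the gallery path, which transforms the image before responding. A minimal sketch, assuming the sharp package and a hypothetical local gallery/ folder:

    const express = require('express');
    const path = require('path');
    const sharp = require('sharp'); // assumed image-processing dependency

    const app = express();
    const GALLERY_DIR = path.join(__dirname, 'gallery'); // hypothetical folder

    // Handling GET /gallery/:name here, instead of serving the folder with a
    // static middleware, lets us downscale the image before it is sent.
    app.get('/gallery/:name', async (req, res) => {
      try {
        const file = path.join(GALLERY_DIR, path.basename(req.params.name));
        const resized = await sharp(file)
          .resize({ width: 1024, withoutEnlargement: true }) // cap the width
          .jpeg({ quality: 70 })                             // recompress
          .toBuffer();
        res.type('jpeg').send(resized);
      } catch (err) {
        res.sendStatus(404); // missing or unreadable file
      }
    });

    app.listen(3000);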
Sorry, it might be a very novice problem, but I am new to Node and web apps and have been stuck on this for a couple of days.
I have been working with an API called "Face++" that requires the user to upload images to detect faces. So basically users need to upload images to my web app's backend, and my backend makes an API request with each image. I managed to upload the files to my Node backend using the tutorial below, but now I am struggling with how to use those image files. I really don't know how to access them. I thought just writing the file path/file name would work, but it did not. I am really new to web apps.
I used the tutorial here to handle the uploads on the back end: https://coligo.io/building-ajax-file-uploader-with-node/
thanks
You can also use the Face++ REST API node client
https://www.npmjs.com/package/faceppsdk
As per the documentation, it requires a live URL on the web, so you would have to upload your files to a remote location first (for example, to an Amazon S3 bucket).
You can also check the sample code in the documentation, which shows how to upload files directly to Face++.
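For reference, a minimal sketch of posting an already-uploaded file straight to the Face++ v3 detect endpoint; it assumes the form-data and axios packages, credentials in environment variables, and that the upload handler saved the file to disk:

    const fs = require('fs');
    const FormData = require('form-data'); // assumed dependency
    const axios = require('axios');        // assumed dependency

    // filePath is wherever your upload handler stored the image,
    // e.g. req.file.path if you use multer.
    async function detectFaces(filePath) {
      const form = new FormData();
      form.append('api_key', process.env.FACEPP_KEY);
      form.append('api_secret', process.env.FACEPP_SECRET);
      form.append('image_file', fs.createReadStream(filePath));

      const res = await axios.post(
        'https://api-us.faceplusplus.com/facepp/v3/detect',
        form,
        { headers: form.getHeaders() }
      );
      return res.data; // JSON response with a `faces` array on success
    }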
In my Lotus Notes web application, I have file upload functionality. I want to validate the attachment's file size before uploading, which I did through WebQuerySave. My problem is that whenever the attached file's size exceeds the limit configured in the server document, it throws the server error page "HTTP: 500 Invalid POST Request Exception".
I tried some methods to resolve this, but they’re not working:
In domcfg.nsf, I mapped the target form called "CustomGeneralErrorForm".
I created "$$ReturnGeneralError" from to show error page.
In Notes.ini, I added "HTTPMultiErrorPage=/error.html"
How can I resolve this issue?
I suppose there's no way. I've tried several times to catch that error, but I think the only way is to test the file size with JavaScript. Obviously this works only with HTML5 browsers, as you can see in this post:
Using jQuery, Restricting File Size Before Uploading
So you have to write code that detects browser features: use the JavaScript check with HTML5 browsers and find alternative ways for older browsers.
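A minimal sketch of that HTML5 check, assuming a file input with a hypothetical id of "attachment" and a 10 MB limit:

    // Runs when the user picks a file; older browsers without the File API
    // never enter the size check and fall through to the server-side limit.
    document.getElementById('attachment').addEventListener('change', function () {
      var maxBytes = 10 * 1024 * 1024; // hypothetical 10 MB limit
      var file = this.files && this.files[0];
      if (file && file.size > maxBytes) {
        alert('File is too large (' + Math.round(file.size / 1048576) + ' MB). Limit is 10 MB.');
        this.value = ''; // clear the selection
      }
    });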
For older browsers, for example, you can use a Flash plugin and post to server-side code, depending on your backend.
Uploadify (http://www.uploadify.com/) is one good option, but do an internet search and choose the one that fits you best.
This way you can stop users' oversized POSTs, but if you need to accept large files (over the 10 MB default) you must set up a secondary Internet Site server document with a greater POST size limit.