When I convert an XML document to Fluid DDSes using SharedMap and SharedObjectSequence and set it in the Fluid container, I get error 413 (Payload Too Large). Error response:
{"message":"request entity too large","expected":109452,"length":109452,"limit":102400,"type":"entity.too.large"}
I am trying this in https://github.com/microsoft/FluidHelloWorld, using Tinylicious and localhost. It works fine for small XML files. I did a quick search through the code and didn't find where this limit is enforced.
Is it possible to increase this limit?
You're running up against a max request size limit in the web server; this isn't a Fluid Framework issue.
If you want to enable Tinylicious to handle larger request sizes, you would clone the Fluid Framework repository and modify the configuration of the Express service.
So go to tinylicious/src/app.ts and raise the body size limit on the Express JSON middleware. (Note that express.bodyParser() was removed in Express 4; the JSON parser is now exposed as express.json().) To use your modified Tinylicious, you'd compile and run the service locally.
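The change would look roughly like this (a sketch; the surrounding code in app.ts may differ between Tinylicious versions):

    // tinylicious/src/app.ts (sketch; surrounding code omitted)
    import express from "express";

    const app = express();

    // Raise the JSON body limit from the ~100kb default to 50mb.
    // express.bodyParser() was removed in Express 4; use express.json() instead.
    app.use(express.json({ limit: "50mb" }));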
Alternatively, you can break up the XML and set parts of the XML into the keys of the DDSs.
Without knowing your scenario, I'd lean towards breaking up the XML, because it'll lead to lower-latency updates. For instance, as you read in the XML, set the tag pairs in the SharedObjectSequence immediately, then keep reading and continue with the child objects.
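A rough sketch of the chunking idea, assuming an already-created SharedMap (here called xmlMap; the key scheme and chunk size are made up for illustration):

    // Sketch: store an XML string in a SharedMap in pieces, so no single
    // op exceeds the server's request size limit.
    const CHUNK_SIZE = 50 * 1024; // stay well under the ~100kb limit

    function setXmlInChunks(xmlMap, xmlString) {
      const chunkCount = Math.ceil(xmlString.length / CHUNK_SIZE);
      xmlMap.set("chunkCount", chunkCount);
      for (let i = 0; i < chunkCount; i++) {
        // Each set() is sent as its own small op rather than one huge payload.
        xmlMap.set("chunk-" + i, xmlString.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE));
      }
    }

    function getXmlFromChunks(xmlMap) {
      let xml = "";
      for (let i = 0; i < xmlMap.get("chunkCount"); i++) {
        xml += xmlMap.get("chunk-" + i);
      }
      return xml;
    }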
You may want to open an issue on the repo, because exceeding the max request body size should produce a clearer error. The service should also accept payloads up to the max Kafka message size, which is the true limiting factor.
Related
The platform I'm working on involves a client (ReactJS) and a server (Node.js, Express).
The major feature of this platform involves users uploading images constantly.
Everything has been set up successfully using multer to receive images as form data on my API server, and now it's time to create an "image management system".
The main problem I'll be tackling is the unpredictable file sizes coming from users. The files are images, and their sizes depend on the user's OS and workflow, e.g. users taking pictures or taking screenshots.
The first solution is to determine a max file size and transport the image to the API server using a compression algorithm. When the backend receives it successfully, the image is uploaded to a CDN (Cloudinary) and the link is stored in the database along with the other related records.
The second, which I'm strongly leaning towards, is shifting this "upload to CDN" step to the client side: the client connects to Cloudinary directly, then grabs the secure link and inserts it into the JSON that is sent to the server.
This eliminates the problem of grappling with file sizes, which is progress, but I'd like to know if it is good practice.
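For reference, the direct-to-Cloudinary idea in the browser might look roughly like this (a sketch using Cloudinary's documented unsigned upload endpoint; YOUR_CLOUD_NAME and YOUR_UNSIGNED_PRESET are placeholders for your own account settings):

    // Sketch: upload a file straight from the browser to Cloudinary,
    // then send only the returned secure_url to your own JSON API.
    async function uploadToCloudinary(file) {
      const body = new FormData();
      body.append("file", file);
      body.append("upload_preset", "YOUR_UNSIGNED_PRESET"); // placeholder

      const res = await fetch(
        "https://api.cloudinary.com/v1_1/YOUR_CLOUD_NAME/image/upload",
        { method: "POST", body }
      );
      const data = await res.json();
      return data.secure_url; // insert this link into the JSON sent to your server
    }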
Restricting the file size is possible when using the Cloudinary Upload Widget for client-side uploads.
You can include the maxFileSize parameter when calling the widget, setting its value to 500000 (the value should be provided in bytes).
https://cloudinary.com/documentation/upload_widget
If the client tries to upload a larger file, they will get back an error stating that the max file size was exceeded, and the upload will fail.
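A minimal widget invocation might look like this (cloudName and uploadPreset values are placeholders for your own account settings):

    // Sketch: open the Cloudinary Upload Widget with a 500kb file-size cap.
    const widget = cloudinary.createUploadWidget(
      {
        cloudName: "YOUR_CLOUD_NAME",         // placeholder
        uploadPreset: "YOUR_UNSIGNED_PRESET", // placeholder
        maxFileSize: 500000,                  // in bytes; larger files are rejected
      },
      (error, result) => {
        if (!error && result && result.event === "success") {
          console.log("Uploaded:", result.info.secure_url);
        }
      }
    );
    widget.open();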
Alternatively, you can choose to limit the dimensions of the image: if they are exceeded, then instead of the upload failing with an error, the image will automatically be scaled down to the given dimensions while retaining its aspect ratio, and the upload request will succeed.
However, this method doesn't guarantee that the uploaded file will be below a certain desired size (e.g. 500kb), as each image is different: one image scaled down to the given dimensions may come out smaller than your threshold, while another may slightly exceed it.
This can be achieved using the limit cropping method as part of an incoming transformation.
https://cloudinary.com/documentation/image_transformations#limit
https://cloudinary.com/documentation/upload_images#incoming_transformations
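With the Node SDK, such an incoming transformation could look roughly like this (the 1000x1000 bounds are just an example):

    // Sketch: scale any incoming image down to fit within 1000x1000,
    // preserving aspect ratio, before Cloudinary stores it.
    const cloudinary = require("cloudinary").v2;

    cloudinary.uploader
      .upload("local-image.jpg", {
        transformation: [{ width: 1000, height: 1000, crop: "limit" }],
      })
      .then((result) => console.log(result.secure_url))
      .catch(console.error);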
I am working on a Node web application and need a form that lets users provide a URL pointing to a (potentially 100mb) CSV or XML file. Submitting the form triggers the server (Express) to download the file using fetch, process it, and save it to my Postgres database.
The problem I am having is the size of the file. Responses from the API take minutes to return, and I'm worried this solution is not optimal for a production application. I've also seen that many servers (including cloud-based ones) impose response size limits, which would obviously be exceeded here.
Is there a better way to do this than simply via a fetch request?
Thanks
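For context, a streaming approach would avoid buffering the whole file in memory. A rough sketch, assuming node-fetch and the csv-parse package (both my choices here, not something from the question), with the Postgres batching elided:

    // Sketch: stream the remote CSV row by row instead of buffering it all.
    const fetch = require("node-fetch"); // v2, whose res.body is a Node stream
    const { parse } = require("csv-parse");

    async function importCsv(url) {
      const res = await fetch(url);
      const parser = res.body.pipe(parse({ columns: true }));
      for await (const row of parser) {
        // Insert rows into Postgres in batches here rather than
        // holding the entire file in memory.
        console.log(row);
      }
    }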
I am working on a web API in Node.js and Express, and I want to enable users to upload images.
My API uses JSON requests and responses, but when it comes to uploading images I don't know which option is better. I can think of two ideas:
encode images as base64 strings and send them as JSON (like {"image": "base64_encoded_image"})
use a multipart/form-data request and handle it with the help of packages like multer
I've been reading some articles and other questions related to my issue, and I'm still struggling to choose one approach over the other. Encoding an image and sending it as JSON increases the size of the data by about 25% (that's what I've read), but using multipart seems weird to me, as all the other endpoints on my API use JSON.
The multipart/form-data approach has certain advantages over Base64 encoding.
The first and foremost disadvantage of the Base64 approach is the roughly 33% increase in the size of the data (every 3 bytes are encoded as 4 characters). This may not be significant for small files, but it will definitely matter if you are sending large files and storing them, since it increases your cost and data consumption. Also, packages like multer provide functionality such as checking the file type (jpg, png, etc.) and setting size limits on files, and they are quite easy to implement, with a lot of tutorials and guides available online.
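For instance, the size limit and type check mentioned above might look roughly like this with multer (the 5mb cap and the image/ mimetype check are arbitrary examples):

    // Sketch: multer with a size limit and a basic image-type filter.
    const multer = require("multer");

    const upload = multer({
      storage: multer.memoryStorage(),
      limits: { fileSize: 5 * 1024 * 1024 }, // reject files over 5mb
      fileFilter: (req, file, cb) => {
        // Accept only image mimetypes (image/jpeg, image/png, etc.).
        cb(null, file.mimetype.startsWith("image/"));
      },
    });

    // Usage: app.post("/images", upload.single("image"), handler);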
Furthermore, converting an image to a Base64 string adds computation overhead on the user's machine, especially if the file is large.
I would advise you to use multipart/form-data approach for your case.
In my Lotus Notes web application, I have file upload functionality, and I want to validate the attachment file size before uploading, which I did through WebQuerySave. My problem is that whenever the attached file size exceeds the limit configured in the server document, the server error page is thrown, e.g. “HTTP: 500 Invalid POST Request Exception”.
I tried some methods to resolve this, but they’re not working:
In domcfg.nsf, I mapped the target form called "CustomGeneralErrorForm".
I created a "$$ReturnGeneralError" form to show the error page.
In Notes.ini, I added "HTTPMultiErrorPage=/error.html"
How can I resolve this issue?
I suppose there's no way. I've tried several times to catch that error, but I think the only way is to test the file size with JavaScript. Obviously, that works only with HTML5 browsers, as you can find in this post:
Using jQuery, Restricting File Size Before Uploading
So... you have to write code to detect browser features: use JavaScript for HTML5 browsers, and find alternative ways for old browsers.
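The HTML5 check itself is small; a sketch (the 10MB threshold is just an example):

    // Sketch: validate the file size client-side before submitting the form.
    // Works only in browsers that support the HTML5 File API.
    document.querySelector('input[type="file"]').addEventListener("change", (e) => {
      const file = e.target.files[0];
      if (file && file.size > 10 * 1024 * 1024) {
        alert("File exceeds the 10MB limit.");
        e.target.value = ""; // clear the selection
      }
    });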
For example, you can use a Flash plugin and post to server-side code, depending on your backend.
Uploadify (http://www.uploadify.com/) is a very good choice that worked for me, but do an internet search and choose the best one for you.
This way you can stop users' large posts, but if you need to upload large files (>10MB by default) you must set up a secondary internet site server document with a greater POST size limit.
I have an image upload view on my client (Ember.js) that sends a resized image to a Node.js REST API.
It works well, but it is easy for an expert user to force the upload of a non-resized image.
I would like to keep the resize process on the client, because this lets users select heavyweight images that are resized locally and uploaded only after that, when they are lightweight.
If someone else uses something like this, I'm interested in how to make it as safe as possible.
A rule of thumb when developing web applications: never, ever trust any data coming from the client side; always check it on your server side!
Use authentication; this ensures that users can only upload data to their own account and can't fiddle with other people's files.
Add a special message exchange between your server and client. A simple example would be:
i. send a POST API request first (containing the image information and targeted compressed size) to your server, indicating that your client is starting to compress the picture
ii. when uploading, add metadata including the complete compressed image, and have your server check whether the uploaded image is within the accepted threshold; otherwise discard it (a rough sketch follows below)
You could enhance the security of the message passing by making it more elaborate!
This would be my simple take on security; anyone else got a better solution? :)
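To make step ii concrete, a rough sketch of the server-side check (authenticate and upload, a multer instance, are assumed middleware; the targetSize field and the 10% tolerance are invented for the example):

    // Sketch: after upload, compare the received image against the size
    // the client declared in step i, and discard it if it's over threshold.
    app.post("/images", authenticate, upload.single("image"), (req, res) => {
      const declaredSize = Number(req.body.targetSize); // from the step-i request
      if (!declaredSize || req.file.size > declaredSize * 1.1) {
        // Never trust the client: the file is larger than what was negotiated.
        return res.status(413).json({ error: "Image exceeds the agreed size" });
      }
      res.json({ ok: true });
    });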
The approaches here also work for file uploads. You can use a combination of checks:
the content-length header (i.e. req.headers['content-length'] > x), and/or
reading the stream size as it's being received by the server (i.e. req.on('data'))
If the stream data exceeds a certain size, you can respond accordingly. Check out something like Multer for file uploads, specifically the limits section. The best approach would probably be the second option.
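A bare-bones version of the second option might look like this (a sketch; the 5mb cap is arbitrary):

    // Sketch: reject a request whose body exceeds a cap, checking both the
    // content-length header and the actual bytes received on the stream.
    const MAX_BYTES = 5 * 1024 * 1024;

    app.post("/upload", (req, res) => {
      if (Number(req.headers["content-length"]) > MAX_BYTES) {
        return res.status(413).send("Payload too large");
      }
      let received = 0;
      req.on("data", (chunk) => {
        received += chunk.length;
        if (received > MAX_BYTES && !res.headersSent) {
          // The header lied or was missing; abort mid-stream.
          res.status(413).send("Payload too large");
          req.destroy();
        }
      });
      req.on("end", () => res.send("OK"));
    });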