Request Entity Too Large in Liferay (Tomcat)

I am getting the following error in Liferay (Tomcat server) while uploading documents:
Request Entity Too Large
The requested resource
/MyPortal/group/control_panel/manage
does not allow request data with POST requests, or the amount of data provided in the request exceeds the capacity limit.
My code is as follows.
FileEntry dlFile = DLAppServiceUtil.addFileEntry(
    themeDisplay.getScopeGroupId(), folderId,
    FilenameUtils.getName(uploadedFile.getFileName()), uploadedFile.getContentType(),
    uploadedFile.getFileName(), "", "", uploadedFile.getInputstream(),
    uploadedFile.getSize(), serviceContextFile);
Please help.
Thanks.

Please check the size of the document you are uploading. You can use these two properties to raise the maximum upload size:
com.liferay.portal.upload.UploadServletRequestImpl.max.size
dl.file.max.size
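As a sketch, both properties can be overridden in portal-ext.properties; the values below are illustrative (sizes are in bytes), so adjust them to your needs:

```properties
# portal-ext.properties -- illustrative values, roughly 300 MB
# Overall maximum size of an uploaded servlet request body
com.liferay.portal.upload.UploadServletRequestImpl.max.size=314572800
# Maximum size of a Document Library file
dl.file.max.size=314572800
```

A restart is required for the change to take effect. Depending on your setup, Tomcat's own connector limit (maxPostSize in server.xml) can also cap POST bodies, so it may need raising as well.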

Related

How to get the total count of records for a search endpoint with Oracle R2DBC to implement pagination

I am implementing a search endpoint for a particular condition using Oracle R2DBC to provide paged output. I need the total number of records so that the user can fetch the results accordingly. Below is the code I am using. However, it fails the performance test and throws a 503 error. Any help will be appreciated.
{
    return responseMapper.map(planogramRepository.findPlanogramsByDepartmentAndSubgroup(
            filter.getDepartment_id(), filter.getSubgroup(), pageRequest))
        .collectList()
        .zipWith(this.planogramRepository.countPlanogramsByDepartmentAndSubgroup(
            filter.getDepartment_id(), filter.getSubgroup()))
        .flatMap(tuple2 -> Mono.just(SearchResponse.builder()
            .totalCount(tuple2.getT2())
            .planograms(tuple2.getT1())
            .build()));
}
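The zipWith call above subscribes to both the page query and the count query and combines their results into one response. The same "run two async sources and combine" shape can be sketched with only the JDK's CompletableFuture (the repository calls are stubbed with dummy data here, and all names are illustrative, not the actual Reactor API):

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;

public class PagedSearch {
    // Hypothetical response holder mirroring SearchResponse in the question
    record SearchResponse(long totalCount, List<String> planograms) {}

    static CompletableFuture<List<String>> findPage() {
        // stands in for findPlanogramsByDepartmentAndSubgroup(...)
        return CompletableFuture.supplyAsync(() -> List.of("pog-1", "pog-2"));
    }

    static CompletableFuture<Long> countAll() {
        // stands in for countPlanogramsByDepartmentAndSubgroup(...)
        return CompletableFuture.supplyAsync(() -> 42L);
    }

    public static void main(String[] args) {
        // Run both queries concurrently and combine them, like Mono.zipWith
        SearchResponse resp = findPage()
            .thenCombine(countAll(), (page, total) -> new SearchResponse(total, page))
            .join();
        System.out.println(resp.totalCount() + " " + resp.planograms());
    }
}
```

If the count query is what fails the performance test, it may be worth checking that the filtered columns (department and subgroup) are indexed, since the count has to scan the whole matching set.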

Handling of etags in batch request using SAP Cloud SDK

I am trying to carry out a batch request including a create, an update and a delete (all different sales orders). As per this question here, which deals with something similar, I have done a get for the items I want to update and delete before I add them to the batch request. I am using SalesOrder.builder() to prepare the SalesOrder I want to create.
final ErpHttpDestination destination = DestinationAccessor.getDestination(DESTINATION_NAME)
.asHttp().decorate(DefaultErpHttpDestination::new);
final SalesOrderItem salesOrderItem1 = SalesOrderItem.builder().material(material)
.requestedQuantityUnit(requestedQuantityUnit).build();
final SalesOrder salesOrder1 = SalesOrder.builder().distributionChannel(distributionChannel)
.salesOrderType(salesOrderType).salesOrganization(salesOrganization)
.organizationDivision(organizationDivision).soldToParty(soldToParty)
.item(salesOrderItem1).build();
final SalesOrder orderToUpdate = new GetSingleSalesOrderCommand(orderToUpdateID, destination,
new DefaultSalesOrderService()).execute();
orderToUpdate.setSoldToParty(updateSoldToParty);
final SalesOrder orderToDelete = new GetSingleSalesOrderCommand(orderToDeleteID, destination,
new DefaultSalesOrderService()).execute();
SalesOrderServiceBatch service = new DefaultSalesOrderServiceBatch(
new DefaultSalesOrderService());
BatchResponse bRes = service.beginChangeSet().createSalesOrder(salesOrder1).updateSalesOrder(orderToUpdate)
.deleteSalesOrder(orderToDelete).endChangeSet().execute(destination);
I am then logging the BatchResponse and see I am getting a Batch Response Failure:
eTag handling not supported for http method 'POST'
I have searched for this error but can't find any resolution to it. Any ideas?
Thanks.
UPDATE: Increasing the logging to DEBUG, I can see the batch request that is being sent, and there is an If-Match header being added to the create request, which doesn't make sense as it can't match something that doesn't exist yet.
"msg":"--batch_123\r\nContent-Type: multipart/mixed;
boundary=changeset_(changeset number)\r\n\r\n--
changeset_(changeset number)\r\nContent-Type:
application/http\r\nContent-Transfer-Encoding: binary\r\n\r\nPOST
/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder HTTP/1.1\r\nContent-
Length:
193\r\nIf-Match: W/\"datetimeoffset'2020-05-
01T11%3A51%3A16.8631720Z'\"\r\nAccept:
application/json;odata=verbose\r\nContent-Type:......
Then I get the error:
Inner Error:
"msg":"batch responseFailure(com.sap.cloud.sdk.odatav2.connectivity.ODataException:
null: <?xml version=\"1.0\" encoding=\"utf-8\"?><error xmlns=\"http://schemas.microsoft.com/ado/2007/08/dataservices/metadata\">
<code>/IWFND/CM_MGW/537</code><message xml:lang=\"en\">eTag handling not supported for http method 'POST'</message><innererror>...
However, what does work is if I wrap each request in its own changeset e.g.
service
.beginChangeSet().createSalesOrder(order).endChangeSet()
.beginChangeSet().updateSalesOrder(orderToUpdate).endChangeSet()
.beginChangeSet().deleteSalesOrder(orderToDelete).endChangeSet()
.execute(destination);
Edit:
This is fixed as of version 3.25.0.
Initial Answer:
This seems to be a bug. I was able to reproduce this with a different service and the behaviour is the same: the If-Match header is incorrectly applied to the POST operation as well.
When debugging, it seems like the request is built up correctly, with the header only being present on update and delete. However, when the batch request is serialised to JSON it gets added to all requests.
So until this is fixed the workaround is isolating these operations via change sets, as you already pointed out.
It looks like eTag handling is not supported for your endpoint. You can do the following to omit the eTag headers:
orderToUpdate.setVersionIdentifier(null);
orderToDelete.setVersionIdentifier(null);
However, I'm not sure how 'POST' fits the error description, because update uses PATCH and delete uses DELETE. The only POST I would expect comes from create, but we do not add headers for entity version identifiers (eTag) in the OData create operation. If the same error still comes up, please try again without running createSalesOrder(salesOrder1).

Error uploading CSV with more than 200 entities to DialogFlow entity?

I am trying to upload CSV-formatted entities in bulk through the DialogFlow entity interface.
I am getting the following error:
Errors in 'drug' entity: The number of synonyms is greater than 200.
Please advise how to upload a large number of entities to DialogFlow.
Solved!
The way to do this is to upload it in the following format:
"item1","item1"
"item2","item2"
"item3","item3"
"item4","item4"
"item5","item5"
"item6","item6"
"item7","item7"
"item8","item8"

SuiteFlow||SuiteScript: Send email based on file size

We have a SuiteFlow that sends an email with an attachment. However, the email doesn't send if the attachment is over 5 MB in size. I want to add a condition to the action that fires when the document size is < 5 MB, and then add a separate action to send the email without the attachment if the file size is >= 5 MB. Is this possible, and if not, what workaround is there?
SuiteScript (JavaScript) is certainly an option, but I would prefer just modifying the existing SuiteFlow.
In response to the comments below:
The email attachment is added to a Document field on the transaction, not in the File subtab, so I cannot find how to get at its properties (like size) in a search (idea #2 below).
Also, the code sample (idea #1) does not work because nlapiLoadFile will not load a file > 5 MB, meaning I can't do the test. I am trying to avoid writing the whole thing as a script.
So far the only solution (and I don't feel it is a good one) is to take the sending of the email, make it into a script, and do a try/catch on it. Any other ideas?
Assuming that the file you are sending as an attachment already exists in the NetSuite file cabinet, you can add a script to validate the size of the attachment:
var file = nlapiLoadFile('100'); // where '100' is the internal id of the file
var fileSize = file.getSize();   // returns the size of the file in bytes
if (fileSize >= 5 * 1024 * 1024) {
    // over the 5 MB limit: send the email without the attachment
}
For reference of using this Suitescript API:
Helpguide > SuiteCloud (Customization, Scripting, and Web Services) : SuiteScript : SuiteScript API : SuiteScript Objects : nlobjFile
I know it's not efficient, but how about doing a search to get the size of the file? Something like this:
Filters:
Internal Id = internal id of the attachment
Results:
Size
The Size column would return the file size in KB.
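One caveat when mixing the two approaches above: nlobjFile.getSize() reports bytes, while the search Size column reports KB, so the 5 MB threshold differs by a factor of 1024. A tiny sketch of the comparison in each unit (plain Java for illustration; the method names are made up, and 1 MB = 1024 * 1024 bytes is assumed):

```java
public class SizeThreshold {
    static final long LIMIT_BYTES = 5L * 1024 * 1024; // 5 MB expressed in bytes
    static final long LIMIT_KB = 5L * 1024;           // 5 MB expressed in KB

    // For a size obtained from nlobjFile.getSize(), which is in bytes
    static boolean tooLargeBytes(long sizeBytes) {
        return sizeBytes >= LIMIT_BYTES;
    }

    // For a size obtained from the saved-search Size column, which is in KB
    static boolean tooLargeKb(long sizeKb) {
        return sizeKb >= LIMIT_KB;
    }

    public static void main(String[] args) {
        System.out.println(tooLargeBytes(6_000_000L)); // 6,000,000 B > 5,242,880 B
        System.out.println(tooLargeKb(4_000L));        // 4,000 KB < 5,120 KB
    }
}
```

Whichever source the workflow condition reads from, the comparison constant has to match that source's unit.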

Set metadata in REST request to put blob in AZURE

I am able to upload a file to an Azure blob using the REST API provided by Azure.
I want to set metadata at the time I make the Put Blob request. When I set it in the header as shown here, I am unable to upload the file and get the following exception from the last line of the code below: org.apache.http.client.ClientProtocolException.
HttpPut req = new HttpPut(uri);
req.setHeader("x-ms-blob-type", blobType);
req.setHeader("x-ms-date", date);
req.setHeader("x-ms-version", storageServiceVersion);
req.setHeader("x-ms-meta-Cat", user);
req.setHeader("Authorization", authorizationHeader);
HttpEntity entity = new InputStreamEntity(is,blobLength);
req.setEntity(entity);
HttpResponse response = httpClient.execute(req);
Regarding the same, I have two questions:
Can setting different metadata avoid overwriting of the file? See my question for the same here.
If yes to the first question, how do I set metadata in the REST request to put a blob into Azure?
Please help.
So a few things are going on here.
Regarding the error you're getting: it is because you're not adding your metadata header when calculating the authorization header. Please read the Constructing the Canonicalized Headers String section here: http://msdn.microsoft.com/en-us/library/windowsazure/dd179428.aspx.
Based on this, you would need to change the following line of code (from your blog post)
String canonicalizedHeaders = "x-ms-blob-type:"+blobType+"\nx-ms-date:"+date+"\nx-ms-version:"+storageServiceVersion;
to
String canonicalizedHeaders = "x-ms-blob-type:"+blobType+"\nx-ms-date:"+date+"\nx-ms-meta-cat:"+user+"\nx-ms-version:"+storageServiceVersion;
(Note: I have just made these changes in Notepad, so they may not work. Please go to the link I mentioned above for correctly creating the canonicalized headers string.)
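Per the linked documentation, the canonicalized headers string takes every x-ms-* header, lowercases its name, sorts the headers lexicographically, and joins them as name:value with newlines. Rather than concatenating by hand, it can help to build the string from the same map of headers you set on the request. A simplified sketch in plain Java (the header values are placeholders, and only the string construction is shown, not the HMAC signing; see the linked docs for the full rules on multi-value headers and whitespace):

```java
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

public class CanonicalizedHeaders {
    // Keep only x-ms-* headers, lowercase the names, sort lexicographically
    // (TreeMap sorts by key), and join as "name:value" separated by \n
    static String build(Map<String, String> headers) {
        TreeMap<String, String> sorted = new TreeMap<>();
        headers.forEach((name, value) -> {
            String lower = name.toLowerCase();
            if (lower.startsWith("x-ms-")) {
                sorted.put(lower, value.trim());
            }
        });
        return sorted.entrySet().stream()
            .map(e -> e.getKey() + ":" + e.getValue())
            .collect(Collectors.joining("\n"));
    }

    public static void main(String[] args) {
        // Placeholder values; in the real request these come from your variables
        Map<String, String> headers = Map.of(
            "x-ms-blob-type", "BlockBlob",
            "x-ms-date", "Wed, 01 Jan 2020 00:00:00 GMT",
            "x-ms-meta-Cat", "user1",
            "x-ms-version", "2019-12-12",
            "Content-Type", "text/plain"); // non x-ms header, excluded
        System.out.println(build(headers));
    }
}
```

Built this way, adding a new x-ms-meta-* header to the request automatically flows into the string used for the signature, which avoids exactly the mismatch that caused the error above.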
can setting different metadata, avoid overwriting of file?
Not sure what you mean by this. You can update the metadata of a blob by performing the Set Blob Metadata operation on a blob.