How to transfer image from server to client with node http header size restrictions - node.js

Transferring an image (base64 encoded, created with MapGuide Server) to the client. I am able to output the image to the console and verify it is correct. I am using Node with npm and Vite as the development web server. When I try to set imgLegend.src = data; I get the error "431 (Request Header Fields Too Large)". I believe the Node default max-http-header-size is causing the problem. I have attempted to set --max-http-header-size=80000 with no luck. I am starting my dev server in the package.json file like this: "start": "vite --host 0.0.0.0",
Does anyone know of a way around this or a better way to transfer the image from server to client?
Here is the relevant code.
Client side:
//add legend
const mapVpHeight = document.getElementById('map').clientHeight;
var url = mgServer + "/Cid_Map/LayerManager.aspx/GetLegendImage";
var values = JSON.stringify({ sessionId: sessionId, mgMapName: mapName, mapVpHeight: mapVpHeight });
var imgLegend = new Image();
//console.log(values);
$.ajax({
    url: url,
    type: "POST",
    contentType: "application/json; charset=utf-8",
    data: values,
    dataType: 'html',
    success: function (data) {
        console.log(data);
        imgLegend.src = data; // node.js won't allow an http header as large as this image, about 18kb
    },
    error: function (xhr, textStatus, error) {
        console.log(textStatus);
    }
});
Server Side:
[WebMethod]
public static string GetLegendImage(string sessionId, string mgMapName, int mapVpHeight)
{
    string tempDir = System.Configuration.ConfigurationManager.AppSettings["tempDir"];
    string legFilePath = tempDir + sessionId + "Legend.png";
    string configPath = @"C:\Program Files\OSGeo\MapGuide\Web\www\webconfig.ini";
    MapGuideApi.MgInitializeWebTier(configPath);
    MgUserInformation userInfo = new MgUserInformation(sessionId);
    MgSiteConnection siteConnection = new MgSiteConnection();
    siteConnection.Open(userInfo);
    MgMap map = new MgMap(siteConnection);
    MgResourceService resourceService = (MgResourceService)siteConnection.CreateService(MgServiceType.ResourceService);
    map.Open(resourceService, mgMapName);
    MgColor color = new MgColor(226, 226, 226);
    MgRenderingService renderingService = (MgRenderingService)siteConnection.CreateService(MgServiceType.RenderingService);
    MgByteReader byteReader = renderingService.RenderMapLegend(map, 200, mapVpHeight, color, "PNG");
    MgByteSink byteSink = new MgByteSink(byteReader);
    byteSink.ToFile(legFilePath);
    //try this
    //byte[] buffer = new byte[byteReader.GetLength()]; //something doesn't work here, byteReader doesn't give the complete image
    //byteReader.Read(buffer, buffer.Length);
    //loading the image file just created and converting it to base64 gives the correct image
    string legendImageURL = "";
    using (Stream fs = File.OpenRead(legFilePath))
    {
        BinaryReader br = new System.IO.BinaryReader(fs);
        byte[] bytes = br.ReadBytes((int)fs.Length);
        string strLegendImage = Convert.ToBase64String(bytes, 0, bytes.Length);
        legendImageURL = "data:image/png;base64," + strLegendImage;
    }
    byteReader.Dispose();
    byteSink.Dispose();
    return legendImageURL;
    //return buffer;
}

The 431 status code complains about the header length of your request. Trace the request in your browser's dev tools network tab and study the header fields; in some special cases, if your cookies get set too often with unique key-value pairs, this could be the problem.
Maybe you can copy and share the request/response from your browser's network tab to provide some detailed information, especially the request/response of the endpoint, and look through the cookie/session storage; maybe you'll find something suspicious.
Good luck :)
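One more thing worth checking, based on the [WebMethod] shown in the question: ASP.NET page methods wrap their return value in a JSON envelope of the form {"d": "..."}. With dataType: 'html', jQuery hands that raw JSON string to the success callback, so imgLegend.src would receive {"d":"data:image/png;base64,..."} rather than a valid data: URL. The browser then resolves it as a relative URL and sends the whole base64 blob to the Vite dev server as a request path, which would explain the 431. A minimal sketch of the client side under that assumption (a valid data: URL assigned to img.src never touches the server at all):
$.ajax({
    url: url,
    type: "POST",
    contentType: "application/json; charset=utf-8",
    data: values,
    dataType: 'json', // let jQuery parse the ASP.NET JSON envelope
    success: function (data) {
        // data.d is the WebMethod's return value, i.e. the data: URL itself
        imgLegend.src = data.d;
    },
    error: function (xhr, textStatus, error) {
        console.log(textStatus);
    }
});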

Related

How to configure the user_token of Damn Vulnerable Web Application within the CSRF field for script-based authentication using ZAP?

I had been following the documentation on Script Based Authentication for Damn Vulnerable Web Application using ZAP. I have navigated to http://localhost/dvwa/login.php through Manual Explore, which opens the DVWA application on my localhost and adds the URL to the Default Context. I've also created the dvwa script with the configuration from the documentation and modified it accordingly.
Now when I try Configure Context Authentication, the dvwa script does get loaded, but the CSRF Field doesn't show up. Additionally, POST Data doesn't show up either, though Extra POST Data is shown.
Am I missing something in the steps? Can someone help me out?
The modified script in the Script Based Authentication section of the documentation for Damn Vulnerable Web Application using ZAP seems incomplete.
The complete script is available at Setting up ZAP to Test Damn Vulnerable Web App (DVWA) and is as follows:
function authenticate(helper, paramsValues, credentials) {
    var loginUrl = paramsValues.get("Login URL");
    var csrfTokenName = paramsValues.get("CSRF Field");
    var csrfTokenValue = extractInputFieldValue(getPageContent(helper, loginUrl), csrfTokenName);
    var postData = paramsValues.get("POST Data");
    postData = postData.replace('{%username%}', encodeURIComponent(credentials.getParam("Username")));
    postData = postData.replace('{%password%}', encodeURIComponent(credentials.getParam("Password")));
    postData = postData.replace('{%' + csrfTokenName + '%}', encodeURIComponent(csrfTokenValue));
    var msg = sendAndReceive(helper, loginUrl, postData);
    return msg;
}

function getRequiredParamsNames() {
    return [ "Login URL", "CSRF Field", "POST Data" ];
}

function getOptionalParamsNames() {
    return [];
}

function getCredentialsParamsNames() {
    return [ "Username", "Password" ];
}

function getPageContent(helper, url) {
    var msg = sendAndReceive(helper, url);
    return msg.getResponseBody().toString();
}

function sendAndReceive(helper, url, postData) {
    var msg = helper.prepareMessage();
    var method = "GET";
    if (postData) {
        method = "POST";
        msg.setRequestBody(postData);
    }
    var requestUri = new org.apache.commons.httpclient.URI(url, true);
    var requestHeader = new org.parosproxy.paros.network.HttpRequestHeader(method, requestUri, "HTTP/1.0");
    msg.setRequestHeader(requestHeader);
    helper.sendAndReceive(msg);
    return msg;
}

function extractInputFieldValue(page, fieldName) {
    // Rhino:
    var src = new net.htmlparser.jericho.Source(page);
    // Nashorn:
    // var Source = Java.type("net.htmlparser.jericho.Source");
    // var src = new Source(page);
    var it = src.getAllElements('input').iterator();
    while (it.hasNext()) {
        var element = it.next();
        if (element.getAttributeValue('name') == fieldName) {
            return element.getAttributeValue('value');
        }
    }
    return '';
}
Using this script, the CSRF Field and POST Data fields show up just fine.
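For reference, with this script loaded, the context parameter values for a default DVWA install would look something like this (an assumption based on DVWA's stock login form, whose hidden anti-CSRF input is named user_token):
Login URL: http://localhost/dvwa/login.php
CSRF Field: user_token
POST Data: username={%username%}&password={%password%}&Login=Login&user_token={%user_token%}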

Export to Excel from a byte[] in Angular

I have this response JSON from the backend (Spring Boot):
private byte[] archivoExcel;
private String extension;
private String mime;
What I need in Angular is to get this byte[] and export it to Excel. For this, my response class in Angular is:
export class RespuestaExportar {
archivoExcel: ArrayBuffer;
extension: string;
mime: string;
}
and in my component.ts file I have:
this.reversionesService.getReversionesExportarSeguimiento(this.solicitud).subscribe(res => {
    this.respuestaExportar = res;
    let file = new Blob([this.respuestaExportar.archivoExcel], { type: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet" });
    var fileURL = URL.createObjectURL(file);
    window.open(fileURL);
});
When I click the 'export' button, the request completes correctly and it downloads the Excel file, but the file is damaged. Do I need one more step to solve this? Or is there another alternative, since I need to get this byte[] from the backend?
I think you can guide yourself with this service that I implemented. From my backend (Spring Boot) I send a response object as JSON:
ObjectResponse response = new ObjectResponse();
ByteArrayInputStream in = getExcel();
byte[] array = new byte[in.available()];
in.read(array);
httpStatus = HttpStatus.OK;
response.setResultado(array);
return new ResponseEntity<>(response, httpStatus);
I get a JSON response with that structure; when it is serialized, the byte[] field arrives as base64, that is, already transformed for the client that receives it.
So in my frontend (Angular) I proceed to transform it into a Blob so I can work with the file.
this.subcriber = this.reporteService.generarExcel().subscribe((b64Data: any) => {
    const byteCharacters = atob(b64Data.result);
    const byteNumbers = new Array(byteCharacters.length);
    for (let i = 0; i < byteCharacters.length; i++) {
        byteNumbers[i] = byteCharacters.charCodeAt(i);
    }
    const byteArray = new Uint8Array(byteNumbers);
    let blob = new Blob([byteArray], { type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' });
    const url = window.URL.createObjectURL(blob);
    const anchor = document.createElement('a');
    anchor.download = `file.xlsx`;
    anchor.href = url;
    anchor.click();
    messageService = {
        state: false,
        messages: null
    };
}, (err) => {
    this.subcriber.unsubscribe();
});
I hope it helps you
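The same decode-and-download flow, pulled out into a small reusable helper (a sketch; the field name resultado is an assumption taken from the Spring Boot response object above):
function base64ToBlob(b64: string, mime: string): Blob {
    const byteCharacters = atob(b64); // decode base64 into a binary string
    const bytes = new Uint8Array(byteCharacters.length);
    for (let i = 0; i < byteCharacters.length; i++) {
        bytes[i] = byteCharacters.charCodeAt(i); // copy each char code into the byte array
    }
    return new Blob([bytes], { type: mime });
}

// usage: decode the JSON field, then trigger the download
const blob = base64ToBlob(res.resultado, 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
const anchor = document.createElement('a');
anchor.href = URL.createObjectURL(blob);
anchor.download = 'file.xlsx';
anchor.click();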

Uploaded files to Azure are corrupted when using dio

I'm trying to upload a file from my phone to Azure Blob Storage as a BlockBlob with a SAS. I can get the file to upload, but it can't be opened once downloaded; the file gets corrupted somehow. I thought this was a content-type problem, but I have tried several different approaches to changing the content type. Nothing has worked so far.
My code:
FileInfo _fileInfo = await filePicker(); // get the file path and file name
// my getUploadInfo fires a call to my backend to get a SAS.
// I know for a fact that this works because my website uses this SAS to upload files perfectly fine
UploadInfo uploadInfo = await getUploadInfo(_fileInfo.fileName, _fileInfo.filePath);
final bytes = File(_fileInfo.filePath).readAsBytesSync();
try {
    final response = await myDio.put(
        uploadInfo.url,
        data: bytes,
        onSendProgress: (int sent, int total) {
            if (total != -1) {
                print((sent / total * 100).toStringAsFixed(0) + "%");
            }
        },
        options: dioPrefix.Options(headers: {
            'x-ms-blob-type': 'BlockBlob',
            'Content-Type': mime(_fileInfo.filePath),
        })
    );
} catch (e) {
    print(e);
}
This code uploads a file just fine, but I can't open the file since it becomes corrupted. At first I thought this was a Content-Type problem, so I've tried changing the content-type header to application/octet-stream and to multipart/form-data as well. That doesn't work.
I've also tried to do
dioPrefix.FormData formData = new dioPrefix.FormData.fromMap({
    'file': await MultipartFile.fromFile(
        _fileInfo.filePath,
        filename: _fileInfo.fileName,
    )
});
...
final response = await myDio.put(
    uploadInfo.url,
    data: formData, // This approach is recommended in the dio documentation
    onSendProgress:
...
but this also corrupts the file. It gets uploaded, but I can't open it.
I have been able to successfully upload a file with this code, but with this approach I cannot get any kind of response, so I have no idea whether the upload succeeded or not (also, I can't get the progress of the upload):
try {
    final data = imageFile.readAsBytesSync();
    final response = await http.put( // here, response is empty no matter what I try to print
        url,
        body: data,
        headers: {
            'x-ms-blob-type': 'BlockBlob',
            'Content-Type': mime(filePath),
        });
...
Any help would be greatly appreciated. Thanks
I tried to upload a file using dio in Dart to Azure Blob Storage, and then download and print the content of the file, with the code below.
import 'package:dio/dio.dart';
import 'dart:io';

main() async {
    var accountName = '<account name>';
    var containerName = '<container name>';
    var blobName = '<blob name>';
    var sasTokenContainerLevel = '<container level sas token copied from Azure Storage Explorer, such as `st=2019-12-31T07%3A17%3A31Z&se=2020-01-01T07%3A17%3A31Z&sp=racwdl&sv=2018-03-28&sr=c&sig=xxxxxxxxxxxxxxxxxxxxxxxxxx`>';
    var url = 'https://$accountName.blob.core.windows.net/$containerName/$blobName?$sasTokenContainerLevel';
    var data = File(blobName).readAsBytesSync();
    var dio = Dio();
    try {
        final response = await dio.put(
            url,
            data: data,
            onSendProgress: (int sent, int total) {
                if (total != -1) {
                    print((sent / total * 100).toStringAsFixed(0) + "%");
                }
            },
            options: Options(headers: {
                'x-ms-blob-type': 'BlockBlob',
                'Content-Type': 'text/plain',
            })
        );
        print(response.data);
    } catch (e) {
        print(e);
    }
    Response response = await dio.get(url);
    print(response.data);
}
Then, I ran it and inspected the result.
The content of the uploaded blob was the JSON string encoded from the Uint8List bytes returned by the function readAsBytesSync.
I researched the description and the source code of dio, and found that dio's default transformer is only suitable for sending a JSON-format request body, not raw content as the request body.
Fig 1. The default transformer applies to the POST method
Fig 2. https://github.com/flutterchina/dio/blob/master/dio/lib/src/transformer.dart
So the fix is to write a custom transformer class PutTransformerForRawData that overrides the function transformRequest, to use in place of the default one, as in the code below.
import 'dart:typed_data';
import 'package:dio/dio.dart'; // for DefaultTransformer and RequestOptions

class PutTransformerForRawData extends DefaultTransformer {
    @override
    Future<String> transformRequest(RequestOptions options) async {
        if (options.data is Uint8List) {
            // pass the raw bytes through as a string of char codes instead of JSON-encoding them
            return new String.fromCharCodes(options.data);
        } else if (options.data is String) {
            return options.data;
        }
        // fall back to the default (JSON) behavior for any other payload
        return super.transformRequest(options);
    }
}
And replace the default transformer via the code below.
var dio = Dio();
dio.transformer = PutTransformerForRawData();
Then, you can get the data via the code below.
var data = File(blobName).readAsBytesSync();
Or
var data = File(blobName).readAsStringSync();
Note: the custom transformer PutTransformerForRawData is only for uploading, so please remove the download-and-print code (Response response = await dio.get(url); print(response.data);). The default transformer seems to expect the response body to be JSON; I got the exception below when the uploaded file was my sample code.
Unhandled exception:
DioError [DioErrorType.DEFAULT]: FormatException: Unexpected character (at character 1)
import 'dart:typed_data';

Use nlapiRequestURL to make a request to a Service

How do you use nlapiRequestURL to make a request to a service? My attempt below is failing with the error UNEXPECTED_ERROR (output from NetSuite's script execution log).
My service is set to run without login and works correctly when I access it directly through a browser using its URL. It's just the request through nlapiRequestURL that's failing.
Any idea what could be going wrong?
// This code executes in Account.Model.js (register function)
// I am using my own netsuite user credential here
var cred = {
    email: "MY_NETSUITE_EMAIL"
    , account: "EXXXXX" // My account id
    , role: "3" // Administrator
    , password: "MY_NETSUITE_PASSWORD"
};
var headers = {
    "User-Agent-x": "SuiteScript-Call",
    "Authorization": "NLAuth nlauth_account=" + cred.account + ", nlauth_email=" + cred.email +
        ", nlauth_signature= " + cred.password + ", nlauth_role=" + cred.role,
    "Content-Type": "application/json"
};
var payload = {
    type: 'is_email_valid'
    , email: 'spt015@foo.com'
};
// A raw request to the service works fine:
// http://mywebsite.com/services/foo.ss?type=is_email_valid&email=spt015@foo.com
// Error occurs on next line
var response = nlapiRequestURL(url, payload, headers);
You are attempting to call a non-NetSuite URL with NetSuite authentication headers. You do not need those unless, for some reason of your own, you have implemented NS-style authorization on your service.
nlapiRequestURL does not automatically format a payload into a query string. If your service takes a posted JSON body, then you need to call JSON.stringify(payload), e.g.
var response = nlapiRequestURL(url, JSON.stringify(payload), headers);
If your service needs a query string like in your example, then you need to construct a query string and append it to your service URL, e.g.
var qs = '';
for (var k in payload) qs += k + '=' + encodeURIComponent(payload[k]) + '&';
var response = nlapiRequestURL(url + '?' + qs.slice(0, -1), null, headers);
I would suggest changing your nlapiRequestURL call to a GET instead of a POST, and adding the parameters to the URL instead. Your function call will look like this:
nlapiRequestURL(url, null, headers, "GET")
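Putting that together with the payload from the question (a sketch, assuming url, payload, and headers as in the question):
var qs = [];
for (var k in payload) {
    qs.push(k + '=' + encodeURIComponent(payload[k])); // build URL-encoded key=value pairs
}
var response = nlapiRequestURL(url + '?' + qs.join('&'), null, headers, "GET");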

Using HttpClient to upload files to ServiceStack server

I can't use the ServiceStack client libraries, so I've chosen to use the HttpClient PCL library instead. I can do all my REST calls (and other JSON calls) without a problem, but I'm now stuck on uploading files.
A snippet of what I am trying to do:
var message = new HttpRequestMessage(restRequest.Method, restRequest.GetResourceUri(BaseUrl));
var content = new MultipartFormDataContent();
foreach (var file in files)
{
    byte[] data;
    bool success = CxFileStorage.TryReadBinaryFile(file, out data);
    if (success)
    {
        var byteContent = new ByteArrayContent(data);
        byteContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
        {
            FileName = System.IO.Path.GetFileName(file),
        };
        content.Add(byteContent);
    }
}
message.Content = content;
The problem now is that I get a null reference exception (status 500) when posting; it doesn't get into the service. I see the call in the request filter, but that's it.
So I'm wondering what I'm doing wrong and how I can pinpoint the problem. How can I catch the correct error at the ServiceStack layer?
