When I request images from the given URL using the Node request library, the images are not loaded completely. After storing the loaded image it looks like https://ibb.co/i5xVAR
However, the request finishes without an error and has status code 200. It seems to me that the SSL connection gets closed prematurely. Other tools such as a browser or curl transfer the image completely.
const request = require('request');

const r1 = request({
    url: 'https://open.hpi.de/files/f1d16619-9813-4d59-96b3-d84908929b23',
    encoding: 'binary'
}, (err, response, body) => {
    if (err) {
        console.log(err);
        return;
    }
    // complete file should be loaded
    // content and body length should match
    // read ECONNRESET should not be thrown
    console.log('body length', body.length);
    console.log('response content length', response.headers['content-length']);
});
The open.hpi.de host is closing the connection prematurely. You can add a Connection: keep-alive header to the request and the connection will remain open until the transfer is actually complete:
const request = require('request');

const r1 = request({
    url: 'https://open.hpi.de/files/f1d16619-9813-4d59-96b3-d84908929b23',
    encoding: 'binary',
    headers: {
        "Connection": "keep-alive"
    }
}, (err, response, body) => {
    // do the things
});
I'm using the request library to send a binary (PDF) file in the body of a request using HTTP POST (NOTE: this API does not accept multi-part forms). However, I have only been able to get it to work using fs.readFileSync(). For some reason, when I try to use fs.createReadStream() the PDF file is still sent, but it is empty, and the request never finishes (I never get a response back from the server).
Here is my working version using fs.readFileSync():
const request = require('request');
const fs = require('fs');

const filename = 'test.pdf';

request({
    url: 'http://localhost:8083/api/v1/endpoint',
    method: 'POST',
    headers: {
        'Content-Type': 'application/octet-stream',
        'Accept': 'application/vnd.api+json',
        'Content-Disposition': `file; filename="${filename}"`
    },
    encoding: null,
    body: fs.readFileSync(filename)
}, (error, response, body) => {
    if (error) {
        console.log('error:', error);
    } else {
        console.log(JSON.parse(response.body.toString()));
    }
});
If I try to replace the body with the below, it doesn't work:
body: fs.createReadStream(filename)
I have also tried piping the file stream into the request, as shown in the request library docs, but I get the same result:
fs.createReadStream(filename).pipe(request({...}))
I've tried to monitor the stream by doing the following:
var upload = fs.createReadStream('test.pdf');
upload.pipe(req);
var upload_progress = 0;
upload.on("data", function (chunk) {
upload_progress += chunk.length
console.log(new Date(), upload_progress);
})
upload.on("end", function (res) {
console.log('Finished');
req.end();
})
I see progress for the stream and then 'Finished', but still no response is returned from the API.
I'd prefer to use a read stream because it handles larger files better, but I am clueless as to what is going wrong. I am also making sure I'm not altering the file with any special encoding.
Is there some way to get some kind of output to see what process is taking forever?
UPDATE:
I decided to test with a simple 1 KB .txt file. I found that the upload is still empty when using fs.createReadStream(); however, this time I got a response back from the server. The test PDF I'm working with is 363 KB, which isn't outrageous in size, but still... weren't streams made for large files anyway? Using fs.readFileSync() also worked fine for the text file.
I'm beginning to wonder if this is a synchronous vs. asynchronous issue. I know that fs.readFileSync() is synchronous. Do I need to wait until fs.createReadStream() finishes before trying to append it to the body?
I was able to get this working by doing the following:
const request = require('request');
const fs = require('fs');

const filename = 'test.pdf';
const readStream = fs.createReadStream(filename);
let chunks = [];

readStream.on('data', (chunk) => chunks.push(chunk));

readStream.on('end', () => {
    const data = Buffer.concat(chunks);
    request({
        url: 'http://localhost:8083/api/v1/endpoint',
        method: 'POST',
        headers: {
            'Content-Type': 'application/octet-stream',
            'Accept': 'application/vnd.api+json',
            'Content-Disposition': `file; filename="${filename}"`
        },
        encoding: null,
        body: data
    }, (error, response, body) => {
        if (error) {
            console.log('error:', error);
        } else {
            console.log(JSON.parse(response.body.toString()));
        }
    });
});
I collected the chunks and concatenated them into a single Buffer before making the request.
I noticed in the documentation it said this:
The Buffer class was introduced as part of the Node.js API to enable interaction with octet streams in TCP streams, file system operations, and other contexts.
The API I'm calling requires the application/octet-stream header, so I need to use the buffer rather than streaming it directly.
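For reference, this chunk-and-concat pattern is essentially what the built-in fs.promises.readFile does, so (assuming Node 10+, where fs.promises is available) the same request can be written more compactly:

const request = require('request');
const fs = require('fs');

const filename = 'test.pdf';

// read the whole file into one Buffer asynchronously, then POST it
fs.promises.readFile(filename).then((data) => {
    request({
        url: 'http://localhost:8083/api/v1/endpoint',
        method: 'POST',
        headers: {
            'Content-Type': 'application/octet-stream',
            'Accept': 'application/vnd.api+json',
            'Content-Disposition': `file; filename="${filename}"`
        },
        encoding: null,
        body: data
    }, (error, response, body) => {
        if (error) {
            console.log('error:', error);
        } else {
            console.log(JSON.parse(body.toString()));
        }
    });
});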
I'm trying to get a normal response using the request module in Node.js, but I have a problem getting the response from amazon.com as a normal string. I don't know why, but I have this problem only with amazon.com (e.g. amazon.it and amazon.co.uk do return a normal string).
const request = require('request');

request.get(
    {
        uri: 'https://www.amazon.com',
        encoding: 'utf-8'
    },
    function (error, response, body) {
        console.log(body);
    });
The code above returns something like:
b��╝��W>�S�Uk��z�=8~r����9|r|P^?}p o╗��l���ߋ�t`ޜ^]��n!��
���U�>>�#�w-z�.��O�����Oo��������y�����g�N�/��{����_>���鳟�=s���w?�z��_W)i�
��;���2��9<�0ٷ8����<=�ϱ��ղ��3�=(�"�ԯ�; �3��=�8�2;=���28����#+,3��0"�+DZ �)�2�<�
���7�(W?�8�9\?�)#'���";�ķ���ܣ�ѽ����|�8 ��╚ ��'
The response returned by Amazon is gzipped. You have to provide the gzip option to your request.
const request = require('request');

request.get(
    {
        uri: 'https://www.amazon.com',
        gzip: true
    },
    function (error, response, body) {
        console.log(body);
    });
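To confirm the fix, you can look at the Content-Encoding response header; with gzip: true the request library adds an Accept-Encoding header to the request and transparently decompresses the response before handing it to the callback:

const request = require('request');

request.get(
    {
        uri: 'https://www.amazon.com',
        gzip: true
    },
    function (error, response, body) {
        // the raw response still reports the compression the server used
        console.log('content-encoding:', response.headers['content-encoding']);
        // but body is already decompressed, readable HTML
        console.log(body.slice(0, 200));
    });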
I have an API server and a Node.js server. When a file is requested, Node.js redirects the request to the API server, the API server sends the file as raw data to Node.js, and Node.js forwards the file to the browser.
But when I checked the network data using Wireshark, the packet received at the browser is not identical to the one sent by the API server (it works for text files, but not for images, video, PDF, DOC, etc.).
router.get('/GetCaseSupportDocument', function (req, res) {
    var MyJsonData = {
        DocId: parseInt(req.query.DocId) || 0
    };
    request({
        url: 'http://somedomain/someurl', // URL to hit
        method: 'POST',
        json: MyJsonData
    }, function (error, response, body) {
        if (error) {
            res.status(200).send('Failed');
        } else {
            res.status(200).send(body);
        }
    });
});
Can anyone tell me why it changes between Node.js and the browser?
Is there any better solution for this type of transmission?
Update, after finding the solution. This works:
router.get('/GetCaseSupportDocument', function (req, res) {
    var MyJsonData = {
        DocId: parseInt(req.query.DocId) || 0
    };
    request({
        url: Url.CaseService + 'GetCaseSupportDocument', // URL to hit
        method: 'POST',
        json: MyJsonData
    }).pipe(res);
});
There is a simple proxy using streams that you can try:
router.get('/GetCaseSupportDocument', function (req, res) {
    var MyJsonData = {
        DocId: parseInt(req.query.DocId) || 0
    };
    // updated the response
    request({
        url: 'http://somedomain/someurl', // URL to hit
        method: 'POST',
        json: MyJsonData
    }).pipe(res);
});
You can find more details on proxying in the request documentation: https://github.com/request/request
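One caveat: a bare .pipe(res) leaves the browser hanging if the upstream request itself fails, so in practice you may want to attach an error handler too (the 502 status below is an illustrative choice, not something from the original answer):

router.get('/GetCaseSupportDocument', function (req, res) {
    var MyJsonData = {
        DocId: parseInt(req.query.DocId) || 0
    };
    request({
        url: 'http://somedomain/someurl', // URL to hit
        method: 'POST',
        json: MyJsonData
    }).on('error', function (err) {
        // without this handler a failed upstream request would
        // leave the client waiting indefinitely
        res.status(502).send('Failed');
    }).pipe(res);
});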
I am trying to call an external REST API from a Node server using the request module.
let request = require('request');

var options = {
    method: 'POST',
    url: 'https://somerestURI:3000',
    qs: { msg: 'some|data|for|other|server' }
};

request(options, function (error, response, body) {
    if (error) throw new Error(error);
    console.log(body);
});
If I run the above code, the query string value is encoded to
some%7cdata%7cfor%7cother%7cserver
and as a result I am not receiving the correct response.
But if I fire the same request in Postman, I receive the expected output (I think Postman is not encoding the query string).
So what I want is to not encode the query string value.
Any help would be greatly appreciated.
As answered here, you can disable encoding in qsStringifyOptions
var options = {
    method: 'POST',
    url: 'https://somerestURI:3000',
    qs: { msg: 'some|data|for|other|server' },
    qsStringifyOptions: {
        encoding: false
    }
};
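With encoding disabled, the msg parameter should go out as-is (msg=some|data|for|other|server) rather than percent-encoded. Note that qsStringifyOptions is simply forwarded to the qs library's stringify method, so any of its other options can be set the same way.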
You can use the node-rest-client package. It allows connecting to any REST API and getting the results as JavaScript objects.
var HttpClient = require('node-rest-client').Client;
var httpClient = new HttpClient();

// GET call
httpClient.get("http://remote.site/rest/xml/method", function (data, response) {
    // parsed response body as a JS object
    console.log(data);
    // raw response
    console.log(response);
});
or for a POST call:
var args = {
    data: { test: "hello" },
    headers: { "Content-Type": "application/json" }
};

// POST call
httpClient.post("http://remote.site/rest/xml/method", args, function (data, response) {
    // parsed response body as a JS object
    console.log(data);
    // raw response
    console.log(response);
});
I have used the Winston module to create a daily log file for my offline app. I now need to be able to send or upload that file to a remote server via POST (that part already exists).
I know I need to write the file in chunks so it doesn't hog memory, which is why I'm using fs.createReadStream. However, I only seem to get a 503 response, even when sending just sample text.
EDIT
I worked out that the receiver was expecting the data to be named 'data'. I have removed the createReadStream, as I could only get it to work with 'application/x-www-form-urlencoded' and a synchronous fs.readFileSync. If I change this to 'multipart/form-data' on the PHP server, would I be able to use createReadStream again, or is that only possible if I switch to physically uploading the JSON file?
I've only been learning node for the past couple of weeks so any pointers would be gratefully received.
var http = require('http'),
    fs = require('fs');

var post_options = {
    host: 'logger.mysite.co.uk',
    path: '/',
    port: 80,
    timeout: 120000,
    method: 'POST',
    headers: {
        'Content-Type': 'application/x-www-form-urlencoded'
    }
};

var sender = http.request(post_options, function (res) {
    if (res.statusCode < 399) {
        var text = "";
        res.on('data', function (chunk) {
            text += chunk;
        });
        res.on('end', function (data) {
            console.log(text);
        });
    } else {
        console.log("ERROR", res.statusCode);
    }
});

var POST_DATA = 'data={[';
POST_DATA += fs.readFileSync('./path/file.log').toString().replace(/\,+$/, '');
POST_DATA += ']}';
console.log(POST_DATA);
sender.write(POST_DATA);
sender.end();
After a gazillion trial-and-error attempts, this worked for me, using FormData with node-fetch. (Oh, and request was deprecated two days ago, by the way.)
const FormData = require('form-data');
const fetch = require('node-fetch');

function uploadImage(imageBuffer) {
    const form = new FormData();
    form.append('file', imageBuffer, {
        contentType: 'image/jpeg',
        filename: 'dummy.jpg',
    });
    // note: node-fetch needs an absolute URL, including the protocol
    return fetch('https://myserver.cz/upload', { method: 'POST', body: form });
}
In place of imageBuffer there can be numerous things: I had a buffer containing the image, but you can also pass the result of fs.createReadStream('/foo/bar.jpg') to upload a file from disk.
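As a hypothetical usage example (the path and the response handling are placeholders), uploading a file from disk would then look like this:

const fs = require('fs');

// stream the file from disk instead of holding it in a buffer
uploadImage(fs.createReadStream('/foo/bar.jpg'))
    .then((res) => console.log('upload finished with status', res.status))
    .catch((err) => console.error('upload failed:', err));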
copied from https://github.com/mikeal/request#forms
var fs = require('fs');
var path = require('path');
var request = require('request');

var r = request.post('http://service.com/upload', function optionalCallback(err, httpResponse, body) {
    if (err) {
        return console.error('upload failed:', err);
    }
    console.log('Upload successful! Server responded with:', body);
});

var form = r.form();
form.append('my_field1', 'my_value23_321');
form.append('my_field2', '123123sdas');
form.append('my_file', fs.createReadStream(path.join(__dirname, 'doodle.png')));
Have a look at the request module. It gives you the ability to stream a file into a POST request.
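A minimal sketch of that, reusing the log file and host from the question (whether the receiving server accepts a raw streamed body is an assumption):

const fs = require('fs');
const request = require('request');

// pipe the log file straight into the POST request so it is sent
// in chunks instead of being read into memory first
fs.createReadStream('./path/file.log').pipe(
    request.post('http://logger.mysite.co.uk/', function (err, httpResponse, body) {
        if (err) {
            return console.error('upload failed:', err);
        }
        console.log('Server responded with:', body);
    })
);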