ask a server to not gzip content - node.js

Is it possible, with an HTTP request header, to ask the server not to gzip the content?
I'm using the node.js request library's request.get to make an API call, and the content appears to be coming back gzipped.
It's only a problem with one API (I call several), so I suspect their server is misconfigured. But I wanted to try asking the server for a non-gzipped version.
Here is the response I'm getting:
GET https://www.itbit.com/api/feeds/ticker/XBTUSD
4R�&HTpȇ��{3y�L�3��SJ)$�Qj… (the rest of the body is raw gzipped binary data)

Assuming I'm understanding your problem correctly, you can explicitly provide a value for the Accept-Encoding header in your HTTP GET request.
request({
  url: '...',
  headers: {
    'Accept-Encoding': 'identity'
  }
}, function (err, res, body) {
  // ...
});
This assumes that the server you are requesting from respects the Accept-Encoding header. If it doesn't, then your only option would be to just unzip the content.
var zlib = require('zlib');

var req = request.get(...);

req.on('response', function (res) {
  var stream;
  // Only gunzip if the server actually sent gzipped content
  if (res.headers['content-encoding'] === 'gzip') {
    stream = res.pipe(zlib.createGunzip());
  } else {
    stream = res;
  }

  var chunks = [];
  stream.on('readable', function () {
    var chunk;
    while ((chunk = stream.read()) !== null) chunks.push(chunk);
  });
  stream.on('end', function () {
    var body = Buffer.concat(chunks);
    // Do what you'd normally do.
  });
});
This is how you would conditionally unzip a response based on its content encoding. That said, this API looks pretty inconsistent, since running this with the URL you gave returns a stack trace. As @robertklep pointed out, they seem to do some user-agent checking too, so it seems like this API isn't really designed for public consumption.

It's a very strange server indeed.
This seems to prevent it from sending back gzipped content:
request({
  url: 'https://www.itbit.com/api/feeds/ticker/XBTUSD',
  headers: { 'User-Agent': '' }
}, ...);
(or some other random User-Agent header; it might be caching requests based on certain HTTP headers, and randomizing those headers may prevent it from serving already-cached gzipped responses)
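For completeness, a minimal sketch that combines both workarounds, an explicit Accept-Encoding: identity plus a non-default User-Agent, might look like the following. Whether the server honors either header is an assumption, and the User-Agent value shown is just an example:
var request = require('request');

request({
  url: 'https://www.itbit.com/api/feeds/ticker/XBTUSD',
  headers: {
    // Ask for an uncompressed response (only works if the server respects it)
    'Accept-Encoding': 'identity',
    // Non-default User-Agent, in case cached gzipped responses are keyed on it
    'User-Agent': 'my-ticker-client/1.0'
  }
}, function (err, res, body) {
  if (err) return console.error(err);
  console.log(res.headers['content-encoding'], body);
});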

Related

Sending Form Data with the native node module

For my current project I have to send form data from my lambda function to an API endpoint. The API endpoint essentially expects two images (which it compares with one another) and a key. As mentioned, I somehow seem unable to send the correct form data to the API endpoint. I checked it out in Postman and it worked fine there, but something doesn't work in my function. I presume it must be related to the form-data string that I'm sending. Below is a shortened version of the function (I excluded the two image files), but somehow I'm getting an error back telling me that the API cannot read the key property:
const http = require('http');
const https = require('https');

const httpPromise = (protocol, params, postData) => {
  return new Promise((resolve, reject) => {
    const requestModule = protocol === 'http' ? http : https;
    const req = requestModule.request(params, res => {
      // grab request status
      const statusCode = res.statusCode;
      if (statusCode < 200 || statusCode > 299) {
        // reject instead of throwing; a throw inside this callback would not reject the promise
        return reject(new Error('Request Failed with Status Code: ' + statusCode));
      }
      let body = '';
      // continuously append incoming data
      res.setEncoding('utf8');
      res.on('data', data => body += data);
      // once all data has been received
      res.on('end', () => resolve(body));
    });
    // write data to a POST request
    if (typeof params.method === 'string' && params.method === 'POST' && postData) {
      req.write(postData);
    }
    // bind to the error event
    req.on('error', err => reject(err));
    // end the request
    req.end();
  });
};

const controller = async () => {
  const apiKey = "00000000";
  const options = {
    hostname: '***',
    port: 80,
    path: '***',
    method: 'POST',
    headers: { "content-type": "multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW" }
  };
  const postData = "------WebKitFormBoundary7MA4YWxkTrZu0gW\r\nContent-Disposition: form-data; name=\"key\"\r\n\r\n00000000\r\n------WebKitFormBoundary7MA4YWxkTrZu0gW--";
  let result = await httpPromise('http', options, postData);
  console.log(result);
};
Yeah, so somehow it just doesn't seem to recognise the key in the postData string. I have tried various combinations but just can't seem to get this to work.
The built-in http and https modules are fairly verbose and awkward to use.
I'd recommend using the request library instead (see its documentation for more).
In that case, you can write the request simply as:
var request = require('request');

var formData = {
  // Pass a simple key-value pair
  my_field: 'my_value',
};

request.post({ url: 'http://service.com/upload', formData: formData }, (err, response, body) => {
  // Handle response here
});
Alright, so for anyone who might face the same issue: it took me a little while, but I figured out what the problem was. I didn't set the Content-Length header, which in turn meant that Node automatically added the Transfer-Encoding header and set its value to chunked. This broke the receiving API. Setting the Content-Length header to the correct length and setting the Transfer-Encoding header to an empty string solved the issue (though I think one could also simply omit the Transfer-Encoding header once the Content-Length header is defined).
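A minimal sketch of that fix, assuming postData has already been built as in the question (Buffer.byteLength is used so the length is counted in bytes rather than characters):
const options = {
  hostname: '***',
  port: 80,
  path: '***',
  method: 'POST',
  headers: {
    'content-type': 'multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW',
    // Setting Content-Length keeps Node from switching to Transfer-Encoding: chunked
    'Content-Length': Buffer.byteLength(postData)
  }
};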

How to convert gzip stream into a readable content and pipe it out in the request?

I am trying to make a proxy call to a website (somesite.com) and get the HTML from it. somesite.com is chunked and gzipped, so I was unable to parse the buffer (responseFromServer in my code) as HTML (currently I get a bunch of jumbled text when I do res.write).
I tried res.end and res.send but neither of them works.
function renderProxyRequest(req, res) {
  // somesite.com is gzipped and also is chunked.
  var options = {
    protocol: 'http:',
    hostname: 'somesite.com',
    // passing in my current headers
    headers: req.headers,
    maxRedirects: 0,
    path: req.url,
    socketTimeout: 200000,
    connectTimeout: 1800,
    method: 'GET'
  };

  var proxyrequest = someProxyApi.request(options);

  proxyrequest.on('response', function (postresponse) {
    // postresponse is a buffer
    //postresponse.pipe(res);
    var responseFromServer = '';
    postresponse.on('data', function (data) {
      responseFromServer += data;
    });
    postresponse.on('end', function () {
      // getting some jumbled string onto the browser.
      res.write(responseFromServer);
      res.end();
    });
  });

  req.pipe(proxyrequest);
}
If postresponse is a stream, you can probably do something like this:
const zlib = require('zlib');
...
postresponse.pipe(zlib.createGunzip()).pipe(res);
You have to check if the response is gzipped to begin with, by checking the Content-Encoding header from the remote server.
Alternatively, if you pass the original headers from the remote server to the client you're proxying the request for, you should be able to just pass the response data along as-is (because the original headers will tell the client that the data was gzipped). This obviously depends on the client being able to handle compressed responses (browsers will).
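Putting those two suggestions together, a minimal sketch of the response handler might look like this, reusing the proxyrequest and res objects from the question (the Content-Encoding check is the assumption here; the upstream server may use a different encoding such as deflate):
const zlib = require('zlib');

proxyrequest.on('response', function (postresponse) {
  if (postresponse.headers['content-encoding'] === 'gzip') {
    // Decompress so the client receives plain HTML
    postresponse.pipe(zlib.createGunzip()).pipe(res);
  } else {
    // Not compressed: stream it through unchanged
    postresponse.pipe(res);
  }
});
If you would rather let the client do the decompression, you could instead call res.writeHead(postresponse.statusCode, postresponse.headers) and pipe postresponse through unchanged, as described above.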

Getting an inexplicable 500 error when accessing rss endpoint from node

I'm making a simple request to an rss feed:
var request = require('request');

var req = request('http://www.govexec.com/rss/contracting/');

req.on('error', function (error) {
  console.log('error:', arguments);
});

req.on('response', function (res) {
  var stream = this;
  if (res.statusCode != 200) return this.emit('error', new Error('Bad status code'), res.statusCode);
});
The output is error: { '0': [Error: Bad status code], '1': 500 }
However, if I hit the URL from the browser, or do a simple curl request, I get the correct response:
curl 'http://www.govexec.com/rss/contracting/'
It's not a programming problem, per se.
Most websites expect you to send a User-Agent header with your request. That seems to be the case with the website you provided, too.
Fixing this is trivial, since you can include the User-Agent like so:
var req = request({
  url: 'http://www.govexec.com/rss/contracting/',
  headers: {
    'User-Agent': 'request'
  }
});

response.on('end') called before all chunks are received, only on my computer

I am doing the following fairly simple request in series, for a certain number of variables (~100). The problem also happens for different hostnames and paths, but always when a lot of similar requests are sent to the same host.
var http = require('http');

var request = http.request({
  hostname: 'lookup.dbpedia.org',
  path: '/api/search/KeywordSearch?QueryString=' + [VARIABLE],
  headers: {
    'Accept': 'application/json'
  }
});

request.on('response', function (response) {
  var data = [];
  response.on('data', function (chunk) {
    data.push(chunk);
  });
  response.on('end', function () {
    var object = JSON.parse(data.join(''));
  });
});

request.end();
The problem is that the last chunk of the response is sometimes not caught by response.on('data'), so the JSON.parse gives an error.
What makes it extra weird is that this only happens on my computer, and not on another (both are Windows 7).
The problem persists with Node v0.12.2 and v0.12.7, and as far as I could test this is a Node issue, as something like curl always returns the entire response.
Any thoughts?
It turned out it was my antivirus real-time shield that blocked the final chunk of the request (I use Avast). So in a way, it is similar to this question.

Node.js HTTP request: How to detect response body encoding?

I am using https.request() to make an HTTPS request using the following familiar pattern:
var request = https.request(options, function (response) {
  var chunks = [];
  response.on('data', function (chunk) {
    chunks.push(chunk);
  });
  response.on('end', function () {
    var buffer = Buffer.concat(chunks);
    ...
  });
});
...
request.end();
...
Once I have the finished response Buffer, it needs to be packaged into a JSON object. The reason is that I am creating a kind of tunnel, whereby the HTTP response (its headers, status, and body) is sent as JSON through another protocol.
So that both textual and binary responses may be supported, what works for me so far is to encode the Buffer to Base64 (using buffer.toString('base64')) and decode it at the other end using new Buffer(theJsonObject.body, 'base64'). While this works, it would be more efficient if I could perform Base64 encoding only when the HTTP response is known to be of a binary type (e.g. images). Otherwise, in the https.request() callback shown above, I could simply do chunk.toString() and convey the response body in the JSON object as a UTF-8 string. My JSON object would probably contain an additional property that indicates to the opposite end of the tunnel whether the body is a UTF-8 string (e.g. for .htm, .css, etc.) or Base64-encoded (e.g. images).
What I could do is use the MIME type in the response Content-Type header to work out whether the response is going to be binary. I would probably maintain a whitelist of types that I know are safe to treat as UTF-8 (such as 'text/html' and so on). All others (including e.g. 'image/png') would be Base64-encoded.
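A rough sketch of that whitelist approach, reusing the response and chunks from the pattern above (the list of textual types and the isTextual helper name are just illustrative assumptions):
// Hypothetical helper: treat a few known-text MIME types as UTF-8, Base64 everything else
var textualTypes = ['text/html', 'text/css', 'text/plain', 'application/json'];

function isTextual(contentType) {
  if (!contentType) return false;
  var mime = contentType.split(';')[0].trim().toLowerCase();
  return textualTypes.indexOf(mime) !== -1;
}

response.on('end', function () {
  var buffer = Buffer.concat(chunks);
  var textual = isTextual(response.headers['content-type']);
  var payload = {
    headers: response.headers,
    status: response.statusCode,
    isBase64: !textual,
    body: textual ? buffer.toString('utf8') : buffer.toString('base64')
  };
  // send `payload` through the tunnel as JSON...
});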
Can anyone propose a better solution?
Could you use the file-type package to detect the file type by checking the magic number of the buffer?
Install
npm install --save file-type
Usage
var fileType = require('file-type');

var safeTypes = ['image/gif'];

var request = https.request(options, function (response) {
  var chunks = [];
  response.on('data', function (chunk) {
    chunks.push(chunk);
  });
  response.on('end', function () {
    var buffer = Buffer.concat(chunks);
    var file = fileType(buffer);
    console.log(file);
    //=> { ext: 'gif', mime: 'image/gif' }

    // file is null when no known type is detected; the mime isn't safe in that case either
    if (!file || safeTypes.indexOf(file.mime) === -1) {
      // do your Base64 thing
    }
  });
});
...
request.end();
...
If you want to keep your code package-free, have a look at the package source on GitHub; it's pretty minimal.
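For instance, a package-free sketch that checks just a few common magic numbers might look like this (the helper name and the handful of signatures covered are assumptions; the real package recognizes many more types):
// Hypothetical minimal detector for a few common binary signatures
function sniffMime(buffer) {
  // PNG files start with 0x89 'P' 'N' 'G'
  if (buffer.length >= 4 &&
      buffer[0] === 0x89 && buffer[1] === 0x50 &&
      buffer[2] === 0x4E && buffer[3] === 0x47) {
    return 'image/png';
  }
  // JPEG files start with FF D8 FF
  if (buffer.length >= 3 &&
      buffer[0] === 0xFF && buffer[1] === 0xD8 && buffer[2] === 0xFF) {
    return 'image/jpeg';
  }
  // GIF files start with "GIF87a" or "GIF89a"
  if (buffer.length >= 4 && buffer.toString('ascii', 0, 4) === 'GIF8') {
    return 'image/gif';
  }
  return null; // unknown: fall back to the Content-Type header or Base64-encode
}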
