I am working on a Remix project and have run into an issue where the gzip-encoded response a loader requests from a foreign endpoint does not seem to be decoded.
The remix loader is fairly simple, with some simplification it looks like this:
// `json` comes from the Remix server runtime package (assumed here to be @remix-run/node)
import { json } from "@remix-run/node";

export const loader = async () => {
  try {
    const [
      encodedData,
      [... <other responses>]
    ] = await Promise.all([
      gzippedEndpoint(),
      [... <other requests>]
    ]).catch((e) => {
      console.error(e);
    });
    return json([<loader data>]);
  } catch (error) {
    console.log("ERROR:", error);
    return json({});
  }
};
It's the gzippedEndpoint() call that fails: the error stack claims that the returned data is not valid JSON. I figured compression should not be a problem, but it seems the fetch requests on the Remix server side cannot correctly decode the gzipped data, and I see no option in Remix to enable decoding explicitly. When I disable gzip on the foreign endpoint, everything works fine: the Remix server makes the request and parses the response without issue.
Here is an example of the headers from a returned response (with some obfuscation):
200 GET https://dev.server.com/public/v1/endpoint {
'cache-control': 'no-store, must-revalidate, no-cache',
connection: 'close',
'content-encoding': 'gzip',
'content-type': 'application/json',
date: 'Mon, 12 Sep 2022 06:51:41 GMT',
expires: 'Mon, 12 Sep 2022 06:51:41 GMT',
pragma: 'no-cache',
'referrer-policy': 'no-referrer',
'strict-transport-security': 'max-age=31536000 ; includeSubDomains',
'transfer-encoding': 'chunked',
}
Is there some remix option or request header that I am missing here?
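For now, the workaround I am experimenting with is decoding the body myself. This is only a sketch, assuming the server-side fetch really does hand back raw gzipped bytes; gzippedEndpoint and the URL come from the simplified example above:
// Workaround sketch: parse the body as-is, and if that fails, assume we got
// raw gzip bytes and inflate them with Node's built-in zlib.
import { gunzipSync } from "node:zlib";

async function gzippedEndpoint() {
  const res = await fetch("https://dev.server.com/public/v1/endpoint");
  const buffer = Buffer.from(await res.arrayBuffer());
  try {
    // If the runtime already decoded the body, this just works.
    return JSON.parse(buffer.toString("utf8"));
  } catch {
    // Otherwise inflate the gzipped bytes ourselves.
    return JSON.parse(gunzipSync(buffer).toString("utf8"));
  }
}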
Related
I am trying to upload a data source to my Tableau site using the following code:
const request = require('request')
const fs = require('fs')

const options = {
  url: 'https://prod-apnortheast-a.online.tableau.com/api/3.12/sites/578341b7-325c-4dda-be08-ff0b10c4b18c/datasources',
  headers: {
    Authorization: 'Bearer xxxxxxxxxxxxxx',
    'Content-Type': 'application/xml;charset=UTF-8',
    boundary: 'boundary-string'
  },
  body: `--boundary-string
Content-Disposition: name="request_payload"
Content-Type: text/xml
<tsRequest>
<datasource name="datasource-name"
description="datasource-description">
<connectionCredentials name="xxxx#abc.com"
password="xxxxxx"/>
<project id="96ec15ee-eef1-41ac-beda-1f03d9209305" />
</datasource>
</tsRequest>
--boundary-string
Content-Disposition: name="tableau_datasource"; filename="datasource-file-name"
Content-Type: application/octet-stream
This is the content of data source file.
Hello from Mayank
--boundary-string--`
}

function callback (error, response, body) {
  if (error) {
    console.log(error)
  }
  console.log(body, response.headers, response.statusCode)
}

request.post(options, callback)
But instead of a successful upload, I get the following error:
Response Body:
<?xml version='1.0' encoding='UTF-8'?><tsResponse xmlns="http://tableau.com/api" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://tableau.com/api http://tableau.com/api/ts-api-3.12.xsd"><error code="406000"><summary>Bad Request</summary><detail>Content type 'application/xml;charset=UTF-8' not supported</detail></error></tsResponse>
Response Headers:
{
'content-type': 'application/xml;charset=UTF-8',
date: 'Mon, 30 Aug 2021 08:17:57 GMT',
p3p: 'CP="NON"',
'referrer-policy': 'strict-origin-when-cross-origin',
server: 'Tableau',
'set-cookie': [
'hid=pdanaa-hap01; domain=.prod-apnortheast-a.online.tableau.com; path=/; HttpOnly; Secure; SameSite=None',
'AWSELB=05DBF7950E7E74D8AC3E3765F2EF65B6BB96F639EDB7A6D781435ACF3E27CEC2643898FB33239EFFBCA90E45D6EE0951AC6ECA4251ACA4E386D74627D58239403899B395F5F04C31144F69D44D5789C3FA7D6D9DC6;PATH=/;DOMAIN=.prod-apnortheast-a.online.tableau.com;SECURE;HTTPONLY;SAMESITE=None'
],
'strict-transport-security': 'max-age=31536000; includeSubDomains',
tableau_error_code: '0xE3C7443A',
tableau_error_source: 'NeedsClassification',
tableau_service_name: 'vizportal',
tableau_status_code: '2',
'x-content-type-options': 'nosniff',
'x-tableau': 'Tableau Server',
'x-ua-compatible': 'IE=Edge',
'x-xss-protection': '1; mode=block',
'content-length': '365',
connection: 'Close'
}
Response Status Code: 406
References:
https://help.tableau.com/current/api/rest_api/en-us/REST/rest_api_ref_publishing.htm#publish_data_source
https://help.tableau.com/current/api/rest_api/en-us/REST/rest_api_ref.htm
According to Tableau's documentation here:
The Content-Type header for the request must be set to multipart/mixed; boundary=boundary-string.
Although I'm using PHP, I had this exact same error while trying to publish flows through the API. Changing the Content-Type to "multipart/mixed" fixed it for me (it wasn't the end of my problems, though). So maybe changing your code to
...
headers: {
  Authorization: 'Bearer xxxxxxxxxxxxxx',
  'Content-Type': 'multipart/mixed; boundary=boundary-string'
},
...
could do the trick.
Please notice the Content-Type change, and that the boundary is no longer a separate header the way it was in your code.
Just note that your example code as it is might still generate 400 errors, since that's not a valid data source file. Since I run Tableau Server, I had to do a lot of troubleshooting by tailing the logs on the server and trying to figure out what was wrong.
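Putting it together, here is a rough, untested sketch of how the whole upload could be wired up, with the boundary carried inside the Content-Type and the data source file read from disk; the payload, file name, and credentials are placeholders taken from your example:
const request = require('request')
const fs = require('fs')

const boundary = 'boundary-string'

// Placeholder payload; connection credentials omitted for brevity.
const payload = `<tsRequest>
  <datasource name="datasource-name" description="datasource-description">
    <project id="96ec15ee-eef1-41ac-beda-1f03d9209305" />
  </datasource>
</tsRequest>`

// Build the multipart/mixed body by hand, with CRLF line endings and a blank
// line between each part's headers and its content.
const body = Buffer.concat([
  Buffer.from(
    `--${boundary}\r\n` +
    'Content-Disposition: name="request_payload"\r\n' +
    'Content-Type: text/xml\r\n\r\n' +
    payload + '\r\n' +
    `--${boundary}\r\n` +
    'Content-Disposition: name="tableau_datasource"; filename="datasource-file-name"\r\n' +
    'Content-Type: application/octet-stream\r\n\r\n'
  ),
  fs.readFileSync('datasource-file-name'), // hypothetical local file path
  Buffer.from(`\r\n--${boundary}--\r\n`)
])

request.post({
  url: 'https://prod-apnortheast-a.online.tableau.com/api/3.12/sites/578341b7-325c-4dda-be08-ff0b10c4b18c/datasources',
  headers: {
    Authorization: 'Bearer xxxxxxxxxxxxxx',
    'Content-Type': `multipart/mixed; boundary=${boundary}`
  },
  body: body
}, function (error, response, responseBody) {
  if (error) return console.error(error)
  console.log(responseBody, response.headers, response.statusCode)
})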
In my React app I am sending requests to my backend Node/Express app using axios. In my local environment, everything works well when I call it with a line that looks like this:
await axios.post('/createproduct', createProductBody).then(res => console.log('Data send')).catch(err => console.log(err.data))
However, after I push my code to production, this line of code returns a 405 status code (see screenshot).
Could it be because I have added the "proxy": "http://localhost:3001" line in my package.json file? I don't quite understand why this would work locally but not in prod.
Thanks
According to the output of the code below, your AWS S3 bucket is not configured to allow POST:
const axios = require('axios')

function createProductBody() {
  return {}
}

async function stackOverflowQuestionNumber67929785() {
  await axios.post('https://editor.blankt.io/createproduct', createProductBody())
    .then(res => console.log('Data send'))
    .catch(err => console.log(err))
}

stackOverflowQuestionNumber67929785();
Prints:
response: {
status: 405,
statusText: 'Method Not Allowed',
headers: {
'content-type': 'application/xml',
'transfer-encoding': 'chunked',
connection: 'close',
allow: 'HEAD, DELETE, GET, PUT',
date: 'Fri, 11 Jun 2021 00:58:55 GMT',
server: 'AmazonS3',
'x-cache': 'Error from cloudfront',
via: '1.1 cf2a58a1ade01b9796df7d87fe311e64.cloudfront.net (CloudFront)'
}
See the allow response header: only HEAD, DELETE, GET, and PUT are permitted.
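A related point worth checking: the "proxy" entry in package.json only applies to the local CRA dev server, so in production the relative /createproduct URL resolves against the static host (S3/CloudFront, per the output above) rather than the Express app. A minimal sketch of pointing axios at the API explicitly, assuming the backend has its own reachable URL in production (REACT_APP_API_URL is a hypothetical build-time variable):
import axios from 'axios'

// Point axios at the real backend instead of relying on the dev-only proxy.
// REACT_APP_API_URL is a hypothetical env variable set at build time.
const api = axios.create({
  baseURL: process.env.REACT_APP_API_URL || 'http://localhost:3001'
})

export async function createProduct(createProductBody) {
  return api
    .post('/createproduct', createProductBody)
    .then(res => console.log('Data sent'))
    .catch(err => console.log(err))
}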
I am using the Unirest library inside my Node.js script to make a GET request. However, for some reason I am getting stale/old data from the requested resource.
Even after updating the data, I am getting stale data from the resource.
Here's the code :
let unirest = require('unirest');

unirest.get('<resource_url>')
  .headers({ 'cache-control': 'no-cache' })
  .end(function (response) {
    console.log('body===>', JSON.stringify(response.body));
    console.log('status=====>', response.status);
    console.log('response headers=====>', response.headers);
  });
response headers=====> { 'strict-transport-security': 'max-age=15768000; includeSubDomains ',
date: 'Fri, 15 Sep 2017 18:58:40 GMT',
cached_response: 'true',
'cache-control': 'no-transform, max-age=3600',
expires: 'Fri, 15 Sep 2017 12:10:53 GMT',
vary: 'Accept,Accept-Encoding',
'content-length': '1383',
connection: 'close',
'content-type': 'application/json',
'content-language': 'en-US' }
The same resource gives updated data instantly when tried via a Python script or curl.
Note: after some time, say 3 hours, the Node.js script also returns the updated data.
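For what it's worth, the cached_response: 'true' header above suggests an intermediate cache is answering, so this is the kind of cache-busting variant I would try next. It is only a sketch, and it assumes the API tolerates an extra throwaway query parameter:
let unirest = require('unirest');

// Send stronger no-cache hints and a unique query parameter so the
// intermediate cache cannot serve the same stored entry again.
unirest.get('<resource_url>' + '?_=' + Date.now())
  .headers({
    'Cache-Control': 'no-cache, no-store, must-revalidate',
    'Pragma': 'no-cache'
  })
  .end(function (response) {
    console.log('status=====>', response.status);
    console.log('cached_response=====>', response.headers.cached_response);
  });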
I'm trying to write a very simple solution to download and parse a calendar file from my Airbnb. Airbnb provides the calendar in ical format, with a unique url for each user such as:
https://www.airbnb.com/calendar/ical/1234.ics?s=abcd
Where those values (1234 / abcd) are unique hex keys that provide some security.
Whenever I hit my (private) url it replies instantly with an ical if I'm using a browser. I can be using any browser, even one from a different country that has never visited airbnb.com before. (I've got remote access to a server I tried it from when debugging.)
In Node.js it works only about 10% of the time. Most of the time I get a 403 error with the text "You don't have permission to access (redacted url) on this server."
Example code:
const request = require('request');

request.get(url, (error, response, body) => {
  if (!error && response.statusCode === 200) {
    return callback(null, body);
  }
  return callback('error');
});
This is using the request package here: https://github.com/request/request
I've set it up in an async.whilst loop and it takes about 50 tries to pull down a success, if I set a multi-second delay between each one. (Btw, https://github.com/caolan/async is awesome, so check that if you haven't.)
If it failed EVERY time, that'd be different, but the fact that it fails only occasionally really has me stumped. Furthermore, browsers seem to succeed EVERY time as well.
curl [url] also works, every time. So is there something I'm not specifying in the request that I need to?
Edit 1:
As requested, more of the headers from the reply. I also thought it was rate-limiting me at first, but the problem is that this is all from the same dev box. I can curl, or request from a browser, without issue multiple times, yet I can come back in 24 hours and use the Node.js code and it will fail the first time, or the first 50 times.
headers:
{ server: 'AkamaiGHost',
'mime-version': '1.0',
'content-type': 'text/html',
'content-length': '307',
expires: 'Wed, 24 May 2017 17:23:28 GMT',
date: 'Wed, 24 May 2017 17:23:28 GMT',
connection: 'close',
'set-cookie': [ 'mdr_browser=desktop; expires=Wed, 24-May-2017 19:23:28 GMT; path=/; domain=.airbnb.com' ] },
rawHeaders:
[ 'Server',
'AkamaiGHost',
'Mime-Version',
'1.0',
'Content-Type',
'text/html',
'Content-Length',
'307',
'Expires',
'Wed, 24 May 2017 17:23:28 GMT',
'Date',
'Wed, 24 May 2017 17:23:28 GMT',
'Connection',
'close',
'Set-Cookie',
'mdr_browser=desktop; expires=Wed, 24-May-2017 19:23:28 GMT; path=/; domain=.airbnb.com' ],
trailers: {},
rawTrailers: [],
upgrade: false,
url: '',
method: null,
statusCode: 403,
statusMessage: 'Forbidden',
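In case it is useful, the server: 'AkamaiGHost' value and the mdr_browser cookie above made me suspect browser detection at the CDN, so here is a sketch of the same request with browser-like headers added; the header values are just examples, not anything Airbnb documents:
const request = require('request');

// Sketch: send browser-like headers in case the CDN is filtering requests
// that do not look like a browser. Values below are examples only.
request.get({
  url: url,
  headers: {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0 Safari/537.36',
    'Accept': 'text/calendar,text/plain,*/*',
    'Accept-Language': 'en-US,en;q=0.8'
  },
  gzip: true // let request decompress gzip/deflate responses
}, (error, response, body) => {
  if (!error && response.statusCode === 200) {
    return callback(null, body);
  }
  return callback('error');
});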
I am trying to get the JSON data for a Trello board from the following URL, using Node.js's https module:
https://trello.com/b/nC8QJJoZ.json
Here's my code:
var https = require('https');

https.get('https://trello.com/b/nC8QJJoZ.json', function (res) {
  console.log('statusCode:', res.statusCode);
  console.log('headers:');
  console.log(res.headers);

  res.setEncoding('utf8');
  res.on('data', function (chunk) {
    console.log(chunk);
  });
}).on('error', function (e) {
  console.log('ERROR: ' + e);
});
Although the URL works perfectly in a browser, here it returns a body containing the string "invalid key", with a 401 status. The following is the output:
statusCode: 401
headers:
{ 'cache-control': 'max-age=0, must-revalidate, no-cache, no-store',
'x-content-type-options': 'nosniff',
'strict-transport-security': 'max-age=15768000',
'x-xss-protection': '1; mode=block',
'x-frame-options': 'DENY',
'x-trello-version': '1.430.0',
'x-trello-environment': 'Production',
'set-cookie':
[ 'dsc=ae78a354044f982079cd2b5d8adc4f334cda679656b3539ee0adaaf019aee48e; Path=
'visid_incap_168551=/NYMaLRtR+qQu/H8GYry1BCKl1UAAAAAQUIPAAAAAAC1zWDD1JLPowdC
'incap_ses_218_168551=+/2JSB4Vz0XJO/pWbX4GAxCKl1UAAAAA0pAbbN5Mbs4tFgbYuskVPw
expires: 'Thu, 01 Jan 1970 00:00:00',
'content-type': 'text/plain; charset=utf-8',
'content-length': '12',
etag: 'W/"c-b1ec112"',
vary: 'Accept-Encoding',
date: 'Sat, 04 Jul 2015 07:24:00 GMT',
'x-iinfo': '1-11281210-11279245 PNNN RT(1435994639565 404) q(0 0 0 -1) r(3 3) U
'x-cdn': 'Incapsula' }
invalid key
What am I doing wrong?
Well, it turns out that we need to provide a Trello API application key (generated from here) with our request.
var https = require('https');
var KEY = '<replace this with your app key>';

https.get('https://trello.com/b/nC8QJJoZ.json?key=' + KEY, function (res) {
  ...
});
This seems like a weird requirement to me, because we are not using Trello's API endpoint. (Even though I have solved the problem, I would still like to know why a browser can access the resource but a server-side script cannot.)
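For comparison, here is a sketch against the documented REST endpoint, which takes the key and a token as query parameters; the key/token values are placeholders, and I am assuming the board shortlink is accepted as the board id:
var https = require('https');

// Sketch: query the documented REST endpoint instead of the board's .json URL.
// KEY and TOKEN are placeholders; the shortlink 'nC8QJJoZ' is assumed to be
// accepted as the board id.
var KEY = '<your app key>';
var TOKEN = '<your token>';

https.get('https://api.trello.com/1/boards/nC8QJJoZ?key=' + KEY + '&token=' + TOKEN, function (res) {
  console.log('statusCode:', res.statusCode);
  res.setEncoding('utf8');
  res.on('data', function (chunk) {
    console.log(chunk);
  });
}).on('error', function (e) {
  console.log('ERROR: ' + e);
});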