Node http-proxy - HTTP message contains illegal headers - node.js

Using the node http-proxy module, I have no issues when proxying to a host with GET. But when I use POST and proxy to the same host, I get Status Code: 400 Bad Request with "HTTP message contains illegal headers".
I checked the proxyReq headers and they look fine: the exact same headers used on successful GETs.
_headers: {
  'x-forwarded-proto': 'http',
  'x-forwarded-port': '9000',
  'x-forwarded-for': '::ffff:10.3.117.47',
  'accept-language': 'en-US,en;q=0.8',
  'accept-encoding': 'gzip,deflate,sdch',
  'user-agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.101 Safari/537.36',
  accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8',
  'cache-control': 'no-cache',
  pragma: 'no-cache',
  connection: 'close',
  host: '10.1.1.1:9000',
  'x-auth-token': 'tokenStuff'
}
On a side note, if I make the same API call directly to the proxied host without going through the proxy, I don't get this message. Same POST, same JSON payload, same headers.

Answer: It was the http-proxy option xfwd: true that was causing this.
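For reference, a minimal sketch of the proxy with xfwd left off (the upstream target below is a hypothetical placeholder; the real target is not shown in the question):

const httpProxy = require('http-proxy');

// xfwd: true makes http-proxy inject the x-forwarded-* headers that the
// upstream host was rejecting; it defaults to false, shown explicitly here.
const proxy = httpProxy.createProxyServer({
  target: 'http://10.1.1.1:8080', // hypothetical upstream
  xfwd: false,
});

proxy.listen(9000); // the port the original proxy ran on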

Related

Nativescript Socketio extraHeaders are lost

I am trying to authenticate my users over Socket.IO v4 between my NativeScript app and my Node.js server.
Setup
"#angular/core": "~8.2.0",
"#triniwiz/nativescript-socketio": "^5.0.0",
"nativescript-angular": "~8.21.0",
Client side
In app.module.ts I wrote:
SocketIOModule.forRoot('http://10.0.2.2:3000', {
  debug: true,
  log: true,
  extraHeaders: {
    'authorization': `Bearer ${getString('ACCESS_TOKEN')}`,
  },
}),
And simply this.socketIO.connect();
Problem
The problem is that my server does not receive my extraHeaders at all, so I can't authenticate properly.
My research
Using Chrome DevTools to debug, my WebSocket requests are not logged at all.
By adding console.log('url', args); at line 73 of index.android.js (in the nativescript-socketio lib), just above this.socket = io.socket.client.IO.socket(args[0], opts.build());, I can see that the headers are correctly present.
By inspecting what the server receives:
{
  accept: '*/*',
  host: '10.0.2.2:3000',
  connection: 'Keep-Alive',
  'accept-encoding': 'gzip',
  'user-agent': 'okhttp/3.12.12'
}
With another tool, the SocketIO Online Client Tool, the server receives the headers correctly:
{
  host: 'localhost:3000',
  'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:88.0) Gecko/20100101 Firefox/88.0',
  accept: '*/*',
  'accept-language': 'fr,fr-FR;q=0.8,en-US;q=0.5,en;q=0.3',
  'accept-encoding': 'gzip, deflate',
  authorization: 'Bearer eyJhbGciOiJIU............V4MKGQ',
  origin: 'https://amritb.github.io',
  connection: 'keep-alive',
  'sec-gpc': '1'
}
Finally, I thought it might be a CORS problem (which I think is unlikely, since there are no errors). My Socket.IO server uses the following options, as described in the documentation:
{
  cors: {
    origin: '*',
    allowedHeaders: ['authorization']
  }
}
I have found similar issues, but their solutions don't change anything.
Has anyone ever faced this problem? Do you have any other ideas? I am out of ideas.
Thanks for helping.
Did you check whether it only affects the Authorization header?
And can you try setting withCredentials = true?
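A side note rather than a fix: when the extra headers do reach a Socket.IO v4 server, they can be read from the handshake in a middleware, which is a natural place to verify the token. A minimal sketch (the server options mirror the question; the token check itself is an assumption):

const { Server } = require('socket.io');

const io = new Server(3000, {
  cors: { origin: '*', allowedHeaders: ['authorization'] },
});

// Reject connections whose handshake is missing the Authorization header.
io.use((socket, next) => {
  const auth = socket.handshake.headers['authorization'];
  if (!auth) return next(new Error('missing authorization header'));
  // verify the Bearer token here before accepting the connection
  next();
});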

Headers and cookies not working as expected <Response [403]>

I am trying to access the JSON response from a URL. It works in my normal browser, but not when I access it with Python requests: instead of JSON, the response is a Cloudflare page.
So far, I have tried:
Copying the headers from my browser and passing them to my request:
import requests

headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.106 Safari/537.36',
    'Accept': 'application/json, text/plain, */*',
    'Accept-Language': 'en-US,en;q=0.9',
    'Cache-Control': 'max-age=0',
    'Referer': 'https://9gag.com/',
    'Cookie': '____ri=5012; ____lo=US;'
}
req = requests.get("https://9gag.com/v1/group-posts/group/default/type/hot", headers=headers)
print(req)
Sending cookies separately:
cookies = {
    '____ri': '5012',
    '____lo': 'US',
}
req = requests.get("https://9gag.com/v1/group-posts/group/default/type/hot", headers=headers, cookies=cookies)
print(req)
Both of these approaches return <Response [403]>.

AWS EC2 Node.js: empty body sent through axios POST request in React

I am deploying a MERN app on AWS EC2. The frontend renders perfectly on AWS, but when the API is called (axios.post) from the React frontend, an empty body arrives at the backend. When the same call is made through Postman, the body is received correctly.
I am sharing the details of the request as received on the backend (I printed this for debugging):
POSTMAN:
Method: POST
Path: /api/auth
Body: { email_id: 'comiiii@gmail.com',
  password: 'comiiii#123',
  user_type: 'type_com' }
Headers: { host: '64.1.75.248',
  'x-real-ip': '113.103.59.237',
  'x-forwarded-for': '113.103.59.237',
  'x-forwarded-host': 'ec2-64-1-75-248.ap-south-1.compute.amazonaws.com',
  'content-type': 'application/json;charset=UTF-8',
  connection: 'close',
  'content-length': '94',
  'user-agent': 'PostmanRuntime/7.26.8',
  accept: '*/*',
  'postman-token': 'a6ef6f75-716b-4843-a773-8b46d1f28427',
  'accept-encoding': 'gzip, deflate, br' }
REACT FRONTEND:
Method: POST
Path: /api/auth
Body: {}
Headers: { host: '64.1.75.248',
  'x-real-ip': '213.235.108.12',
  'x-forwarded-for': '213.235.108.12',
  'x-forwarded-host': 'ec2-64-1-75-248.ap-south-1.compute.amazonaws.com',
  'content-type': 'application/json;charset=UTF-8',
  connection: 'close',
  accept: '*/*',
  'access-control-request-method': 'POST',
  'access-control-request-headers': 'access-control-allow-origin,content-type',
  origin: 'http://ec2-64-1-75-248.ap-south-1.compute.amazonaws.com',
  'user-agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.146 Safari/537.36',
  'sec-fetch-mode': 'cors',
  referer: 'http://ec2-64-1-75-248.ap-south-1.compute.amazonaws.com/',
  'accept-encoding': 'gzip, deflate',
  'accept-language': 'en-GB,en-US;q=0.9,en;q=0.8' }
Comparing the two, I realised that the headers that differ are among the forbidden header names (see https://developer.mozilla.org/en-US/docs/Glossary/Forbidden_header_name).
How do I solve this?
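One observation, offered as a guess rather than a confirmed diagnosis: the React log contains access-control-request-method and access-control-request-headers, which browsers send only on the CORS preflight (an OPTIONS request, which never carries a body), so what was captured may be the preflight rather than the real POST. The preflight also lists access-control-allow-origin as a requested header, i.e. the client is setting a response-only header on the request. A minimal sketch of a fix under that assumption, with an Express backend assumed:

// client-side (React): let the browser handle CORS; do not set
// Access-Control-Allow-Origin here, it is a response header.
import axios from 'axios';

axios.post('/api/auth', {
  email_id: 'comiiii@gmail.com', // sample payload from the question
  password: 'comiiii#123',
  user_type: 'type_com',
}, {
  headers: { 'Content-Type': 'application/json' },
});

// server-side (Express, separate file): answer the preflight and
// parse JSON bodies; without a body parser, req.body stays {}.
const express = require('express');
const cors = require('cors');
const app = express();
app.use(cors());
app.use(express.json());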

Some req.session variables not saving (express-session)

I am setting up an OAuth2 flow that needs to send some extra variables to the callback. When I check req.session in the authorize call, the values for the variables are properly set. The next step is to hit the "callback" route.
router.post('/api/personas/authorize', async (req, res) => {
  const source = await personaController.getSourceInformation(req.body.source);
  req.session.exchangeURL = source.token_exchange_url;
  req.session.exchangeFields = source.exchange_fields;
  console.log('REQ', req.session);
  req.session.save();
  const url = constructAuthURL(source);
  res.send(url);
});
In the console.log('REQ') call, both the exchangeURL and exchangeFields variables are set properly.
Then, we hit the callback route:
router.get('/callback', async (req, res) => {
  console.log(req.session);
});
Those exchange variables are not set. The session IDs are the same between the calls, and the sid cookie is sent in both requests. Additionally, some other custom session variables are retained between the calls. It appears that only the variables set at /authorize are not persisted.
Any help would be greatly appreciated and I am happy to provide more info!
EDIT:
Here are the headers from the request that hits '/api/personas/authorize':
{
  host: 'localhost:3000',
  connection: 'keep-alive',
  'content-length': '15',
  accept: 'application/json, text/plain, */*',
  'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.75 Safari/537.36',
  'content-type': 'application/json;charset=UTF-8',
  origin: 'http://localhost:3000',
  'sec-fetch-site': 'same-origin',
  'sec-fetch-mode': 'cors',
  'sec-fetch-dest': 'empty',
  referer: 'http://localhost:3000/personas',
  'accept-encoding': 'gzip, deflate, br',
  'accept-language': 'en-US,en;q=0.9',
  cookie: 'connect.sid=s%3AK01XQwFSwwA_q-D8OxALcx--asG23hsB.%2B8j%2BgLX6Eg%2FHhyh3K5wv%2FqpM6Vmp89xX5Kh8%2FFhLMJg'
}
And here are the headers from the request that hits '/callback':
headers: {
  host: 'localhost:3000',
  connection: 'keep-alive',
  'upgrade-insecure-requests': '1',
  'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.75 Safari/537.36',
  accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9',
  'sec-fetch-site': 'cross-site',
  'sec-fetch-mode': 'navigate',
  'sec-fetch-dest': 'document',
  referer: 'https://{XXXX}.salesforce.com/',
  'accept-encoding': 'gzip, deflate, br',
  'accept-language': 'en-US,en;q=0.9',
  cookie: 'connect.sid=s%3AK01XQwFSwwA_q-D8OxALcx--asG23hsB.%2B8j%2BgLX6Eg%2FHhyh3K5wv%2FqpM6Vmp89xX5Kh8%2FFhLMJg'
}
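One guess worth ruling out: req.session.save() is asynchronous, so the response (and the subsequent redirect to the OAuth provider) can go out before the store has persisted the new variables, and the /callback request then reads a stale session. A sketch of the authorize route that waits for the save callback before responding:

router.post('/api/personas/authorize', async (req, res) => {
  const source = await personaController.getSourceInformation(req.body.source);
  req.session.exchangeURL = source.token_exchange_url;
  req.session.exchangeFields = source.exchange_fields;
  // Respond only once the session store has actually persisted the values.
  req.session.save((err) => {
    if (err) return res.status(500).send(err.message);
    res.send(constructAuthURL(source));
  });
});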

NodeJs request.get() function not working while the url is accessible from the browser

I am using the request npm module. I want to retrieve an image from a URL. The request.get(url) call returns a '400 Bad Request', whereas the image is accessible from the browser.
The URL I am hitting is: http://indiatribune.com/wp-content/uploads/2017/09/health.jpg
Answer: You could try adding some headers:
const request = require('request');

request.get({
  url: 'http://indiatribune.com/wp-content/uploads/2017/09/health.jpg',
  headers: {
    Accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
    'Accept-Encoding': 'gzip, deflate',
    'Accept-Language': 'en-GB,en;q=0.8,en-US;q=0.6,hu;q=0.4',
    'Cache-Control': 'max-age=0',
    Connection: 'keep-alive',
    Host: 'indiatribune.com',
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36',
  },
}, (err, response, data) => {
  console.log(response, data);
});
The User-Agent header alone seems to be enough.
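If so, the request can be trimmed down to just that header (untested sketch):

const request = require('request');

// Only the User-Agent appears to matter for this host.
request.get({
  url: 'http://indiatribune.com/wp-content/uploads/2017/09/health.jpg',
  headers: {
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36',
  },
}, (err, response) => {
  console.log(err || response.statusCode);
});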
Answer: Use the download module. It's pretty simple.
const fs = require('fs');
const download = require('download');
download('http://indiatribune.com/wp-content/uploads/2017/09/health.jpg').pipe(fs.createWriteStream('foo.jpg'));
