I've set up my simple website with a valid Let's Encrypt SSL certificate (from certbot). My nginx config is very short and trivial.
The website shows up correctly in the latest Firefox: it serves a 404 page, which is fine by me and works as expected.
If I try Google Chrome, I get an error:
The webpage at https://example.org/ might be temporarily down or it
may have moved permanently to a new web address.
ERR_INVALID_SIGNED_EXCHANGE
I assume that the application/signed-exchange media type may be causing this.
What is it, and should I remove it from the response?
Request
GET / HTTP/1.1
Host: example.org
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9,ru;q=0.8
DNT: 1
Response
HTTP/1.1 404 Not Found
Server: nginx
Date: Fri, 29 Mar 2019 12:05:49 GMT
Content-Type: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3
Content-Length: 345
Connection: keep-alive
What to fix?
The Content-Type in the response is incorrect. It should be a single media type, as Steffen Ullrich said. For a 404 page, I suspect you want Content-Type: text/html.
This may be something particular to your nginx config. On my server, 404 pages have Content-Type: text/html.
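The reason only Chrome chokes is that Chrome advertises application/signed-exchange;v=b3 in its Accept header and Firefox doesn't; when your server echoes that whole Accept list back as the Content-Type, Chrome tries to parse the page as a signed exchange and fails with ERR_INVALID_SIGNED_EXCHANGE. So don't remove anything from the request; fix whatever in your config (or upstream app) is copying Accept into Content-Type. A minimal sketch of a server block that serves the 404 page with a single valid type, assuming a stock certbot layout (the 404 location is illustrative, not taken from your config):

server {
    listen 443 ssl;
    server_name example.org;

    # ssl_certificate / ssl_certificate_key lines as written by certbot

    error_page 404 /404.html;

    location = /404.html {
        internal;
        default_type text/html;  # one media type, not the echoed Accept list
    }
}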
Related
We are running an application (a local app) inside another application (moodle) as a plugin.
When our app tries to access the backend GraphQL, the browser doesn't attach the cookies, even though the cookie is available on document.cookie before the request is sent. Also, before the app is opened in the iFrame, a few requests are made from moodle to the backend server, and the browser does add the cookie to those requests.
Following are the request headers:
moodle -> backend
Host: graphql.app.home
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 11_0_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: cross-site
Sec-Fetch-Mode: navigate
Sec-Fetch-Dest: iframe
Referer: https://moodle.home:8443/
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Cookie: cb_ltiid=s%3AF0FJpsc8bVe9ZqyLzgNgK7flKfGf4W7u.2GL43c7XLV11BzHXCS%2B7AvQKBAS9xg%2BNc7gaj264%2Bks
app (from moodle iFrame) -> backend
Host: graphql.app.home
Connection: keep-alive
Content-Length: 118
Pragma: no-cache
Cache-Control: no-cache
accept: */*
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 11_0_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36
content-type: application/json
Origin: https://app.home
Sec-Fetch-Site: same-site
Sec-Fetch-Mode: cors
Sec-Fetch-Dest: empty
Referer: https://app.home/
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Thanks for the help.
It was an issue with fetch not sending the Cookie header, because:
fetch won’t send cookies, unless you set the credentials init option.
(Since Aug 25, 2017. The spec changed the default credentials policy
to same-origin. Firefox changed since 61.0b13.)
The solution was to pass credentials: 'include' in the fetch options.
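A minimal sketch of the fixed call; the URL and query are placeholders for the app's real GraphQL request:

fetch('https://graphql.app.home/', {
  method: 'POST',
  credentials: 'include', // attach cookies; the default is now 'same-origin'
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ query: '{ __typename }' }),
})
  .then((res) => res.json())
  .then((data) => console.log(data));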
I'm trying to run a web application that I've built in an iframe on another domain. I'm able to load the page within the iframe, but any ajax request on the page results in a 403 error, as shown below:
Request URL: https://test.mydomain.com/get_extra_services/
Request Method: POST
Status Code: 403 Forbidden
Remote Address: 207.38.86.14:443
Referrer Policy: no-referrer-when-downgrade
Response Headers
Connection: keep-alive
Content-Length: 1382
Content-Type: text/html
Date: Thu, 18 Jun 2020 22:57:41 GMT
Server: nginx
X-Content-Type-Options: nosniff
Request Headers
Accept: */*
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Connection: keep-alive
Content-Length: 22
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
Cookie: _ga=GA1.2.2146753382.1592180975; _gid=GA1.2.1219012919.1592286837
DNT: 1
Host: test.mydomain.com
Origin: https://test.mydomain.com
Referer: https://test.mydomain.com/order/
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.97 Safari/537.36
X-CSRFToken: null
X-Requested-With: XMLHttpRequest
Form Data
serviceid: 18
checked: 1
Any thoughts on what is causing this error?
Thanks!
You need to set up CORS to make requests from a different domain.
https://enable-cors.org/
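Since your response shows Server: nginx, here is a rough sketch of what that can look like there; the embedding site's origin is a placeholder, and the link above covers other servers:

location /get_extra_services/ {
    # Allow the page that embeds the iframe to call this endpoint.
    add_header Access-Control-Allow-Origin "https://parent-site.example" always;
    add_header Access-Control-Allow-Credentials "true" always;
    add_header Access-Control-Allow-Headers "Content-Type, X-CSRFToken, X-Requested-With" always;

    # Answer CORS preflights without hitting the application.
    if ($request_method = OPTIONS) {
        return 204;
    }

    # existing proxy/app configuration for this path stays here
}

Note that your dump also shows X-CSRFToken: null, so it's worth checking that the CSRF token is actually available to the page inside the iframe as well.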
I'm trying to use the net module in Node.js to send a POST request over a raw socket.
I've tried sending a GET request, which does work, but I can't seem to get the POST request to work.
This is the code:
socket.write(`POST /page.php HTTP/1.1\r\n
Host: domain.com\r\n
Connection: close\r\n
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3\r\n
User-Agent: Mozilla/5.0 (MeeGo; NokiaN9) AppleWebKit/534.13 (KHTML, like Gecko) NokiaBrowser/8.5.0 Mobile Safari/534.13\r\n
Upgrade-Insecure-Requests: 1\r\n
Accept-Encoding: gzip, deflate\r\n
Accept-Language: en-US,en;q=0.9\r\n
Cache-Control: max-age=0\r\n
Content-Type: application/x-www-form-urlencoded\r\n
Content-Length: 0\r\n\r\n`);
I've "debugged" it, and I've noticed some 301 Moved Permanently and Temporarily redirected responses from the host.
I have a Server 2012 R2 box running IIS. I've tried enabling compression for several sites running on that box, but I can't figure out why it won't work. My request headers all include Accept-Encoding, but the response headers always show Transfer-Encoding: chunked and Vary: Accept-Encoding. The following steps have been performed to try to get gzip compression working:
Dynamic and Static compression have been enabled on each site and at the machine level
Both compression methods are installed from Server Manager
httpCompression and urlCompression nodes have been manually added to the web.configs (a sketch of these follows the list)
MIME types are defined for compression
frequentHitThreshold has been set to 1, so all content should be compressed after the first attempt to access it
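For reference, a sketch of the web.config nodes that list describes, using common defaults rather than the actual file:

<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  <httpCompression>
    <dynamicTypes>
      <add mimeType="text/html" enabled="true" />
      <add mimeType="*/*" enabled="false" />
    </dynamicTypes>
    <staticTypes>
      <add mimeType="text/*" enabled="true" />
      <add mimeType="*/*" enabled="false" />
    </staticTypes>
  </httpCompression>
  <!-- httpCompression and serverRuntime are normally locked to
       applicationHost.config, so a copy in a site web.config may be
       silently ignored. -->
  <serverRuntime frequentHitThreshold="1" frequentHitTimePeriod="00:00:10" />
</system.webServer>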
A trace has been done to see why compression isn't occurring. The only information I have is the code DYNAMIC_COMPRESSION_NOT_SUCCESS with a reason of 1.
Here are the headers:
GET http://redactedservername:8082/ HTTP/1.1
Host: redactedservername:8082
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.102 Safari/537.36
DNT: 1
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-US,en;q=0.8
Cookie: ASP.NET_SessionId=gnqovt55ggt22lycufudc0ns
Response
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/html; charset=utf-8
Vary: Accept-Encoding
Date: Wed, 22 Jun 2016 14:00:57 GMT
Transfer-Encoding: chunked
What other steps can be performed to get compression to work?
Compression was working, but ESET Antivirus was doing its job of monitoring web traffic. This modified the response and I didn't get gzip content encoding as expected. Disabling ESET and testing again showed that compression was functioning.
What is the best way to get IIS to set the headers for woff files so that they can be served from the client browser's cache?
I'm working on an MVC .NET site hosted in IIS 7.5 and served through Cloudflare with static caching turned on. The site uses a custom woff web font. When requests are made for pages that use these fonts, IIS serves them with the headers shown below, and subsequent requests all look the same. To me it looks like these are not getting cached by the client browser: I'd expect the server to respond with 304 (Not Modified) and the browser to serve the woff from its cache.
Request Headers
GET /blah/Content/fonts/AzoSans-Thin-webfont.woff HTTP/1.1
Host: blah.co.uk
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.111 Safari/537.36
Accept: */*
DNT: 1
Referer: http://blah.co.uk/bundles/Content/stylesheets/main?v=f9NXr53WMUdV9DfYJMkEU_5QZZi0g8eB1lB5lqxgdXc1
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Cookie: __cfduid=d96b367152ae58725c15e5946cf1d67f41415385741070; ASP.NET_SessionId=3tcc3e1nd0z005tlknrbph5h; redesign#lang=en
Response Headers
HTTP/1.1 200 OK
Date: Fri, 07 Nov 2014 19:27:57 GMT
Content-Type: application/font-woff
Content-Length: 27728
Connection: keep-alive
Last-Modified: Thu, 06 Mar 2014 10:40:46 GMT
ETag: "0cbfc872839cf1:0"
X-Powered-By: ASP.NET
CF-Cache-Status: HIT
Vary: Accept-Encoding
Expires: Fri, 07 Nov 2014 23:27:57 GMT
Cache-Control: public, max-age=14400
Accept-Ranges: bytes
Server: cloudflare-nginx
CF-RAY: 185bee60092d0a90-LHR
The problem here seems to be that Cloudflare is changing the headers. If I bypass Cloudflare, I get
Cache-Control: max-age=0
and the server responds with a 304, and the browser uses the content from its cache.
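For the IIS side of the question, the usual way to get long-lived client caching for static files like woff is staticContent/clientCache; a sketch, assuming IIS 7.5 defaults (which include no MIME mapping for .woff at all):

<system.webServer>
  <staticContent>
    <!-- Map .woff so IIS will serve it, then let clients cache it for 30 days. -->
    <remove fileExtension=".woff" />
    <mimeMap fileExtension=".woff" mimeType="application/font-woff" />
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="30.00:00:00" />
  </staticContent>
</system.webServer>

The <remove> guards against a duplicate-entry error if the mapping is already defined higher up; Cloudflare will still rewrite Cache-Control on top of this unless its browser cache TTL setting is adjusted.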