In Firefox, how do I do the equivalent of Chrome's --disable-web-security flag? This has been asked a lot, but never truly answered. Most answers link to add-ons (some of which don't work in the latest Firefox, or don't work at all) or say "you just need to enable CORS support on the server".
This is temporary to test. I know the security implications.
I can't turn on CORS on the server and I especially would never be able to allow localhost or similar.
A flag or setting would be much better than a plugin. I also tried http://www-jo.se/f.pfleger/forcecors, but something must be wrong, since my requests come back completely empty while the same requests in Chrome come back fine.
Again, this is only for testing before pushing to prod, which would then be on an allowed domain.
Almost everywhere you look, people refer to about:config and the security.fileuri.strict_origin_policy preference, and sometimes also network.http.referer.XOriginPolicy.
For me, none of these seem to have any effect.
This comment implies there is no built-in way in Firefox to do this (as of 2/8/14).
From this answer I learned about the CORS Everywhere Firefox extension, and it works for me. It acts like a MITM proxy, intercepting and rewriting headers to disable CORS checks.
You can find the extension at addons.mozilla.org or here.
Check out my addon, which works with the latest Firefox version, has a clean UI, and supports JS regexes: https://addons.mozilla.org/en-US/firefox/addon/cross-domain-cors
Update: I just added a Chrome extension for this: https://chrome.google.com/webstore/detail/cross-domain-cors/mjhpgnbimicffchbodmgfnemoghjakai
The Chrome setting you refer to disables the same-origin policy. This was also covered in this thread:
Disable firefox same origin policy
about:config -> security.fileuri.strict_origin_policy -> false
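If you'd rather not click through about:config, the same pref can also be set from a user.js file in your Firefox profile directory (a standard Firefox mechanism; prefs in this file are applied at every startup). A minimal sketch:

// user.js: place it in your Firefox profile folder.
// Equivalent to toggling the pref in about:config.
user_pref("security.fileuri.strict_origin_policy", false);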
I have not been able to find a Firefox option equivalent of --disable-web-security or an addon that does that for me. I really needed it for some testing scenarios where modifying the web server was not possible.
What did help was to use Fiddler to auto-modify web responses so that they have the correct headers and CORS is no longer an issue.
The steps are:
Open Fiddler.
If the site is served over HTTPS, go to Tools -> Options -> HTTPS and tick the "Capture HTTPS CONNECTs" and "Decrypt HTTPS traffic" options.
Go to Rules -> Customize Rules. Modify the OnBeforeResponse function so that it looks like the following, then save:
static function OnBeforeResponse(oSession: Session) {
    // ... (keep the existing contents of the function) ...

    // Replace any existing CORS header with a wildcard so the browser
    // accepts this response regardless of the requesting origin.
    oSession.oResponse.headers.Remove("Access-Control-Allow-Origin");
    oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");

    // ... (rest of the existing function) ...
}
This will make every web response have the Access-Control-Allow-Origin: * header.
On its own, though, this still won't work: the OPTIONS preflight will pass through to the server, and the request will be blocked before the rule above gets a chance to modify the headers.
So to fix this, in the Fiddler main window, on the right-hand side there's an AutoResponder tab. Add a new rule and response:
METHOD:OPTIONS https://yoursite.com/ with auto response: *CORSPreflightAllow
and tick the boxes "Enable Rules" and "Unmatched requests passthrough".
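Alternatively, if you'd rather keep everything in FiddlerScript instead of using the AutoResponder, a preflight can be answered directly in OnBeforeRequest. This is a sketch only; the header values are deliberately permissive and meant for local testing:

static function OnBeforeRequest(oSession: Session) {
    // Answer CORS preflights ourselves so they never reach the server.
    if (oSession.HTTPMethodIs("OPTIONS")) {
        // Creates a default 200 OK response and skips the server round-trip.
        oSession.utilCreateResponseAndBypassServer();
        oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");
        oSession.oResponse.headers.Add("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS");
        oSession.oResponse.headers.Add("Access-Control-Allow-Headers", "Content-Type, Authorization");
    }
}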
The best Firefox addon to disable CORS as of September 2016: https://github.com/fredericlb/Force-CORS/releases
You can even configure it per referrer (website).
As of June 2022, Mozilla Firefox does allow you to change the CORS configuration natively; no extra addons are required. As per the Mozilla docs, you can change the CORS setting by changing the value of the key content.cors.disable.
To do so, first go to your browser and type about:config in the address bar.
Click "Accept the Risk and Continue"; since you are reading this Stack Overflow page, we assume you are aware of the risks you are undertaking.
You will see a page with your user preferences. On this page, just search for the key content.cors.disable.
You do not have to type in true or false values; just hit the toggle button at the far right of the row and it will flip the value.
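The same pref can also be set from a user.js file in your profile directory if you want it applied at every startup, as in the earlier sketch:

// user.js: applied at startup, equivalent to the toggle above.
user_pref("content.cors.disable", true);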
While the question mentions Chrome and Firefox, there is other software without cross-domain security. I mention it for people who don't know such software exists.
For example, PhantomJS is an engine for browser automation, and it supports disabling cross-domain security:
phantomjs.exe --web-security=no script.js
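As a minimal sketch of what script.js could look like (the URLs are placeholders): run it with the flag above and the cross-origin request succeeds; run it without the flag and the same request is blocked.

// script.js: demonstrates a cross-origin XHR from inside a page.
var page = require('webpage').create();
page.open('https://example.com/', function (status) {
    var statusCode = page.evaluate(function () {
        // Synchronous XHR to a different origin; only succeeds because
        // the engine was started with --web-security=no.
        var xhr = new XMLHttpRequest();
        xhr.open('GET', 'https://api.example.org/data', false);
        xhr.send();
        return xhr.status;
    });
    console.log('Cross-origin request returned HTTP ' + statusCode);
    phantom.exit();
});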
See this other comment of mine: Userscript to bypass same-origin policy for accessing nested iframes
For anyone finding this question while using Nightwatch.js (1.3.4), there's an acceptInsecureCerts: true setting in the config file:
firefox: {
    desiredCapabilities: {
        browserName: 'firefox',
        alwaysMatch: {
            // Enable this if you encounter unexpected SSL certificate errors in Firefox
            acceptInsecureCerts: true,
            'moz:firefoxOptions': {
                args: [
                    // '-headless',
                    // '-verbose'
                ],
            },
        },
    },
},
Related
Using Chrome, when I try to change the values of an input located in an iframe of another app on our server, I get an error in Chrome:
"Blocked autofocusing on a form control in a cross-origin subframe."
In production (when the two apps are hosted on the same domain) it works, but in localhost development I can't make it work.
I've already tried starting Chrome with the following:
--disable-web-security
--ignore-certificate-errors
--disable-site-isolation-trials
--allow-external-pages
--disable-site-isolation-for-policy
but none worked.
Does anyone have an idea how to make it work?
If a change on the server side is needed, that's also an option.
For me the issue was a Chrome extension (Dashlane). I disabled it on that site and it worked. I don't know if this helps you in any way, but I had the same issue and this worked for me.
Edit: I also had the issue on localhost; however, I haven't tried it on a server yet.
Go to chrome://flags
Disable "SameSite by default cookies"
Relaunch Chrome
I have been looking for a way to make sure my web server is secure against a man-in-the-middle attack. It does seem that Google Chrome and Firefox succeed in blocking requests to my server even if I select to proceed past the security warning. I am testing this by using Charles Proxy to intercept HTTPS traffic without having trusted the Charles cert on my Mac.
When I run the same tests with Safari, it lets me through if I choose to ignore the security warning, which I expect a certain number of users to do. So it seems more configuration is needed to lock down Safari traffic. I know this is possible, because when trying to navigate to github.com in the same scenario, Safari refuses the connection outright with no option to proceed.
Does anyone know what GitHub is doing to block Safari traffic on an untrusted connection?
It looks like Safari supports HSTS and that GitHub is using it. Their HTTP response contains the following header:
Strict-Transport-Security:max-age=31536000; includeSubdomains; preload
This way, a browser supporting HSTS knows that for the given max-age this site should only be visited over https, and any attempt to use plain http will automatically be upgraded by the browser.
Apart from basic HSTS, which only works after the first visit to the site, GitHub also adds the preload directive. This tells browser makers that GitHub would like to be included in the preloaded HSTS list shipped with browsers, so that HSTS is applied even if the user has never visited the site before. See HSTS Preloading for more information.
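To send the same header from your own server, here is a minimal Node.js sketch (the certificate file names are placeholders):

// Serve HTTPS and emit the same HSTS header GitHub uses.
const https = require('https');
const fs = require('fs');

https.createServer({
    key: fs.readFileSync('server-key.pem'),   // placeholder cert files
    cert: fs.readFileSync('server-cert.pem'),
}, function (req, res) {
    // One year of HTTPS-only, covering subdomains, opting in to preload.
    res.setHeader('Strict-Transport-Security',
        'max-age=31536000; includeSubdomains; preload');
    res.end('Hello over TLS\n');
}).listen(443);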
I know it's possible to get an empty HTTP_REFERER. Under what circumstances does this happen? If I get an empty one, does it always mean that the user changed it? Is getting an empty one the same as getting a null one? And under what circumstances would I get that, too?
It will/may be empty when the end user
entered the site URL in the browser address bar itself.
visited the site via a browser-maintained bookmark.
visited the site as the first page in the window/tab.
clicked a link in an external application.
switched from an https URL to an http URL.
switched from an https URL to a different https URL.
has security software installed (antivirus/firewall/etc.) which strips the referrer from all requests.
is behind a proxy which strips the referrer from all requests.
visited the site programmatically (e.g. with curl) without setting the referrer header (search bots!).
HTTP_REFERER is sent by the browser, stating the last page the browser viewed.
If you are trusting HTTP_REFERER for anything important, you should not, since it is easily faked:
Some browsers limit access so that HTTP_REFERER is not passed.
Typing an address into the address bar will not pass HTTP_REFERER.
Opening a new browser window will not pass HTTP_REFERER (it will be null).
Some browser addons block it for privacy reasons; some firewalls and antivirus software do too.
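A quick way to see all of this for yourself is a tiny Node.js server that just echoes the header back; hit it from bookmarks, fresh tabs, curl, and so on, and watch it come back empty (a sketch; the port is arbitrary):

// Echo the Referer header: optional, client-controlled, untrusted input.
const http = require('http');

http.createServer(function (req, res) {
    const referer = req.headers['referer'] || '(empty)';
    res.end('Referer was: ' + referer + '\n');
}).listen(8080);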
Try this Firefox extension; you'll be able to set any headers you want:
@Master of Celebration:
Firefox:
extensions: refspoof, refcontrol, modify headers, no-referer
Completely disable: the option is available in about:config under "network.http.sendRefererHeader"; set it to 0 to disable referer passing.
Google chrome / Chromium:
extensions: noref, spoofy, external noreferrer
Completely disable: change ~/.config/google-chrome/Default/Preferences or ~/.config/chromium/Default/Preferences and set this:
{
    ...
    "enable_referrers": false,
    ...
}
Or simply add --no-referrers to the shortcut or on the command line:
google-chrome --no-referrers
Opera:
Completely disable: Settings > Preferences > Advanced > Network, and uncheck "Send referrer information"
Spoofing web service:
http://referer.us/
Standalone filtering proxy (spoof any header):
Privoxy
Spoofing HTTP_REFERER when using wget:
wget --referer=http://www.example.com/ http://www.yoursite.com/
Spoofing HTTP_REFERER when using curl:
curl --referer http://www.example.com/ http://www.yoursite.com/
Spoofing HTTP_REFERER with telnet:
telnet www.yoursite.com 80 (press return)
GET /index.html HTTP/1.0 (press return)
Referer: http://www.hah-hah.com (press return)
(press return again)
It will also be empty if the new Referrer Policy standard draft is used to prevent the referer header from being sent to the request origin. Example:
<meta name="referrer" content="none">
Although Chrome and Firefox have already implemented a draft version of the Referrer Policy, you should be careful with it, because Chrome, for example, expects no-referrer instead of none (and I have also seen never somewhere).
BalusC's list is solid. One additional way this field frequently appears empty is when the user is behind a proxy server. This is similar to being behind a firewall but is slightly different so I wanted to mention it for the sake of completeness.
I have found browsers' referer implementations to be really inconsistent.
For example, an anchor element with the download attribute works as expected in Safari and sends the referer, but in Chrome the referer will be empty or "-" in the web server logs:
<a href="file.zip" download>click to download</a>
is broken in Chrome: no referer is sent.
I've just got a site running nicely with everything served through SSL, but Google Chrome is throwing a "This page contains some insecure elements" message, which isn't good for end-user trust. All other browsers work fine and show the golden padlock.
The site is a Drupal 6 e-commerce site, running on apache2, and the error appears in the front end as well as the admin area.
Does anyone know of any methods to find out exactly which elements are being considered insecure?
Edit: I've used Fiddler to check the traffic, and it really is all HTTPS. It even complains on the site's holding page, which is very light and has no JavaScript etc. on it...
Could it be a browser issue? Have you tried restarting, or clearing all of your cache?
In Chrome, this is trivial. Hit Ctrl+Shift+J to open the developer tools, and it will plainly list the URL of the insecure content.
Try it on https://www.fiddler2.com/test/securepageinsecureimage.htm, for instance.
I just had a similar problem. Turns out it was a hardcoded background image URL in a CSS file.
You should particularly check any 3rd party stylesheets you are using, as they may hotlink to an image on another server.
Easy solution? Save those images to your server and change the URLs to relative paths in the CSS file.
Hope this helps!
Search the source for http:? Something like Ctrl-U, Ctrl-F, "http:" in Firefox should do it.
The insecure element is something loaded over an insecure (non-https) connection, e.g. an image or stylesheet. An insecure element has to be loaded via a fully qualified http:// URL (relative paths inherit the page's protocol), so searching for http: will find it.
Use the Firebug plugin for Firefox. In the Net tab, all file locations are shown clearly. Look for any files fetched over the http protocol.
It's probably related to this bug:
http://code.google.com/p/chromium/issues/detail?id=24152
Which is why a restart fixed it.
When I go to our web site in HTTPS mode, Chrome reports an error saying that the page contains secure and insecure items. However, I used Firebug, Fiddler, and HttpDebuggerPro, all of which tell me that everything is going through HTTPS. Is this a bug in Chrome?
Sorry but I'm unable to give out the actual URL.
A bit late to the party here, but I've been having issues recently: once I had found an http resource and changed it, I was still getting the red padlock symbol. When I closed the tab and opened the page in a new one, it changed to a green padlock, so I guess Chrome caches this information for the lifetime of the tab.
Current versions of Chrome will show the mixed content's URL in the error console. Hit CTRL+Shift+J and you'll see text like:
"The page at https://www.fiddler2.com/test/securepageinsecureimage.htm contains insecure content from http://www.fiddler2.com/Eric/images/me.jpg."
I was having the same issue: Chromium was reporting non-secure static files even though everything was served over https://.
Just closing the current tab and re-opening the page in a new tab worked, so I think this is a Chromium/Chrome bug.
Using Chrome, if you open up the Developer Tools (View > Developer > Developer Tools) and bring up the Console and choose to filter to warnings, you'll see a list of offending URLs.
You'll see something like the following if you do have insecure content
The page at https://mysite/ displayed insecure content from http://insecureurl.
For the best experience in finding the culprit, you'll want to start your investigation in a new tab.
It is possible that a non-secure URL is referenced but not accessed (e.g. the codebase for a Flash <object>).
I ran into this problem when jQuery code executing a few seconds after page load added a class containing a non-secure background image. Chrome must continually check for non-secure resources being loaded.
See the code example below. With code like this, the green padlock is shown in Chrome for about 5 seconds, until the deferred class is applied to the div.
// Apply the class 5 seconds after page load.
setTimeout(function () {
    $("#some-div").addClass("deferred");
}, 5000);

/* The class pulls in an image over plain http, breaking the padlock. */
.deferred {
    background: url("http://not-secure.com/not-secure.jpg");
}
Check the source of the page for any external objects (scripts, stylesheets, images, objects) linked using http://... rather than https://... or a relative path. Change the links to use relative paths, or absolute paths without protocol, i.e. href="/path/to/file".
If all that is fine, it could be something included from JavaScript. For example, the Google Analytics code uses document.write to add a new script to the page, but it has code to check for HTTPS in case the calling page is secure:
<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
With the release of Chrome version 53 on Windows, Google changed the trust indicator to the circle-i icon. Google has also announced that a new warning message will be issued when a website is not using HTTPS:
Starting January 2017, Chrome will begin labeling HTTP sites that transmit passwords or ask for credit card details as "Not Secure".
If all your resources are indeed secure, then it is a bug: http://code.google.com/p/chromium/issues/detail?id=72015. Luckily, it has been fixed.