Is there a way to get permission to change the User-Agent header on XMLHttpRequests made from a Chrome extension?
I tried to use webRequest.onBeforeSendHeaders, but it doesn't fire for XMLHttpRequests, which is also mentioned here:
Chrome extension: webRequest.onBeforeSendHeaders behaves strange
Sorry if this duplicates a question somewhere else; I've been unable to find any solution, though.
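For reference, the listener I tried looks roughly like this (a minimal Manifest V2 sketch; the UA string is a placeholder, and it assumes the webRequest, webRequestBlocking and host permissions are declared in manifest.json):

chrome.webRequest.onBeforeSendHeaders.addListener(
    function (details) {
        // Rewrite the User-Agent header if present
        for (var i = 0; i < details.requestHeaders.length; i++) {
            if (details.requestHeaders[i].name === 'User-Agent') {
                details.requestHeaders[i].value = 'MyCustomUA/1.0'; // placeholder value
                break;
            }
        }
        return { requestHeaders: details.requestHeaders };
    },
    { urls: ['<all_urls>'] },
    ['blocking', 'requestHeaders']
);

It fires for normal page requests, but not for the XMLHttpRequests in question.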
I am developing a site, which is still local. I installed this package to add the correct CSP to my site (which I built with Laravel and Vue.js).
I solved the CSP errors it raised one by one, and now my Chrome console doesn't show any CSP errors anymore. But the page renders blank. When I opened the site in Firefox, I came across the following error in the console:
Content Security Policy: The page’s settings blocked the loading of a
resource at self (“script-src”). Source: call to eval() or related
function blocked by CSP.
I searched the generated bundles for the eval() function, but it doesn't appear in any of the files. Does anyone have suggestions on how to solve this?
When you're writing the manifest.json file, you have to specify matches for your content scripts. The http and https schemes work fine, but if I try to include chrome://*/* or any variant of it, I get an error saying I'm attempting to use an invalid scheme in my matches.
Is it not allowed?
By default, you cannot run on chrome:// URL pages.
However, there is an option in chrome://flags/#extensions-on-chrome-urls:
Extensions on chrome:// URLs (Mac, Windows, Linux, Chrome OS, Android)
Enables running extensions on chrome:// URLs, where extensions explicitly request this permission.
You still have to specify the pages your extension can run on, and wildcards are not accepted, so you have to give the full URL, e.g. chrome://extensions/
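With the flag enabled, the matches entry would look something like this (a minimal sketch; content.js is a placeholder, and note the full URL, since a wildcard like chrome://*/* is still rejected):

{
    "content_scripts": [
        {
            "matches": ["chrome://extensions/"],
            "js": ["content.js"]
        }
    ]
}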
The authorized schemes for matches are http, https, file, ftp.
Therefore, chrome is not a valid scheme.
Yes, it is not allowed. You can't link to them from hrefs on a webpage either.
In Firefox, how do I do the equivalent of --disable-web-security in Chrome? This has been asked a lot, but never given a true answer. Most answers are links to add-ons (some of which don't work in the latest Firefox or don't work at all) or amount to "you just need to enable support on the server".
This is temporary to test. I know the security implications.
I can't turn on CORS on the server and I especially would never be able to allow localhost or similar.
A flag, setting, or something similar would be a lot better than a plugin. I also tried http://www-jo.se/f.pfleger/forcecors, but something must be wrong, since my requests come back completely empty, while the same requests in Chrome come back fine.
Again, this is only for testing before pushing to prod which, then, would be on an allowable domain.
Almost everywhere you look, people refer to about:config and the security.fileuri.strict_origin_policy preference, and sometimes also network.http.referer.XOriginPolicy.
For me, none of these seem to have any effect.
This comment implies there is no built-in way in Firefox to do this (as of 2/8/14).
From this answer I learned about the CORS Everywhere Firefox extension, and it works for me. It intercepts responses and rewrites their headers to disable the CORS checks.
You can find the extension at addons.mozilla.org or here.
Check out my addon, which works with the latest Firefox version, has a nice UI, and supports JS regexes: https://addons.mozilla.org/en-US/firefox/addon/cross-domain-cors
Update: I've just added a Chrome extension for this: https://chrome.google.com/webstore/detail/cross-domain-cors/mjhpgnbimicffchbodmgfnemoghjakai
The Chrome setting you refer to is to disable the same origin policy.
This was covered in this thread also:
Disable firefox same origin policy
about:config -> security.fileuri.strict_origin_policy -> false
I have not been able to find a Firefox option equivalent to --disable-web-security, or an addon that does that for me. I really needed it for some testing scenarios where modifying the web server was not possible.
What did help was to use Fiddler to auto-modify web responses so that they have the correct headers and CORS is no longer an issue.
The steps are:
Open Fiddler.
If your site uses HTTPS, go to Tools -> Options -> HTTPS and tick the Capture HTTPS CONNECTs and Decrypt HTTPS traffic options.
Go to Rules -> Customize Rules. Modify the OnBeforeResponse function so that it looks like the following, then save:
static function OnBeforeResponse(oSession: Session) {
    //....
    // Replace any existing CORS header with a wildcard one
    oSession.oResponse.headers.Remove("Access-Control-Allow-Origin");
    oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");
    //...
}
This will make every web response carry the Access-Control-Allow-Origin: * header.
On its own this still won't work, as the OPTIONS preflight will pass through and cause the request to be blocked before the rule above gets a chance to modify the headers.
So to fix this: in the Fiddler main window, on the right-hand side, there's an AutoResponder tab.
Add a new rule and response:
METHOD:OPTIONS https://yoursite.com/ with auto response: *CORSPreflightAllow
and tick the boxes: "Enable Rules" and "Unmatched requests passthrough".
Best Firefox Addon to disable CORS as of September 2016: https://github.com/fredericlb/Force-CORS/releases
You can even configure it per referrer (website).
As of June 2022, Mozilla Firefox allows you to change the CORS configuration natively; no extra addons are required. As per the Mozilla docs, you can change the CORS setting by changing the value of the key content.cors.disable.
To do so, first type about:config into your browser's address bar.
Click 'Accept the Risk and Continue'; since you found this Stack Overflow page, we can assume you are aware of the risks you are undertaking.
You will see a page with your user preferences. On this page, just search for the key content.cors.disable.
You do not have to type in true or false; just hit the toggle button at the far right of the row and the value will flip.
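If you prefer preference files over the about:config UI, the same change can be made with a line in a profile's user.js (a sketch using the standard user.js syntax):

// Disable CORS enforcement (testing only)
user_pref("content.cors.disable", true);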
While the question mentions Chrome and Firefox, there is other software without cross-domain security. I mention it for people who are unaware that such software exists.
For example, PhantomJS is an engine for browser automation, and it supports disabling cross-domain security:
phantomjs.exe --web-security=no script.js
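For instance, a minimal script.js along these lines (both URLs are placeholders) can issue a cross-origin XHR that would normally be blocked:

var page = require('webpage').create();
page.open('http://example.com/', function (status) {
    var xhrStatus = page.evaluate(function () {
        var xhr = new XMLHttpRequest();
        // A cross-origin request; only succeeds because web security is off
        xhr.open('GET', 'http://other-domain.example/api', false); // synchronous for brevity
        xhr.send();
        return xhr.status;
    });
    console.log('XHR status: ' + xhrStatus);
    phantom.exit();
});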
See this other comment of mine: Userscript to bypass same-origin policy for accessing nested iframes
For anyone finding this question while using Nightwatch.js (1.3.4), there's an acceptInsecureCerts: true setting in the config file:
firefox: {
  desiredCapabilities: {
    browserName: 'firefox',
    alwaysMatch: {
      // Enable this if you encounter unexpected SSL certificate errors in Firefox
      acceptInsecureCerts: true,
      'moz:firefoxOptions': {
        args: [
          // '-headless',
          // '-verbose'
        ],
      },
    },
  },
},
I am working on a Chrome extension that needs to inject scripts into data:-URI pages.
When trying to execute the javascript I get an exception:
Error during tabs.executeScript: Cannot access contents of url "data:text/html;charset=utf-8, … ". Extension manifest must request permission to access this host.
But which permission would work for me? I tried data:*, <all_urls>, *://*/* - none of these worked. Also the activeTab permission did not do the trick. Any ideas?
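For reference, the failing call looks roughly like this (a sketch; tabId is assumed to refer to a tab showing a data: URI):

chrome.tabs.executeScript(tabId, { code: 'console.log("injected");' }, function () {
    if (chrome.runtime.lastError) {
        // Logs: Cannot access contents of url "data:text/html;charset=utf-8, ..."
        console.error(chrome.runtime.lastError.message);
    }
});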
It's currently a Chromium bug that extensions cannot work on data: URIs. A fix is in the works that will rectify this, hopefully landing in Chrome 66.
Traditionally, I would inspect the Akamai headers by installing a Firefox extension called akamaiheaders.xpi. Unfortunately, I think the last version of Firefox to support this was 3.
As I understand it, this plugin would add special headers to all HTTP requests that Firefox made, which would prompt Akamai to add a bunch of headers to the response (telling me whether the file was cached, where it got it from, etc.). Then, using a tool like HTTPFox or Firebug, I could easily see which assets were cached and which ones were not.
I've searched all over, but I can't find anything as simple and easy to use as that. Does anyone know of anything out there that allows me to track all the Akamai headers for all the assets my browser loads that works in either FF, Chrome, or Safari?
You can use curl and/or wget for this:
curl -H "Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no" -IXGET http://www.oxfordpress.com/
or
wget -S -O /dev/null --header="Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no" http://www.oxfordpress.com/
If you want to test the staging environment, you need to remember to send the Host header, e.g.:
curl -H "Host: www.oxfordpress.com" -H "Pragma: ..." -IXGET http://oxfordpress.com.edgesuite-staging.net/
One way or another, it's always about sending the proper Pragma headers and then reading the response headers.
List of Pragma headers as well as explanations for X-Cache response header can be found here: http://webspherehelp.blogspot.com/2009/07/understanding-akamai-headers-to-debug.html.
I know this question is old, but since I came across it in my search today I thought I'd add an answer for the next person who comes along.
There are a couple of extensions in the Chrome store for this now:
Akamai debug headers which just adds headers to your net panel in web inspector
Exceda Akamai Headers Extension which seems to also work for purging cache.
Akamai debug headers is the one I chose and it's working well so far.
You can use a local proxy (e.g. Fiddler or Charles Proxy, my personal favorite) and add the following header to outgoing requests:
Pragma: akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no
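With Fiddler, for instance, the header can be added to every outgoing request from CustomRules.js (a sketch following the same pattern as the OnBeforeResponse rule earlier in this thread):

static function OnBeforeRequest(oSession: Session) {
    // ...
    // Ask Akamai to include its debug headers in the response
    oSession.oRequest.headers.Add("Pragma", "akamai-x-cache-on, akamai-x-cache-remote-on, akamai-x-check-cacheable, akamai-x-get-cache-key, akamai-x-get-extracted-values, akamai-x-get-nonces, akamai-x-get-ssl-client-session-id, akamai-x-get-true-cache-key, akamai-x-serial-no");
    // ...
}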
If you're using Chrome or Chromium, you can use the Header Hacker or Pragma Header extensions. With either one, you will have to add the Pragma values manually.
If you can find the akamaiheaders.xpi file, you can just open it and change the maxVersion in install.rdf to 9.*
.xpi files are just ZIP files; on most machines you can simply add .zip to the filename and double-click it.
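For reference, the fragment to edit in install.rdf looks roughly like this (a sketch; the surrounding structure varies per addon, the minVersion is illustrative, and the GUID shown is Firefox's application id):

<em:targetApplication>
    <Description>
        <em:id>{ec8030f7-c20a-464f-9b0e-13a3a9e97384}</em:id>
        <em:minVersion>1.5</em:minVersion>
        <!-- Raise this so newer Firefox versions accept the addon -->
        <em:maxVersion>9.*</em:maxVersion>
    </Description>
</em:targetApplication>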
To debug akamai headers, for the Chrome browser, try this extension: CDN Headers & Cookies - Chrome Web Store
https://chrome.google.com/webstore/detail/cdn-headers-cookies/obldlamadkihjlkdjblncejeblbogmnb
Note: Enable 'Load Akamai Headers' in the settings (click the 'Lego minifig Head' icon, click the gear, and check on 'Load Akamai Headers').
It has been suggested on the Akamai community.
https://community.akamai.com/community/web-performance/blog/2015/03/31/using-akamai-pragma-headers-to-investigate-or-troubleshoot-akamai-content-delivery
They have a new version of the XPI out, which you can download in Luna. There's also another plugin that adds a 'content source' pane to Firebug for a quick reference of what on the page was Akamaised.
As I say, to download both plugins you need to log in to Luna and look under 'Support' > 'More Tools' > 'Browser Extensions'. The XPI isn't publicly accessible.
YMMV, but as far as I recall from colleagues, the Exceda plugin duplicated HTTP requests, which can be a bit messy whilst debugging.
For Chrome, I find that ModHeader plus a profile that sends the Pragma headers works fine.