How to disable CSP (Content Security Policy) for UIWebView and WKWebView

We want to run some JavaScript on the web pages loaded in our webview, but some of them use a CSP that blocks our JS. Is there any way we can disable CSP in UIWebView and WKWebView?

It seems that there is no way to turn the Content Security Policy off or bypass it by faking a response header.
Even if something like this were possible, Apple could reject the submission to the App Store, as it could be considered a security violation.

Related

X-Frame-Options Not 100% for Blocking Iframes

It seems setting X-Frame-Options to DENY is the most recommended method to prevent my website from being iframed, but it is not perfect. First off, X-Frame-Options can be ignored by using a Chrome extension, as discussed in the post below. I have proved this is the case by using the Ignore X-Frame headers Chrome extension.
Getting around X-Frame-Options DENY in a Chrome extension?
Secondly, X-Frame-Options: DENY only works on the first iframe of a web page; if I iframe a web page twice, the second (nested) iframe works.
My question is, what is the best multipronged approach to prevent my website from being iframed?
X-Frame-Options is just a response header. Of course if you control the client, you can ignore it. If you control the client, you can do pretty much anything.
The point of X-Frame-Options is to prevent attacks like clickjacking (primarily) or, for example, pixel-perfect timing attacks. It does indeed prevent those attacks, because the attacker cannot control the victim's browser to, say, install an extension (and if he can, clickjacking is the least of the victim's problems :) ).
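For reference, on the server side the usual belt-and-braces move is to send both the legacy header and its CSP successor, frame-ancestors. A minimal sketch, assuming an Express-style Node server (the route is illustrative only):

    import express from "express";

    const app = express();

    // Send both the legacy X-Frame-Options header and the CSP frame-ancestors
    // directive; modern browsers prefer frame-ancestors when both are present.
    app.use((req, res, next) => {
      res.setHeader("X-Frame-Options", "DENY");
      res.setHeader("Content-Security-Policy", "frame-ancestors 'none'");
      next();
    });

    app.get("/", (req, res) => res.send("not frameable"));
    app.listen(3000);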

Blocking Chrome Extensions from running on my site

As a web developer, is there any way to prevent a user's Chrome extensions from being applied to my site? i.e. a header, meta tag, anything? Additionally, if there is, is there also a way to whitelist particular extensions?
It's not possible. At the web server end, you are only able to control what the browser will allow you to control. In simple terms, this means you can control the data (HTML, JavaScript, headers etc.) that you send back to it. That's about it.
Can't you create a Content Security Policy (CSP) that blocks inline JavaScript and only allows JavaScript from specific domains? You could even create a CSP in report-only mode and collect violation reports via something like https://report-uri.io/
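A rough sketch of that suggestion, assuming an Express-style server; the allowed script host and the report endpoint below are placeholders, not real values:

    import express from "express";

    const app = express();

    // Report-only mode: violations are reported to the endpoint but nothing is
    // blocked yet, so you can gather data before switching to enforcement.
    app.use((req, res, next) => {
      res.setHeader(
        "Content-Security-Policy-Report-Only",
        "default-src 'self'; script-src 'self' https://trusted.example.com; " +
          "report-uri https://example.report-uri.io/r/default/csp/reportOnly"
      );
      next();
    });

    app.listen(3000);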

Can I enable cross-site scripting in IE for any/all websites by simply disabling the XSS filter, or is something more required?

I want to enable cross-site scripting for some sites. Specifically, I want to submit a web form to a third-party site. I have set the target of the web form's response to a child iframe, and now I want my code in the main window to retrieve the content of the response page.
Am I correct in assuming that I can do the above by simply disabling the XSS Filter in Internet Explorer? Or is something else also required? Also, how do I enable cross-site scripting in Firefox for the same scenario?
IE's XSS filter does not protect against DOM-based XSS. So in order to execute a JavaScript payload across domains, you can simply eval() a request variable or use document.write().
A more common approach is to use a "cross-domain proxy". Mainly because this approach usually doesn't introduce a gaping vulnerability which could be used to attack your users.
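A minimal sketch of such a cross-domain proxy, assuming an Express-style server on Node 18+ (global fetch); the route and the third-party URL are placeholders. The page posts to its own origin, and the server forwards the request and relays the response, so the browser never needs a cross-origin read:

    import express from "express";

    const app = express();
    app.use(express.urlencoded({ extended: false }));

    // Same-origin endpoint that forwards the form submission to the third party
    // and relays the response body back to the page.
    app.post("/proxy/submit", async (req, res) => {
      const upstream = await fetch("https://third-party.example.com/form", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: new URLSearchParams(req.body as Record<string, string>).toString(),
      });
      res.status(upstream.status).send(await upstream.text());
    });

    app.listen(3000);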
Am I correct in assuming that I can do the above by simply disabling the XSS Filter in Internet Explorer?
No. The XSS filter isn't responsible for this limitation.
Or is something else also required for the same?
There is no way to enable cross-origin requests from within a web page -- restrictions on cross-origin requests are a fundamental part of the web security model. However, for development, you can launch Google Chrome with the --disable-web-security flag to allow you to make cross-origin requests. This can be helpful in development of web applications where the application must access an API hosted on another domain.

Firefox or Chrome plugin to block and filter all outgoing connections

In Firefox or Chrome I'd like to prevent a private web page from making outgoing connections, i.e. if the URL starts with http://myprivatewebpage/ or https://myprivatewebpage/ in a browser tab, then that browser tab must be restricted so that it is allowed to load images, CSS, fonts, JavaScript, XmlHttpRequest, Java applets, Flash animations and all other resources only from http://myprivatewebpage/ or https://myprivatewebpage/, i.e. an <img src="http://www.google.com/images/logos/ps_logo.png"> (or the corresponding <script>new Image(...)</script>) must not be able to load that image, because it's not on myprivatewebpage. I need a 100% foolproof solution: not even a single resource outside myprivatewebpage may be accessible, not even with low probability. There must be no resource-loading restrictions on web pages other than myprivatewebpage, e.g. http://otherwebpage/ must be able to load images from google.com.
Please note that I assume that the users of myprivatewebpage are willing to cooperate to keep the web page private unless it's too much work for them. For example, they would be happy to install a Chrome or Firefox extension once, and they wouldn't be offended if they see an error message stating that access is denied to myprivatewebpage until they install the extension in a supported browser.
The reason why I need this restriction is to keep myprivatewebpage really private, without exposing any information about its use to webmasters of other web pages. If http://www.google.com/images/logos/ps_logo.png was allowed, then the use of myprivatewebpage would be logged in the access.log of Google's ps_logo.png, so Google's webmasters would have some information how myprivatewebpage is used, and I don't want that. (In this question I'm not interested in whether the restriction is reasonable, but I'm only interested in the technical solutions and its strengths and weaknesses.)
My ideas how to implement the restriction:
Don't impose any restrictions, just rely on the same origin policy. (This doesn't provide the necessary protection, the same origin policy lets all images pass through.)
Change the web application on the server so it generates HTML, JavaScript, Java applets, flash animations etc. which never attempt to load anything outside myprivatewebpage. (This is almost impossibly hard to foolproof everywhere on a complicated web application, especially with user-generated content.)
Over-sanitize the web page using an HTML output filter on the server, i.e. remove all <script>, <embed> and <object> tags, restrict the target of <img src=, <link rel=, <form action= etc., and also restrict the links in the CSS files. (This can prevent all unwanted resources if I can remember all HTML tags properly, e.g. I mustn't forget about <video>. But it is too restrictive: it removes all dynamic web page functionality like JavaScript, Java applets and Flash animations; without these, most web applications are useless.)
Sanitize the web page, i.e. add an HTML output filter into the webserver which removes all offending URLs from the generated HTML. (This is not foolproof, because there can be a tricky JavaScript which generates a disallowed URL. It also doesn't protect against URLs loaded by Java applets and flash animations.)
Install an HTTP proxy which blocks requests based on the URL and the HTTP Referer, and force all browser traffic (including myprivatewebpage, otherwebpage, google.com) through that HTTP proxy. (This would slow down traffic to sites other than myprivatewebpage, and it may not protect properly if XmlHttpRequest()s, Java applets or Flash animations can forge the HTTP Referer.)
Find or write a Firefox or Chrome extension which intercepts all outgoing connections, and blocks them based on the URL of the tab and the target URL of the connection. I've found https://developer.mozilla.org/en/Setting_HTTP_request_headers and thinkahead.js in https://addons.mozilla.org/en-US/firefox/addon/thinkahead/ and http://thinkahead.mozdev.org/ . Am I correct that it's possible to write a Firefox extension using that? Is there such a Firefox extension already?
Some links I've found for the Chrome extension:
http://www.chromium.org/developers/design-documents/extensions/notifications-of-web-request-and-navigation
https://groups.google.com/a/chromium.org/group/chromium-extensions/browse_thread/thread/90645ce11e1b3d86?pli=1
http://code.google.com/chrome/extensions/trunk/experimental.webRequest.html
As far as I can see, only the Firefox or Chrome extension is feasible from the list above. Do you have any other suggestions? Do you have some pointers how to write or where to find such an extension?
I've found https://developer.mozilla.org/en/Setting_HTTP_request_headers and thinkahead.js in https://addons.mozilla.org/en-US/firefox/addon/thinkahead/ and http://thinkahead.mozdev.org/ . Am I correct that it's possible to write a Firefox extension using that? Is there such a Firefox extension already?
I am the author of the latter extension, though I have yet to update it to support newer versions of Firefox. My initial guess is that, yes, it will do what you want:
User visits your web page without the plugin. The web page contains a ThinkAhead block that would send a simple version header to the server, but this is ignored as the plugin is not installed.
Since the server does not see that header, it redirects the client to a page to install the plugin.
User installs plugin.
User visits web page with plugin. Page sends version header to server, so server allows access.
The ThinkAhead block matches all pages that are not myprivatewebpage, and does something like set the HTTP status to 403 Forbidden. Thus:
When the user visits any webpage that is in myprivatewebpage, there is normal behaviour.
When the user visits any webpage outside of myprivatewebpage, access is denied.
If you want to catch bad requests earlier, instead of modifying incoming headers, you could modify outgoing headers, perhaps screwing up "If-Match" or "Accept" so that the request is never honoured.
This solution is extremely lightweight, but might not be strong enough for your concerns. This depends on what you want to protect: given the above, the client would not be able to see blocked content, but external "blocked" hosts might still notice that a request has been sent, and might be able to gather information from the request URL.
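For the Chrome webRequest links in the question, a rough sketch of the blocking approach might look like the following (Manifest V2-era APIs, assuming the "webRequest", "webRequestBlocking" and <all_urls> permissions in the manifest; myprivatewebpage is the placeholder host from the question, and types come from @types/chrome):

    // background.ts -- block any request a private-page tab makes to an outside host.
    const PRIVATE_HOST = "myprivatewebpage";

    chrome.webRequest.onBeforeRequest.addListener(
      (details) => {
        // Origin of the page that issued the request (undefined for some navigations).
        const initiator = details.initiator ?? "";
        const fromPrivatePage = initiator.includes(PRIVATE_HOST);
        const targetIsPrivate = new URL(details.url).hostname === PRIVATE_HOST;
        // Cancel any request that leaves the private origin; let everything else through.
        return fromPrivatePage && !targetIsPrivate ? { cancel: true } : {};
      },
      { urls: ["<all_urls>"] },
      ["blocking"]
    );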

Setting cookie in iframe - different Domain

We have our site integrated as an iframe into another site that runs on a different domain. It seems that we cannot set cookies. Has anybody encountered this issue before? Any ideas?
Since your content is being loaded into an iframe from a remote domain, it is classed as a third-party cookie.
The vast majority of third-party cookies are provided by advertisers (these are usually marked as tracking cookies by anti-malware software) and many people consider them to be an invasion of privacy. Consequently, most browsers offer a facility to block third-party cookies, which is probably the cause of the issue you are encountering.
Since the Chromium update of February 4, 2020 (Chrome 80), cookies default to SameSite=Lax, according to this link.
To fix this, you just need to mark your cookies as SameSite=None and Secure.
To understand what SameSite cookies are, please see this document.
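A minimal sketch of setting such a cookie, assuming an Express-style server (cookie name and value are placeholders):

    import express from "express";

    const app = express();

    app.get("/set-cookie", (req, res) => {
      // SameSite=None is only accepted by browsers when the cookie is also Secure,
      // so this has to be served over HTTPS.
      res.cookie("widget_session", "abc123", {
        sameSite: "none",
        secure: true,
        httpOnly: true,
      });
      res.send("cookie set");
    });

    app.listen(3000);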
After reading through Facebook's docs on iframe canvas pages, I figured out how to set cookies in iframes with different domains. I created a proof of concept sinatra application here: https://github.com/agibralter/iframe-widget-test
There is more discussion on how Facebook does it here: How does Facebook set cross-domain cookies for iFrames on canvas pages?
IE requires you to set a P3P policy before it will allow third-party frames to set cookies, under the default privacy settings.
Supposedly P3P allows the user to limit what information goes to what parties who promise to handle it in certain ways. In practice it's pretty much worthless as users can't really set any meaningful limitations on how they want information handled; in the end it's just a fairly uniform setting acting as a hoop that all third parties have to jump through, saying “I'll be nice with your personal information” even if they have no intention of doing so.
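If you still need to satisfy those older IE versions, the hoop is just another response header. A sketch, assuming an Express-style server; the compact policy tokens are illustrative only, not a real privacy commitment:

    import express from "express";

    const app = express();

    // Older IE only accepts third-party cookies in iframes when a P3P compact
    // policy header is present.
    app.use((req, res, next) => {
      res.setHeader("P3P", 'CP="CAO PSA OUR"');
      next();
    });

    app.listen(3000);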
Despite adding SameSite=None and Secure to the cookie, you might not see the cookie being sent in the request. This might be because of browser settings; e.g., on Brave, you have to explicitly disable the third-party cookie blocking.
As more and more people switch to Brave or block third-party cookies using browser extensions, you should not rely on this mechanism.
