Can I stop the "this website does not supply identity information" message - security

I have a site which is completely on HTTPS and works well, but some of the images it serves come from other sources, e.g. eBay or Amazon.
This causes the browser to show the message: "This website does not supply identity information."
How can I avoid this? The images sometimes have to be served from elsewhere.

"This website does not supply identity information." is not only about the encryption of the link to the website itself but also the identification of the operators/owners of the website - just like it actually says. For that warning (it's not really an error) to stop, I believe you have to apply for the Extended Validation Certificate https://en.wikipedia.org/wiki/Extended_Validation_Certificate. EVC rigorously validates the entity behind the website not just the website itself.

Firefox shows the message
"This website does not supply identity information."
while hovering over or clicking the favicon (Site Identity Button) when
you requested a page over HTTP
you requested a page over HTTPS, but the page contains mixed passive content
HTTP
HTTP connections generally don't supply any reliable identity information to the browser. That's normal. HTTP was designed to simply transmit data, not to secure the data it transmits.
On the server side, you can only avoid that message if the server starts using an SSL certificate and the code of the page is changed to use HTTPS requests exclusively.
To avoid the message on the client side, you could enter about:config in the address bar, confirm you'll be careful, and set browser.chrome.toolbar_tips = false.
HTTPS, mixed passive content
When you request a page over HTTPS from a site which is using an SSL certificate, the site does supply identity information to the browser and normally the message wouldn't appear.
But if the requested page embeds at least one <img>, <video>, <audio> or <object> element which includes content over HTTP (which won't supply identity information), then you'll get a so-called mixed passive content * situation.
Firefox won't block mixed passive content by default, but will only show said message to warn the user.
To avoid this on the server side, you'd first need to identify which requests are producing mixed content.
With Firefox on Windows you can press Ctrl+Shift+K (Control-Option-K on Mac) to open the web console, deactivate the CSS, JS and security filters, and press F5 to reload the page, so it shows all the requests the page makes.
Then fix your code for each line which is showing "mixed content", i.e. change the appropriate parts of your code to use https:// or, depending on your case, protocol-relative URLs.
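For example (the image host below is only a placeholder):
    <!-- mixed passive content: the image is requested over plain HTTP -->
    <img src="http://images.example.com/logo.png" alt="logo">
    <!-- fixed: request it over HTTPS instead -->
    <img src="https://images.example.com/logo.png" alt="logo">
    <!-- or use a protocol-relative URL, which follows the protocol of the page -->
    <img src="//images.example.com/logo.png" alt="logo">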
If the external site an element is requested from doesn't use an SSL certificate, the only way to avoid the message is to copy the external content over to your site so your code can refer to it locally via HTTPS.
* Firefox also knows mixed active content, which is blocked by default, but that's another story.

Jürgen Thelen's answer is absolutely correct. If the images displayed on the page (quite often the case) are served over "http", the result will be exactly as described no matter what kind of cert you have, EV or not. This is very common on e-commerce sites because of the way they're constructed. I've encountered this before on my own site AND CORRECTED IT by simply making sure that no images have an "http" address - and this was on a site that did not have an EV cert. Use the Ctrl+Shift+K process that Jürgen describes and it will point you to the offending objects. If the path to an object is hard-coded and the image resides on your server (not called from somewhere else), simply remove the "http://servername.com" part and change it to a relative path instead. Correct that and the problem will go away. Note that the problem may be in one of the configuration files as well, such as one of the config.php files.
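A sketch of that change (server name and path are placeholders):
    <!-- hard-coded absolute URL, requested over plain http -->
    <img src="http://servername.com/images/header.png">
    <!-- relative path: the browser reuses the page's own scheme and host -->
    <img src="/images/header.png">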
The real issue is that Firefox's error message is misleading and has nothing to do with whether the SSL cert is an EV cert or not. It really means there is mixed content on the page, but it doesn't say that. A couple of weeks ago I had a site with the same problem and Firefox displayed the no-identity message. Chrome, however (which I normally don't use), displayed an exclamation mark instead of a lock. I clicked on it and it said the cert was valid (with a green dot), that the connection was secure (another green dot), AND that there was "Mixed Content. The site includes HTTP resources", which was entirely accurate and the source of the problem (with a red dot). Once the offending paths were changed to relative paths, the error messages in both Firefox and Chrome disappeared.

For me, it was a problem of mixed content. I forced everything to make HTTPS requests on the page and it fixed the problem.
For people who come here from a Google search: you can use Cloudflare's (free) page rules to accomplish this without touching your source code. Use the "Always Use HTTPS" setting for your domain.
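If you control the markup and would prefer not to rely on Cloudflare, the upgrade-insecure-requests Content-Security-Policy directive asks supporting browsers to rewrite http:// subresource requests to https:// before sending them - for reference, a minimal sketch:
    <meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">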

You can also transform http links into https links using the URL shortener www.tr.im. That is the only URL shortener I found that provides shortened links over https.
You just have to change the link manually from http://tr.im/xxxxxx to https://tr.im/xxxxxx.

Related

Behavior of JS/CSS from unsecured CDN

When using JS/CSS from an unsecured CDN in an https page,
A. Some pages block loading the js/css, and cause runtime errors because the js code is missing.
B. Some pages do not block loading the js/css, and the pages are shown, but as entirely insecure content.
What is the difference between these behaviors?
Even when using the same browser (I'm using Chrome 51.0.2704.103 (64-bit) on Mac OS X) and viewing the same page, the behavior sometimes changes...
Do some response headers of index.html or similar control this behavior?
Does anyone know about this?
Example:
A friend of mine created the page https://cfn-iot-heatmap.herokuapp.com/; before, this page behaved like A, and its contents were totally whited out.
In this case, the insecure CDN contents are:
https://cdn.leafletjs.com/leaflet-0.6.4/leaflet.js
https://cdn.leafletjs.com/leaflet-0.6.4/leaflet.css
I got the source code of this page and deployed it to my own Heroku app, https://kinkyujitai.herokuapp.com/, which behaves like B.
Curiously, after I deployed my repository, my friend's page also started behaving like B: it shows a security warning, but the content is displayed.
I would like to know the reason for this phenomenon...
From a secure (https) origin, you should always include secure elements.
If you don't, the browser can block the insecure requests and/or remove the visual indication of security.
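As a rough illustration with the Leaflet CDN files mentioned in the question (whether an insecure script is blocked outright or merely flagged depends on the browser and version):
    <!-- mixed active content: script/stylesheet requested over plain http from an https page -->
    <script src="http://cdn.leafletjs.com/leaflet-0.6.4/leaflet.js"></script>
    <link rel="stylesheet" href="http://cdn.leafletjs.com/leaflet-0.6.4/leaflet.css">
    <!-- requesting the same files over https keeps the page fully secure -->
    <script src="https://cdn.leafletjs.com/leaflet-0.6.4/leaflet.js"></script>
    <link rel="stylesheet" href="https://cdn.leafletjs.com/leaflet-0.6.4/leaflet.css">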

The site uses SSL, but Google Chrome has detected insecure content on the page

I'm using SSL on my website and it is giving me the lock with a yellow triangle icon ("The site uses SSL, but Google Chrome has detected insecure content on the page.")
On clicking the lock icon it says:
Your connection to domainname is encrypted with 256-bit encryption. However, this page includes other resources which are not secure. These resources can be viewed by others while in transit, and can be modified by an attacker to change the look of the page. The connection uses TLS 1.0. The connection is encrypted using AES_256_CBC, with SHA1 for message authentication and DHE_RSA as the key exchange mechanism. The connection is not compressed.
How do I ensure I get the green lock?
You must have resources (images, stylesheets, scripts, etc.) which are embedded on the page but are not served over https. Make sure all your resources are served over https, and that warning should go away.
I had the same problem, and it occurred because I included a script from Google Analytics using HTTP.
With a provider like Google, one can simply change HTTP to HTTPS - and it will work. This will not work with all providers.
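For reference, the asynchronous ga.js snippet Google published at the time picked the host to match the page's protocol, which avoids the problem entirely; roughly (quoted from memory, so treat it as a sketch and check the provider's current documentation):
    // Classic asynchronous ga.js loader (approximate): it chooses the
    // http or https host to match the protocol of the current page.
    (function() {
      var ga = document.createElement('script');
      ga.type = 'text/javascript';
      ga.async = true;
      ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
               + '.google-analytics.com/ga.js';
      var s = document.getElementsByTagName('script')[0];
      s.parentNode.insertBefore(ga, s);
    })();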
If you are trying to load something from a website that you own, you will have to secure that website with HTTPS.
Google Chrome will detect this and automatically not load the insecure content (from the HTTP domain), which may take away some functionality from the website.
Certain AV/malware software will also detect this and give a security warning, which may frighten your visitors away.
If you are using Google Chrome, then you might not notice such a warning, because the AV/malware software never sees the HTTP link, since it is blocked by Google Chrome.
And if you do not have the kind of AV/malware software that detects this, then you may never notice such a warning, while your visitors do.
What you must do is:
Install Google Chrome and go to the website.
Click on "Tools >> JavaScript Console" and see if any warning appears. (this has been commented by Brad Koch on the question as well)
Go throught the different pages on your website and see if any errors appear - if so, then go change the URLs to HTTPS (if this is possible) or find another provider for this javascript.
Make sure all references to resources such as images, JS files, CSS files, ads, etc. are served through https. If the URI of a resource is relative, e.g. /images/logo.png, then the resource is fetched from the same host, port and protocol as the page itself - in your case https. I would use Fiddler to find out which files get fetched over http:// when the page is loaded.
This is what I get when I go to Tools and then click on JavaScript Console:
Failed to load resource chrome://thumb/https://accounts.google.com/ServiceLogin?service=chromiumsyn...
s%3A%2F%2Fwww.google.com%2Fintl%2Fen-US%2Fchrome%2Fblank.html%3Fsource%3D1
Failed to load resource chrome://thumb/http://www.xe.com/
What do I do after this?

Firefox or Chrome plugin to block and filter all outgoing connections

In Firefox or Chrome I'd like to prevent a private web page from making outgoing connections, i.e. if the URL starts with http://myprivatewebpage/ or https://myprivatewebpage/ in a browser tab, then that browser tab must be restricted so that it is allowed to load images, CSS, fonts, JavaScript, XmlHttpRequests, Java applets, flash animations and all other resources only from http://myprivatewebpage/ or https://myprivatewebpage/. For example, an <img src="http://www.google.com/images/logos/ps_logo.png"> (or the corresponding new Image(...) in a <script>) must not be able to load that image, because it's not on myprivatewebpage. I need a 100% foolproof solution: not even a single resource outside myprivatewebpage may be accessible, not even with low probability. There must be no resource-loading restrictions on web pages other than myprivatewebpage, e.g. http://otherwebpage/ must still be able to load images from google.com.
Please note that I assume that the users of myprivatewebpage are willing to cooperate to keep the web page private unless it's too much work for them. For example, they would be happy to install a Chrome or Firefox extension once, and they wouldn't be offended if they see an error message stating that access is denied to myprivatewebpage until they install the extension in a supported browser.
The reason why I need this restriction is to keep myprivatewebpage really private, without exposing any information about its use to the webmasters of other web pages. If http://www.google.com/images/logos/ps_logo.png were allowed, then the use of myprivatewebpage would be logged in the access.log for Google's ps_logo.png, so Google's webmasters would have some information about how myprivatewebpage is used, and I don't want that. (In this question I'm not interested in whether the restriction is reasonable; I'm only interested in the technical solutions and their strengths and weaknesses.)
My ideas how to implement the restriction:
Don't impose any restrictions, just rely on the same-origin policy. (This doesn't provide the necessary protection; the same-origin policy lets all images pass through.)
Change the web application on the server so it generates HTML, JavaScript, Java applets, flash animations etc. which never attempt to load anything outside myprivatewebpage. (This is almost impossible to make foolproof everywhere on a complicated web application, especially with user-generated content.)
Over-sanitize the web page using an HTML output filter on the server, i.e. remove all <script>, <embed> and <object> tags, restrict the target of <img src=, <link rel=, <form action= etc. and also restrict the links in the CSS files. (This can prevent all unwanted resources if I manage to remember all HTML tags properly, e.g. I mustn't forget about <video>. But it is too restrictive: it removes all dynamic web page functionality like JavaScript, Java applets and flash animations; without these most web applications are useless.)
Sanitize the web page, i.e. add an HTML output filter into the webserver which removes all offending URLs from the generated HTML. (This is not foolproof, because there can be a tricky JavaScript which generates a disallowed URL. It also doesn't protect against URLs loaded by Java applets and flash animations.)
Install a HTTP proxy which blocks requests based on the URL and the HTTP Referer, and force all browser traffic (including myprivatewebpage, otherwebpage, google.com) through that HTTP proxy. (This would slow down traffic to other than myprivatewebpage, and maybe it doesn't protect properly if XmlHttpRequest()s, Java applets or flash animations can forge the HTTP Referer.)
Find or write a Firefox or Chrome extension which intercepts all outgoing connections, and blocks them based on the URL of the tab and the target URL of the connection. I've found https://developer.mozilla.org/en/Setting_HTTP_request_headers and thinkahead.js in https://addons.mozilla.org/en-US/firefox/addon/thinkahead/ and http://thinkahead.mozdev.org/ . Am I correct that it's possible to write a Firefox extension using that? Is there such a Firefox extension already?
Some links I've found for the Chrome extension:
http://www.chromium.org/developers/design-documents/extensions/notifications-of-web-request-and-navigation
https://groups.google.com/a/chromium.org/group/chromium-extensions/browse_thread/thread/90645ce11e1b3d86?pli=1
http://code.google.com/chrome/extensions/trunk/experimental.webRequest.html
As far as I can see, only the Firefox or Chrome extension is feasible from the list above. Do you have any other suggestions? Do you have some pointers how to write or where to find such an extension?
I've found https://developer.mozilla.org/en/Setting_HTTP_request_headers and thinkahead.js in https://addons.mozilla.org/en-US/firefox/addon/thinkahead/ and http://thinkahead.mozdev.org/ . Am I correct that it's possible to write a Firefox extension using that? Is there such a Firefox extension already?
I am the author of the latter extension, though I have yet to update it to support newer versions of Firefox. My initial guess is that, yes, it will do what you want:
User visits your web page without the plugin. The web page contains a ThinkAhead block that would send a simple version header to the server, but this is ignored as the plugin is not installed.
Since the server does not see that header, it redirects the client to a page to install the plugin.
User installs plugin.
User visits web page with plugin. Page sends version header to server, so server allows access.
The ThinkAhead block matches all pages that are not myprivatewebpage, and does something like set the HTTP status to 403 Forbidden. Thus:
When the user visits any webpage that is in myprivatewebpage, there is normal behaviour.
When the user visits any webpage outside of myprivatewebpage, access is denied.
If you want to catch bad requests earlier, instead of modifying incoming headers, you could modify outgoing headers, perhaps screwing up "If-Match" or "Accept" so that the request is never honoured.
This solution is extremely lightweight, but might not be strong enough for your concerns. This depends on what you want to protect: given the above, the client would not be able to see blocked content, but external "blocked" hosts might still notice that a request has been sent, and might be able to gather information from the request URL.
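For comparison, here is a rough sketch of the other route mentioned in the question - a Chrome (Manifest V2) extension using the blocking webRequest API. The private-page prefixes are placeholders, the manifest would need the webRequest, webRequestBlocking, tabs and <all_urls> permissions, and it deliberately ignores edge cases (worker requests, prefetching, the race around the initial navigation), so it is not the "100% foolproof" solution the question asks for:
    // background.js - rough sketch, Manifest V2 only (blocking webRequest
    // is not available to ordinary extensions under Manifest V3).
    var PRIVATE_PREFIXES = ["http://myprivatewebpage/", "https://myprivatewebpage/"];
    function isPrivate(url) {
      return PRIVATE_PREFIXES.some(function (prefix) {
        return url.indexOf(prefix) === 0;
      });
    }
    // Track the top-level URL of every tab.
    var tabUrls = {};
    chrome.tabs.onUpdated.addListener(function (tabId, changeInfo, tab) {
      if (tab && tab.url) tabUrls[tabId] = tab.url;
    });
    chrome.tabs.onRemoved.addListener(function (tabId) {
      delete tabUrls[tabId];
    });
    chrome.webRequest.onBeforeRequest.addListener(
      function (details) {
        // Allow top-level navigations so the user can still leave the private page.
        if (details.type === "main_frame") return {};
        var tabUrl = tabUrls[details.tabId];
        // If the tab currently shows the private page, cancel every subresource
        // request that does not target the private page itself.
        if (tabUrl && isPrivate(tabUrl) && !isPrivate(details.url)) {
          return { cancel: true };
        }
        return {};
      },
      { urls: ["<all_urls>"] },
      ["blocking"]
    );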

How to identify mixed content in https website

I've inherited an ASP.NET web site that has an SSL certificate bought via GoDaddy.
The problem is that the certificate seems to be invalid because of some "mixed content/resources" (I think that's how it's called) coming from http sites.
Chrome is showing a red cross over the lock next to https, meaning it's unsecured. The popup says the following:
Clicking "What do these mean?" leads to a help page which says:
The [crossed-lock] icon appears when Google Chrome detects high-risk mixed content, such as JavaScript, on the page or when the site presents an invalid certificate.
The certificate is correct and valid, because I tried creating a blank "Hi world" .aspx page and it shows the green lock with no problems.
Reading a little bit, I found that I should only include images and JavaScript coming from https sites. The only thing coming from http was the AddThis widget, but they support https, so I changed it to https, but it's still saying that the page is unsecured.
I've searched for anything else coming from http in the source, but didn't find anything.
Is there some way (site, chrome extension, firefox extension, whatever) that will show exactly which are the resources that are "unsecured"?
I've never dealt with SSL/HTTPS certificates, but I need to fix this issue asap.
Check your site in http://www.whynopadlock.com, which will give you a list of the URLs that are not considered secure by your browser.
Check the Chrome console.
You will get messages like this:
The page at https://xys displayed insecure content from http://asdasda.png.
Change those http resources to https and it will work.
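As a rough sketch, you can also paste something like this into the console on the https page to list statically referenced resources that still use plain http (it won't catch resources referenced from CSS or created dynamically by scripts):
    // List src/href/data attributes that still point at plain http://
    Array.prototype.slice.call(
      document.querySelectorAll('img, script, link, iframe, audio, video, source, object, embed')
    ).map(function (el) {
      return el.src || el.href || el.data || '';
    }).filter(function (url) {
      return url.indexOf('http://') === 0;
    });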
I've found the problem using the Chrome Developer Tools. It was a JS file that embeds a Flash movie from a third-party site over http.
Are you on Windows? Download and run Fiddler while browsing the site, and watch for HTTP connections.
Mixed content means the contents of a web page are mixed, containing both HTTP and HTTPS links.
These links include your JS, CSS, images, video, audio, iframes etc.
If your website is enabled for HTTPS (an SSL certificate is installed), make sure you serve only HTTPS content throughout your web pages.

How to detect which content is not secured on a mixed content SSL page?

I've added an SSL certificate to an existing site, and now in IE I get a mixed content warning. Problem is, I don't know what non-secure content IE is warning me about. It's a simple HTML page with a few Flash movies, a few images, a loaded CSS and a JS file.
How can I find out what the non-secure content is?
Edit:
I found the culprit: it's the JS file AC_RunActiveContent.js used to display the Flash movie. So does anyone have an idea how to prevent the SSL mixed content warning when using AC_RunActiveContent.js?
This means that something is requesting content using the http protocol specifically, or you have an absolute path to an image or other content that begins with http instead of https.
A few tips: Use relative paths everywhere you can. If you must use an absolute path, and it's to a server you own, use https. If you're loading stuff from off your site, you're probably stuck with the mixed-content warning.
This also goes for your scripts; check the JS and the CSS template and make sure they're not the guilty parties - if they are, change them to use relative paths, or to request items via https instead of http (assuming you're positive that the server they're referencing supports https; if it doesn't, you're stuck).
There are a few other details, this might be helpful.
OK, so here is the solution to my particular problem. It was the codebase value in my code that needed to be https as well (I didn't think it would trigger the warning, as my Flash movies were displaying correctly, oh well)...
AC_FL_RunContent( 'codebase','https://download.macromedia.com/pub/shoc...
Link to Adobe info on this: Security Information error in Internet Explorer
I use the Firefox console -- it reports the http resources it blocks from fetching on a mixed content page.
Search your source for "http:" only (as opposed to "https:"). Another great tool to help you out is Fiddler, with which you can see what's getting downloaded when your page is requested.
