IE's security alert removal - security

I have a site that can be accessed both through http (http://mysite.com) and https (https://mysite.com). The https version holds secured content, while the http content is for public use. Both contents are on the same server.
Some of the https pages contain elements, such as images, that are hosted on the http side. So when someone visits the https site, IE's security alert pops up saying that the page contains nonsecure content. Knowing that there is no real risk anyway, I want to stop that popup. Is this something that can only be done through IE's settings on the client side, or do I need to do something about the SSL certificate and configuration? Any guidance is highly appreciated.

IE isn't the only browser that will give a popup of that nature. From memory, Firefox and Chrome have similar warnings (like they remove the padlock, or make the https go red in the address bar, instead of green).
The only way to make it go away is to reference only https resources in https pages. Have you checked whether the resources you're referencing have secure versions? All of the tools I have used generally have an equivalent https:// URL.

Please understand that the alert is there for a reason. The SSL certificate in place protects against man-in-the-middle attacks*. If you load resources from non-https sources, the man-in-the-middle protection you would otherwise have is lost. The user's data may still be encrypted, but that doesn't matter much if the user is actually sending it all to an attacker's computer that decrypts it itself!
You need to remember that HTTPS is an all-or-nothing scheme. As soon as you introduce a non-HTTPS element into your page, you have essentially lost all the security that SSL has to provide.
Please mount your resources (or somehow make them available) on both HTTP and HTTPS URLs and load them accordingly. If you don't, you are putting your users at unnecessary risk.
*only if you have a fully valid SSL certificate.

Related

If I use HTTP for part of my website and HTTPS for another part, does this open up any security issues?

I have a node.js app.
I have it configured to redirect everything to https from http.
But I was wondering whether the extra work of making the normal pages visible over http and the logged-in pages visible only via https would be worth the effort.
Does having both in my app expose any security holes?
Yes, multiple, including:
Cookies are shared between the two sites unless you remember to include the "secure" attribute each time you set a cookie.
You are vulnerable to MITM attacks (e.g. replacing a "login" link on http to either keep you on http or redirect you to another site instead).
Resources need to be loaded over https on the secure site or you will get mixed security warnings. It's easy to miss this when running mixed sites.
Users will not know whether pages should be secure or not.
You can forget to renew the cert and/or users may see cert errors, but this should be more obvious if the whole site is https.
Cannot use advanced security features like HSTS.
And that's just off the top of my head.
Go https everywhere and redirect all http traffic to https, unless you have a good reason not to (a minimal sketch of that setup follows below).
There are other benefits too (user confidence, looks more professional, a small SEO boost, Google treats http and https as two separate sites so an all-https site is easier to manage, Chrome will soon block access to some features like location tracking on http, you cannot upgrade to HTTP/2 until you implement https... etc.).
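If the app in question were an Express-based Node.js app (an assumption; the question only says node.js), a minimal sketch of the "https everywhere" setup might look like the following. The certificate paths, port numbers and cookie name are placeholders, not part of the original question:

```typescript
// Minimal sketch: serve everything over HTTPS, redirect plain HTTP,
// send an HSTS header, and mark the session cookie as secure.
import express from "express";
import fs from "fs";
import http from "http";
import https from "https";

const app = express();

// HSTS: tell browsers to use HTTPS for all future visits.
app.use((req, res, next) => {
  res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
  next();
});

app.get("/", (req, res) => {
  // secure: never sent over plain HTTP; httpOnly: not readable from page scripts.
  res.cookie("session", "opaque-session-id", { secure: true, httpOnly: true });
  res.send("Hello over HTTPS");
});

// The real site, over TLS.
https
  .createServer(
    { key: fs.readFileSync("privkey.pem"), cert: fs.readFileSync("cert.pem") },
    app
  )
  .listen(443);

// A plain HTTP listener whose only job is to redirect to HTTPS.
http
  .createServer((req, res) => {
    res.writeHead(301, { Location: `https://${req.headers.host}${req.url}` });
    res.end();
  })
  .listen(80);
```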

What's the point of the X-Frame-Options header?

I work on an application where users can embed their website within surrounding content by loading it in an iframe. This obviously relies on the X-Frame-Options header not being set on the user's website in order to work. I was asked by a client to create a reverse proxy because they didn't want to remove the X-Frame-Options header from their site, citing security concerns.
I set up the proxy and everything works, but what's the point of the X-Frame-Options header if it's as simple as creating a proxy to circumvent it?
I understand the header exists to prevent clickjacking, but if anyone can just make a proxy to work around it... does it really increase security?
I don't come from the enterprise dev world; can you help me understand the reasoning behind why the IT department would be resistant to removing the header?
I noticed google.com and facebook.com also set the header, so it can't be completely pointless can it?
Thanks
Any site served over http can have its content altered, for example by using a proxy. So yes, the header is fairly pointless on http sites, since it's so easily defeated.
Serving a site over https prevents this unless you have a proxy server which also intercepts https traffic. This is only possible by the proxy acting as a man-in-the-middle (MITM): it decrypts the traffic at the proxy and then re-encrypts it to send on to the server, and does the same on the way back. For this to work, the proxy server either needs to know the server's private key or, more likely, replaces the cert presented to its client with its own copy.
While MITM is usually associated with attacks, there are some legitimate scenarios (though many argue even these are not legitimate and https should be secure!):
Anti-virus software can do this to scan requests to protect your computer. If you run Avast, for example, and have SSL scanning turned on (I think it's on by default), go to https://www.google.com and look at the cert: you will notice it has been issued by Avast instead of by Google as usual. To do this, the antivirus software has to install an issuer certificate on your PC, from which it can issue these replacement certs, which your browser will still accept as real certs. Installing this issuer cert requires Admin access, which you temporarily grant when installing the anti-virus software.
Corporate proxies do something similar to allow them to monitor https traffic from their employees. Again, it requires an issuer cert installed on the PC using admin rights.
So basically it's only possible to use a proxy like you suggest for https traffic if you have, or have had in the past, admin rights on the PC - in which case all bets are off anyway.
The only other way to do this is to keep traffic on http using a proxy. For example, if you request www.google.com then this normally redirects to https://www.google.com, but your proxy can intercept that redirect and instead keep the client->proxy connection on http, allowing the proxy to strip headers out of the responses. This depends on users not typing https and not noticing there is no green padlock, and it can be defeated with technologies like HSTS (which is automatically preloaded in some browsers for some sites like google.com). So it's not a really reliable way to intercept traffic.
Many secure sites use X-Frame-Options to prevent clickjacking
Clickjacking
Clickjacking Defense Cheat Sheet
This prevents attackers from using transparent layers to trick users into performing actions they are unaware of, on sites they didn't even know they had loaded. Furthermore, this attack only works when the frame is served directly from the attacked/victim site's domain in the user's browser.
You may think that you can just reverse proxy the site and remove the frame-busting header. But your proxy is not receiving or sending the end user's cookies to the victim site. These secure sites rely on an active session, and as such will interpret the request from the proxy as coming from an unauthenticated user, completely defeating the point of the clickjacking attack.
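For reference, setting the header on the serving side is a one-liner. As a hedged sketch (assuming a Node/Express stack purely for illustration; the question doesn't say what the client's site runs on):

```typescript
// Illustrative only: refuse framing from other origins.
import express from "express";

const app = express();

app.use((req, res, next) => {
  // "DENY" blocks all framing; "SAMEORIGIN" allows frames from the same origin.
  res.setHeader("X-Frame-Options", "SAMEORIGIN");
  next();
});

app.get("/", (req, res) => {
  res.send("This page can only be framed by pages from the same origin.");
});

app.listen(3000);
```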

Prevent https access to domain

We're seeing some minor issues with a cache and https on a non-secure site/domain.
The site doesn't have a certificate and therefore is not served over https. URLs are generated dynamically on the site, though, so if someone manually visits the site using https, https ends up in the generated URLs - this is a problem for images and external resources, as they then get blocked.
On an individual basis this doesn't worry me; the real issue is that if the above happens on a page that hasn't yet been cached, the cached copy retains the https URLs and serves https resources to normal visitors, leaving all external resources blocked. We've just seen this happen on a set of pages.
I have no idea why someone would be attempting to access the site using https, but unfortunately it seems to happen.
Is there any way to route all https traffic directly to http? It's a frustrating issue, as the site doesn't purport to be secure and doesn't have a certificate, so security issues like this are meaningless - ideally https requests would just be served as http.
Any help appreciated!
Edit:
It turns out, after some research, that this may not be possible. The method suggested in the comments should work (with the addition of an on/off flag for the condition); however, it only works if SSL is available, i.e. if https isn't working in the first place, you don't seem to be able to build conditions around it.

Is there any reason not to serve https content on a page served over http?

I currently have image content being served on a domain that is only accessible over https. What is the downside of serving an image with an https path on a page accessed over http? Are there any caching considerations? I'm using an HttpRuntime.Cache object to store the absolute image path, which is retrieved from a database.
I assume there is no benefit to using protocol-relative URLs if the image is only accessible over https?
Is there a compelling reason why I should set up a separate virtual directory to also serve the image content over http?
If the content served over HTTPS within the HTTP page isn't particularly sensitive and could equally be served over HTTP, there is no real downside (perhaps a small performance cost and a possible lack of caching, depending on how your server is configured: you can cache some HTTPS content).
If the content served over HTTPS is sufficiently sensitive to motivate the use of HTTPS, this is really bad practice.
Checking that HTTPS is used and used correctly is solely the responsibility of the client and its user (this is why automatic redirections from HTTP to HTTPS are only partly useful, for example). Although some of it has to do with the technicalities of certificate verification, a lot of the security offered by HTTPS comes from the fact that the user:
expects to be using HTTPS (otherwise they could easily be downgraded),
is able to verify the validity of the certificate: green/blue bar, corresponding to the host name on which they expect to be.
The first point can be addressed by HTTP Strict Transport Security, from a technical point of view.
The second needs user interaction. If you go to your bank's website, it must not only be a site with a valid certificate, but you should also check that it's indeed your bank's domain name, for example.
Embedding HTTPS content in an HTTP page defeats this, since the user can't check which site is being used, or indeed that HTTPS is used at all. To some extent, embedding HTTPS content from a third party in an HTTPS page presents the same problem (this is one of the problems with 3-D Secure: it may well be served over HTTPS, but because it sits in an iframe, the user can't see which site is actually being used).

What are the pros and cons of a 100% HTTPS site?

First, let me admit that what I know about HTTPS is pretty rudimentary. I don't know much about session security, encryption, or how either of those things is supposed to be done.
What I do know is that web security is important; that horror stories of XSS, CSRF, and database injections pop up over and over again. I know that a preventative stance against such exploits is better than a reactive one.
But the motivation for this question comes from a different point of view. I work at a site that regularly accepts payment from users. Obviously, the payments are sent over a secure channel (HTTPS). I mainly work on the CSS, HTML, and JavaScript of the site. What I've been told is that it is necessary to duplicate CSS, JavaScript, and image files before they can be called over HTTPS. So assume I have the following files:
css/global.css
js/global.js
images/
logo.png
bg.png
The way I understand it, these files need to be duplicated before they can be "added" to the HTTPS. So a file can either be under security (HTTPS) or not.
If this is true, then this is a major hindrance. In even the smallest site, it would be a major pain to duplicate files and then have to maintain them every time you make a CSS or JS change. Obviously this could be alleviated by moving everything into the HTTPS.
So what I want to know is, what are the pros and cons of a site that is completely behind HTTPS? Does it cause noticeable overhead? Is it just foolish to place the entire site under encryption? Would users feel safer seeing the "secure" notifications in their browser during their entire visit? And last but not least, does it truly make for a more secure site? What can HTTPS not protect against?
You can serve the same content via HTTPS as you do via HTTP (just point it to the same document root).
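As a hedged illustration of "point it to the same document root" (assuming a Node/Express static site purely for the example; the same idea applies to Apache or IIS virtual hosts sharing one directory):

```typescript
// Sketch: one set of files in ./public is served identically over HTTP and
// HTTPS, so nothing has to be duplicated. Certificate paths are placeholders.
import express from "express";
import fs from "fs";
import http from "http";
import https from "https";

const app = express();
app.use(express.static("public")); // css/, js/ and images/ live here once

http.createServer(app).listen(80);
https
  .createServer(
    { key: fs.readFileSync("privkey.pem"), cert: fs.readFileSync("cert.pem") },
    app
  )
  .listen(443);
```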
Cons that may be major or minor, depending:
serving content over HTTPS is slower than serving it via HTTP.
certificates signed by well-known authorities can be expensive
if you don't have a certificate signed by a trusted authority (eg, you sign it yourself), visitors will get a warning
Those are pretty basic, but just a few things to note. Also, personally, I feel much better seeing that the entire site is HTTPS if it's anything related to financial stuff, obviously, but as far as general browsing, no, I don't care.
Noticeable overhead? Yes, but that matters less and less these days as clients and servers are much faster.
You don't need to make a copy of everything, but you do need to make those files accessible via HTTPS. Your HTTPS and HTTP services can use the same doc root.
Is it foolish to put the whole site under encryption? Typically no.
Would users feel safer? Probably.
Does it truly make for a more secure site? Only when dealing with the communication channel between the client and the server. Everything else is still up for grabs.
You've been misinformed. The css, js, and image files need not be duplicated assuming you've set up the http and https mapping to point to the same physical website on the server. The only important thing is that these files are referenced with https when the page you're looking at is also under https. This will prevent the dreaded security message that says that some objects on the page are not secured.
For every other page where you're running the site under http (unsecured) you can reference those same files in the same locations, but with an http address.
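As a small sketch of that idea (the helper name and host below are made up for illustration; a protocol-relative reference such as //example.com/css/global.css achieves the same thing with no code at all):

```typescript
// Hypothetical helper: build a resource URL whose scheme matches the page it
// is embedded in, so a single copy of each file serves both http and https pages.
function assetUrl(path: string, pageIsHttps: boolean): string {
  const scheme = pageIsHttps ? "https" : "http";
  return `${scheme}://example.com/${path}`;
}

// assetUrl("css/global.css", true)  -> "https://example.com/css/global.css"
// assetUrl("css/global.css", false) -> "http://example.com/css/global.css"
```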
To answer your other question, there would indeed be a performance penalty to put the entire site under https. The server has to work hard to encrypt everything it sends over the wire. And then some not-so-old browsers won't cache https content to disk by default, which of course will result in an even heavier load on the server.
Because I like my sites to be as responsive as possible, I'm always selective about which sections of a site I choose to be SSL-encrypted. In most typical e-commerce sites, the only pages that need SSL encryption are the login, registration, and checkout pages.
The traditional reason for not having the entire site behind SSL is processing time. It does take more work for both the client and the server to use SSL. However, this overhead is fairly small compared to modern processors.
If you are running a very large site, you may need to scale slightly faster if you are encrypting everything.
You also need to buy a certificate, or use a self signed one which may not be trusted by your users.
You also need a dedicated IP address. If you are on a shared hosting system, you need to have an IP that you can dedicate to only having SSL on your site.
But if you can afford a certificate and a dedicated IP, and don't mind needing a slightly faster server, using SSL on your entire site is a great idea.
With the number of attacks that SSL mitigates, I would say do it.
You do not need multiple copies of these files for them to work with HTTPS. You may need two copies of these files if the hosting has been configured such that you have a separate https directory. So to answer your question - no, duplicate files are not required for HTTPS, but depending on the web hosting configuration, they may be.
In regards to the pros and cons of https vs http there are already a few posts addressing that.
HTTP vs HTTPS performance
HTTPS vs HTTP speed comparison
HTTPS only encrypts the data between the client computer and the server. It does not fix software holes or issues such as remote javascript includes. HTTPS doesn't make your application better - it only helps secure the data between the user and your app. You still need to make sure your app has no security holes: filter all incoming data, be careful with SQL, and review security logs frequently.
However, if you're only responsible for the frontend part of the site, I wouldn't worry about it too much, but I would raise the security concerns with the main developer responsible for the backend.
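To make the "filter all data and be careful with SQL" advice a little more concrete, here is a hedged sketch of a parameterized query (assuming a Node app using the node-postgres `pg` client, which the answer does not mention; the table and column names are invented):

```typescript
// Sketch only: parameterized queries keep user input out of the SQL text,
// which is the standard defence against SQL injection.
import { Pool } from "pg";

const pool = new Pool(); // connection settings come from environment variables

async function findUserByEmail(email: string) {
  // $1 is a placeholder; the driver sends `email` separately from the SQL text.
  const result = await pool.query(
    "SELECT id, name FROM users WHERE email = $1",
    [email]
  );
  return result.rows[0];
}
```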
One concern is that https traffic can be blocked outright. For example, on Apple computers, turning on parental controls blocks https traffic because the content filter can't read the encrypted content; you can read about it here:
http://support.apple.com/kb/ht2900
https note: For websites that use SSL encryption (the URL will usually begin with https), the Internet content filter is unable to examine the encrypted content of the page. For this reason, encrypted websites must be explicitly allowed using the Always Allow list. Encrypted websites that are not on the Always Allow list will be blocked by the automatic Internet content filter.
An important "pro" for more https at your site is the following:
A user connecting through unencrypted WiFi, like at an airport, can submit their password over https, but if the site then switches back to http after the login page, the session cookie is exposed and can be used immediately by an eavesdropper.
See this article http://steve.grc.com/2010/10/28/why-firesheeps-time-has-come/#comment-2666
