I am using Google Translate on my website. After I updated to HTTPS, Google Translate stopped working. I even used https://www.google.com/jsapi instead of http://www.google.com/jsapi, but this didn't help.
It is because most browsers don't allow mixed content, i.e. calling an HTTP resource from an HTTPS site. However, you can force-enable it in your browser.
Using HTTPS for calling jsapi won't help in your case, as the real problem occurs when the service internally calls http://www.google.com/inputtools/try/.
In our company we need to implement a self-hosted REST service that has to be deployed on the client workstations so that our internal web applications can interact with it.
The web applications are served over HTTPS, and we are not currently using CSP headers.
Our concern is whether it's necessary to call the local service over HTTPS as well, or whether this can be avoided (so we can avoid managing a certificate to deploy on every single workstation).
We made some trials with Chrome and Edge and the AJAX calls seem to work over plain HTTP as well, but we would like to know whether that is actually supported.
Thank you!
On an HTTPS page, browsers will block HTTP content as mixed content; CSP will not change that. However, Chrome will allow mixed content on http://127.0.0.1 and http://localhost, while Firefox will allow it on http://127.0.0.1; see the note at https://developer.mozilla.org/en-US/docs/Web/Security/Mixed_content.
When you do implement CSP, you should include http://127.0.0.1 (or http://localhost) in the appropriate directive, along the lines of the example below.
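As a rough example, assuming the page calls the local service with XHR/fetch and the service listens on port 8080 (both the port and the rest of the policy are placeholders to adapt to your setup), the connect-src directive could look like this:

```
Content-Security-Policy: connect-src 'self' http://127.0.0.1:8080
```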
I am making a Chrome extension which makes an AJAX call to a local HTTP server. The local server is not HTTPS. My extension doesn't work when visiting an HTTPS site because of mixed-content rules.
This is disappointing because I thought the content scripts were totally isolated from the main DOM, so these rules wouldn't matter.
Is there a way to get around this?
You don't have to make the request from the content script itself.
You can delegate that to a background page by requesting it via Messaging, as sketched below.
Also, make sure you have host permissions for your local server. It may even solve the original issue.
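A minimal sketch of that message-passing approach, assuming a Manifest V2 background page and a hypothetical local endpoint at http://127.0.0.1:3000 (adjust the port, path, and message names to your server):

```js
// content-script.js: ask the background page to make the request on our behalf
chrome.runtime.sendMessage({ type: 'fetchLocal', path: '/data' }, (response) => {
  if (response && response.ok) {
    console.log('Local server replied:', response.body);
  } else {
    console.error('Local request failed:', response && response.error);
  }
});

// background.js: runs outside the page, so the page's mixed-content rules don't apply
chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
  if (message.type !== 'fetchLocal') return;
  fetch('http://127.0.0.1:3000' + message.path)
    .then((res) => res.text())
    .then((body) => sendResponse({ ok: true, body: body }))
    .catch((err) => sendResponse({ ok: false, error: String(err) }));
  return true; // keep the message channel open for the asynchronous sendResponse
});
```

The manifest would also need a matching host permission entry for the local server (for example "http://127.0.0.1/*") so the background fetch isn't blocked.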
We're seeing some minor issues with a cache and https on a non-secure site/domain.
The site doesn't have a certificate and therefore is not served over HTTPS. URLs are generated dynamically on the site, though, so if someone manually visits the site using HTTPS, HTTPS will be used in the relative URLs; this is an issue for images and external resources, as they then get blocked.
On an individual basis this doesn't worry me, the real issue is that if the above happens on a page that hasn't yet been cached, the cache reflects the https error in the URLs, and attempts to serve https resources to normal users, thereby leaving all external resources blocked. We've just seen this happen on a set of pages.
I have no idea why someone would be attempting to access the site using https, but unfortunately it seems to happen.
Is there any way to route all HTTPS traffic directly to HTTP? It's a frustrating issue, as the site doesn't purport to be secure and doesn't have a certificate, so security issues like this are meaningless; ideally, HTTPS requests would simply be served as HTTP.
Any help appreciated!
Edit:
It turns out, after some research, that this may not be possible. The method suggested in the comments should work (with the addition of an on/off flag for the condition); however, it only works if SSL is available, i.e. if HTTPS isn't working in the first place, it doesn't seem you can build conditions around it.
I've inherited an ASP.NET web site that has an SSL certificate bought via GoDaddy.
The problem is that the certificate seems to be invalid because of some "mixed content/resources" (I think that's what it's called) coming from HTTP sites.
Chrome is showing the red cross over the lock next to https, meaning it's unsecured. The popup says the following:
Clicking on "What do these mean?" goes here, which says:
The [crossed-lock] icon appears when Google Chrome detects high-risk mixed content, such as JavaScript, on the page or when the site presents an invalid certificate.
The certificate is correct and valid because I tried creating a blank "Hi world" .aspx page and it's showing the green lock with no problems.
Reading a little, I found that I should only include images and JavaScript coming from HTTPS sites. The only thing coming from HTTP was the AddThis widget, but they support HTTPS, so I changed it to HTTPS; it's still saying the page is unsecured, though.
I've searched for anything else coming from http in the source, but didn't find anything.
Is there some way (site, Chrome extension, Firefox extension, whatever) to show exactly which resources are "unsecured"?
I've never dealt with SSL/HTTPS certificates, but I need to fix this issue asap.
Check your site at http://www.whynopadlock.com, which will give you a list of the URLs that your browser does not consider secure.
Check the Chrome console.
You will see messages like this:
The page at https://xys displayed insecure content from http://asdasda.png.
Change those HTTP resources to HTTPS and the warning will go away.
I found the problem using the Chrome Developer Tools. It was a JS file that embeds a Flash object from a 3rd-party site over HTTP.
Are you on Windows? Download and run Fiddler while browsing the site, and watch for HTTP connections.
Mixed content means the contents of a web page are loaded over a mix of HTTP and HTTPS links.
These include your JS, CSS, images, video, audio, iframes, etc.
If your website is enabled for HTTPS (an SSL certificate has been installed), make sure you serve only HTTPS content throughout your web pages.
I have an interesting issue with HTTPS ports not being handled properly. It is a relatively small issue and I bet it is pretty simple to solve; I am just not thinking of it.
We have a website served with IIS 6, www.mylongdomainname.com. We have a secure portal which is handled via https://www.mylongdomainname.com. Now we have several vanity and marketing URLs that we use over the phone, like www.shortname.com, etc. I have two websites set up: one handles all requests with the host header www.mylongdomain.com and actually serves the website; the other accepts any traffic and permanently redirects to www.mylongdomain.com. This way, if we ever add any more domains, they will all end up at the one site, and it also redirects mylongdomain.com to www.mylongdomain.com.
Everything here works fine. The issue now is that when I google "shortname.com," the first result returned is the same as if I were googling "mylongdomain"; however, Google has been able to crawl the other pages via https://shortname.com and index them that way. We don't have SSL certificates for these other domains, so when you click through, you get a nasty un-trusted error.
This really wouldn't be an issue if we didn't use these URLs over the phone, and you all know how many people don't know the difference between the URL bar and a search box.
Any suggestions or tips?
I'd set up a redirect so that https://shortname.com is sent to http://shortname.com with a 301 (permanent) redirect. This will put an end to the nasty untrusted error immediately. Furthermore, this will also cause Google to slowly but surely update their index.
There are multiple ways to do this. If you're using IIS 7 you can use the URL Rewrite Module and write a redirect rule to take care of it, along the lines of the sketch below.
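A rough sketch of such a rule (assuming the shortname site has an HTTPS binding so the request actually reaches IIS; the rule name and patterns are placeholders):

```xml
<!-- web.config for the shortname site: send any HTTPS request to its HTTP equivalent -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="HttpsToHttp" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTPS}" pattern="on" />
        </conditions>
        <action type="Redirect" url="http://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```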
Or, if you're not on IIS 7, it may be perfectly acceptable to write some code to accomplish this. I wrote some ASP.NET code that I've used plenty of times to take care of this HTTP/HTTPS redirection. In your particular case you could simply take my code and call SetSSL(False) in the Application_BeginRequest function of your global.asax.