We have a CMS whose web interface is served over HTTPS. This works beautifully in Firefox, but when we load it in IE6 or IE7, it complains that "This page contains both secure and nonsecure items."
I've loaded the page in Firefox and checked with Firebug, and every connection seems to be going through HTTPS, as should be the case.
Is there any way to tell what is causing IE to throw this apparently spurious error?
Firefox has a number of bugs in mixed content detection. Generally you should try using Fiddler to spot insecure resources.
If you install a tool I wrote (www.bayden.com/dl/scriptfreesetup.exe), you will get a different mixed-content prompt that shows the exact URL of the first insecure resource on the page. That tool is basically a prototype, and you should uninstall it when you're done with it.
Use Fiddler to watch the traffic between the server and IE.
Be sure to go to Tools > Fiddler Options... > HTTPS and check 'Decrypt HTTPS traffic'.
Any non-HTTPS traffic generated between any server and IE should be easy to spot in the Web Sessions list.
I used Eric's tool (thanks Eric, you saved me hours...) and it turns out that IE6 treats a background image specified with a relative path as nonsecure content, even though it actually requests it over HTTPS. So if you're stumped, converting your relative paths to absolute ones might really help...
Are one or more resources (a CSS url() image reference is easily overlooked) pointing to a subdomain that isn't covered by the certificate (https://www.example.com vs https://static.example.com)?
If you can't see anything that isn't using SSL, then this is usually down to a broken SSL certificate somewhere. I don't know of anything off-hand that will tell you exactly what the problem is, but you can get a list of everything that's loaded easily enough.
The Media tab in Firefox's 'Page Info' dialog (right-click on the page) will do it. It might also be worth having a go with Fiddler (which is an excellent and extremely useful piece of software).
Our website was recently hacked (Joomla 1.5, hosted on a VPS). The attacker added a few PHP scripts that redirected to some ad sites. We have cleaned everything up (or at least we think we have), and now everything works as it should.
However, links from Google (or Yahoo) that point to our website still try to include these PHP scripts (and return 404, since they have now been deleted). Direct links entered in the browser work as they should.
We cleaned the site 10 days ago, so I do not think anything is still cached on Google's servers; re-indexing should be done by now.
To reproduce this behavior:
Go to www.google.com
type in "anitex socks"
click any PHP link that starts with "anitexsocks.com"
You will get "The requested URL /wp-includes/client.php was not found on this server" plus a 404 error
Refresh the page and everything works without issues
Why are only Google links causing trouble?
Any help is welcome. Thanks!
As for why this is happening: I installed a Firefox add-on that blocks my browser's Referer header, then followed a Google link to your site, and it worked fine. Then I disabled the add-on and the problem started occurring again.
This shows that there is still some malicious code running on your website that checks all HTTP requests to see whether they come from Google (based on the Referer header) and redirects them to /wp-includes/client.php if they do.
To determine where this code may lie, try performing a recursive grep through all the web files on your server, as well as your web server configuration files. Somewhere in there, there must still be a reference to that client.php script; hopefully you can find it and eliminate it.
That said, if it were my site and I knew a hacker had had free rein over my server to do whatever they wanted, I would not mess around with trying to undo the damage and would instead restore the most recent backup from before the site was hacked. You only have to miss one back door the hacker left in place and they can re-enter your site. After restoring backups, you should also upgrade or reconfigure the software they used to gain access in the first place so they can't simply re-hack it in the same manner.
I think this is related to IIS settings, but I don't know exactly what it is.
As you can see below, this login prompt pops up for each image: 8 images, 8 times, in Opera.
And the major browsers react to this page differently.
IE9 works fine (this is why I only found the problem now; it's an internal site and almost every user uses IE...).
Chrome (17.0.963.56 m) works fine.
Safari (5.1.2) is also fine.
Opera 11.61 has the problem I described...
And FF SHOWS NO IMAGES and doesn't even ask for a login. Firebug says "NetworkError: 404 Not Found!".
I don't know what's going on.
This site requires a login and it's internal, so I can't give you a link. Sorry for the inconvenience.
The site is running on Windows Server 2003, and the folder containing the images is shared for the web (I don't know why it's shared, but I don't want to change the setting). I don't know whether this may be causing the situation.
If Opera opens a user name/password dialog, the site is probably sending a WWW-Authenticate header in response to those image requests. You can open Opera's developer tools ("Tools > Advanced > Opera Dragonfly" or right-click in page and select "Inspect element") and use the network feature to inspect the full headers.
I don't know how you can disable this header if it is sent, it depends on the server settings and what type of server you're running, and I'm not at all familiar with Windows Server 2003.
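If the developer tools are awkward to use on that machine, another way to confirm this is to request one of the image URLs from a small program and print the response headers. This is only a sketch; the URL is a placeholder for one of your image paths:

```csharp
// Sketch: request an image URL directly and print the WWW-Authenticate
// header (if any) from the 401 response. The URL below is a placeholder.
using System;
using System.Net;

class AuthHeaderCheck
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create(
            "http://yourserver/images/pic1.jpg"); // placeholder URL

        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine("Status: " + (int)response.StatusCode);
            }
        }
        catch (WebException ex)
        {
            var response = ex.Response as HttpWebResponse;
            if (response != null)
            {
                Console.WriteLine("Status: " + (int)response.StatusCode);
                Console.WriteLine("WWW-Authenticate: " +
                    response.Headers["WWW-Authenticate"]);
            }
        }
    }
}
```

If the server is demanding authentication for the images, the WWW-Authenticate line will tell you which scheme (Basic, NTLM, Negotiate) IIS is using for that folder.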
We're building a mobile-friendly site to work in tandem with our client's MOSS 2007 internet site. We need to be able to redirect users who hit the home page and are using a mobile device.
Our original intention was to add a custom control to the home page's page layout that would detect the current user's device and redirect to the mobile site accordingly. We quickly realised that this would not work because we are using the Output Caching functionality provided by SharePoint/ASP.NET: the detection code only runs for the first visitor to the home page, and every subsequent visitor gets the cached response until the cache expires.
Our next idea was to build a custom HTTP module and do the detection there. However, we are finding that the Output Caching does not allow that either: if the cache is set while a mobile device is visiting, all browsers are subsequently redirected to the mobile site (until the cache expires).
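For context, this is roughly the kind of module we built (a simplified sketch; the home-page test, target URL and the use of Request.Browser.IsMobileDevice are placeholders for our actual setup rather than a general recipe):

```csharp
// Simplified sketch of the detection module described above. The paths,
// target URL and detection check are placeholders for our actual setup.
using System;
using System.Web;

public class MobileRedirectModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        application.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpContext context = ((HttpApplication)sender).Context;

        // Only act on the home page (assumed to be the site root here).
        if (!context.Request.Path.Equals("/", StringComparison.OrdinalIgnoreCase))
            return;

        // Request.Browser.IsMobileDevice is driven by the .browser capability files.
        if (context.Request.Browser != null && context.Request.Browser.IsMobileDevice)
        {
            context.Response.Redirect("http://m.example.com/", true); // placeholder URL
        }
    }

    public void Dispose() { }
}
```

The trouble is exactly as described: once output caching has stored a response produced during a mobile visit, every other visitor gets that cached result until it expires.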
If we turn off output caching it works just fine, but we cannot turn output caching off, especially for the home page. We did investigate Substitution (Donut) Caching, but this is not working because we are filtering the ASP.NET response within another HTTP module that tidies up the rendered HTML for XHTML compatibility reasons. I've also experimented with the output cache profile by setting its vary-by-header property to "User-Agent", but I am getting mixed results and am also concerned about the memory implications of caching multiple versions of pages (we already have memory issues now and then).
It's possible we could run the redirection code in JavaScript but then we risk not detecting a lot of devices that don't have JavaScript enabled. This is a government website so the usage of JavaScript has to abide by accessibility guidelines.
Does anyone have any other ideas as to how we can solve this issue? Has anyone done this before, perhaps in a different way?
Hope you can help, thanks.
p.s. I have also asked this question on SharePoint.SE but wanted to get as many eyes on this as possible.
I would suggest you try ISAPI filters.
I think I've actually solved this one. I've pretty much followed this article: http://msdn.microsoft.com/en-us/library/ms550239.aspx. We updated the code in that article to build a cache key based on whether the current page is the home page, whether the current user is on a mobile device, and whether a cookie exists forcing the user to the full site. I will probably write this up as a blog post; when I do, I will update this answer with a link.
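In case it helps anyone else, the cache-key logic ended up looking roughly like this (a sketch only; the custom key name, cookie name, home-page test and mobile check below are illustrative rather than our exact code):

```csharp
// Sketch of a custom cache key along the lines described above (used via
// the VaryByCustom mechanism the MSDN article covers). The key name,
// cookie name and checks are illustrative, not our exact code.
using System;
using System.Web;

public class Global : HttpApplication
{
    public override string GetVaryByCustomString(HttpContext context, string custom)
    {
        if (string.Equals(custom, "MobileHomePage", StringComparison.OrdinalIgnoreCase))
        {
            bool isHomePage = context.Request.Path.Equals("/",
                StringComparison.OrdinalIgnoreCase);
            bool isMobile = context.Request.Browser != null &&
                context.Request.Browser.IsMobileDevice;
            bool forceFullSite = context.Request.Cookies["ForceFullSite"] != null;

            // Each combination gets its own cache entry, so a response cached
            // for a mobile visitor is never replayed to a desktop browser.
            return string.Format("home={0}|mobile={1}|full={2}",
                isHomePage, isMobile, forceFullSite);
        }
        return base.GetVaryByCustomString(context, custom);
    }
}
```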
As a security measure, I want to hide the technology stack I am using on my server. What are effective ways to do this? I thought about:
1) Use mod_rewrite or the Rewrite Module to hide any page extensions like .php or .aspx
1b) Use mod_rewrite to serve a misleading extension on purpose, like disguising a PHP page as .aspx
2) Turn off all error reporting
2b) Throw misleading errors to go with 1b), making my PHP pages display ASP-like errors
This is an impossible task. You would have to modify the entire stack, in which case you will just have created new, buggy versions that you now have to keep in sync with the vendors' versions.
There's literally no way to do this without making your site less secure.
You can do lame stuff like removing X-Powered-By, or changing the session-generation scheme if it's something like ASP or PHP that has a known one. The fact is, it's not going to stop anyone who actually wants to know what you are running.
For basic examples (it goes much deeper than this), some web servers will accept any header, so I can send GET LOLOLOL HTTP/1.1 and it will still work. Some stacks will keep the session alive, some won't. You can also see what features are on the stack, since there are so many out there that no two stacks support exactly the same set.
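As a rough illustration of that kind of probe (a sketch only, with a placeholder host), you can send a deliberately odd request line over a raw socket and compare how different servers answer; the status line, error page and which headers come back all leak information about the stack:

```csharp
// Sketch: send a non-standard request line and dump the raw response.
// How a server reacts to this kind of junk is itself a fingerprint,
// regardless of which headers you have stripped. Host is a placeholder.
using System;
using System.IO;
using System.Net.Sockets;
using System.Text;

class ServerProbe
{
    static void Main()
    {
        const string host = "www.example.com"; // placeholder target

        using (var client = new TcpClient(host, 80))
        using (NetworkStream stream = client.GetStream())
        {
            string request = "GET LOLOLOL HTTP/1.1\r\n" +
                             "Host: " + host + "\r\n" +
                             "Connection: close\r\n\r\n";
            byte[] bytes = Encoding.ASCII.GetBytes(request);
            stream.Write(bytes, 0, bytes.Length);

            using (var reader = new StreamReader(stream, Encoding.ASCII))
            {
                Console.WriteLine(reader.ReadToEnd());
            }
        }
    }
}
```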
Don't show accurate page extensions
Don't use standard error pages
Make sure the web server and application layer (ASP.NET, PHP et al.) hide their presence in response headers
If your error pages do anything dynamic, assume that this can fail somehow and have a set of static HTML pages you can serve that aren't the web server's default ones
Make sure all of your technology stacks are configured not to show stack traces on any error pages served to machines other than the local server. If for some reason all your custom error pages fail, the user may see the technology stack, but they won't have a window into the underlying code
You can add fake Server: and/or X-Powered-By headers to the response, pretending that it was generated by a different server. (Or, Server: My Unhackable Server)
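If the stack happens to be ASP.NET on IIS, a hedged sketch of doing that from an HTTP module (the replacement value is arbitrary, and removing the Server header this way relies on the IIS 7+ integrated pipeline):

```csharp
// Sketch: strip the identifying headers and send a misleading Server value
// instead. Needs the IIS 7+ integrated pipeline; the fake value is arbitrary.
using System;
using System.Web;

public class HeaderMaskModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        application.PreSendRequestHeaders += OnPreSendRequestHeaders;
    }

    private void OnPreSendRequestHeaders(object sender, EventArgs e)
    {
        HttpResponse response = ((HttpApplication)sender).Response;

        response.Headers.Remove("Server");
        response.Headers.Remove("X-Powered-By");
        response.Headers.Remove("X-AspNet-Version");

        // Pretend to be something else entirely.
        response.Headers.Set("Server", "My Unhackable Server");
    }

    public void Dispose() { }
}
```

X-Powered-By is set by IIS itself, so it can also simply be removed in the IIS configuration; as the other answers say, none of this stops determined fingerprinting.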
Do browser plug-ins, such as the Yahoo toolbar or others, have the ability to set cookies on multiple domains as the user browses the web? Does the browser expose the necessary access to do this to a plug-in? If this varies across browsers, that would be helpful to know as well.
Thanks!
Cookies are stored in files, and real plugins (i.e. ones using NPAPI rather than the browser's add-on/extension engine) can read and write files. Hence, it's possible to do this for any browser that way, although it's not really straightforward.
Firefox exposes cookies even to add-ons, since there are cookie-editor add-ons (that can edit cookies for any site).
Chrome/Chromium allows setting of cookies through "content scripts" that run in the context of a page (any page) - that's only in the beta branch so far, but soon to be in stable. However, the downside is that you might have to visit the site for it to work (you could fake that using iframes).
No idea about Opera.
The only one I have found that works quite well for creating, updating and viewing cookies is Firecookie.