Is it possible to detect Internet Explorer Enhanced Security Configuration in JavaScript?

Is there any way to tell from JavaScript whether the browser has "Enhanced Security Configuration" enabled?
I keep running into problems with certain controls not working from within dynamically loaded content. This only happens with browsers running on Windows Server 2003/2008 systems, even when I add the server to the "trusted" zone.
Maybe somebody has already developed a method for accomplishing this task?
Thanks in advance

Instead of testing for IE ESC directly, we can test for its effects.
I found that with ESC enabled the onclick events of dynamically added content would not fire.
So I am testing those events directly.
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<script type="text/javascript">
// Assume ESC is enabled until the test click proves otherwise.
var IEESCEnabled = true;
var testButton = $("<button style=\"display: none;\" onclick=\"IEESCEnabled = false; alert('No problems here.');\">Test IE ESC</button>");
// With ESC enabled, the inline onclick of dynamically created content never fires,
// so the flag stays true.
testButton.click();
if (IEESCEnabled) {
    alert("We have a problem.");
}
</script>
In my application a test like this forwards the user to a page explaining their issue. It is accompanied by a noscript element to check that they have JavaScript running at all.

I don't think it's possible, and if it still is, then that's a bug that might sooner or later be fixed.
One of the main points of this "extra security" was for the client to have it without servers being able to detect it, leaving them no way to know when to try to circumvent it and when not.

Isn't JavaScript disabled when Enhanced Security Configuration is in use?
If so, and you only want to display a message to the user, simply display the message in normal HTML and hide it with JavaScript, so only users without JavaScript will see it (a minimal sketch follows below). If you need to handle it on the server side (e.g. outputting a different version of your website), include JavaScript that redirects users to your JavaScript-enabled version; users without JavaScript will remain on the non-JS page.
If only scriptable ActiveX controls are disabled, the same method applies: insert an ActiveX control and try to "script" it; if that fails, you can redirect, show a message, etc.
Of course, this doesn't detect Enhanced Security Configuration per se, only the symptoms that occur when it is enabled, so it can't distinguish between users running Enhanced Security Configuration and users who simply have JS/ActiveX disabled or use a browser that doesn't support scripting in the first place.
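A minimal sketch of the hide-with-JavaScript approach (the element id and the message wording are illustrative, not from the original answer):
<div id="scripting-warning">
  Scripting appears to be disabled, possibly by IE Enhanced Security Configuration.
</div>
<script type="text/javascript">
  // If this runs at all, scripting works, so hide the warning.
  document.getElementById("scripting-warning").style.display = "none";
</script>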

I think you can look for SV1 in the user agent string.
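A quick check for that token might look like this (note: SV1 indicates the Windows XP SP2 era security update rather than Enhanced Security Configuration specifically, so treat it as a weak heuristic):
<script type="text/javascript">
  // "SV1" = "Security Version 1" token in the IE user-agent string.
  var maybeLockedDownIE = /\bSV1\b/.test(navigator.userAgent);
</script>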

Related

Can't call XSP functions in XPINC

I'm trying to call XSP._isDirty() in XPINC, but it does not work; in the browser everything works fine. Is there a trick to using it?
Is there a way to see client-side errors when executing XPages in the Notes client?
Two questions here.
Q1. XSP._isDirty()
XSP._isDirty() is an internal call. From the XPages Portable Command Guide (page 156):
XSP._isDirty() : Used internally by the Dirty Save feature— see the <xp:view> properties for enableModifiedFlag. This is a private function.
Code for this call is in the file xspClientDojo.js (look for the uncompressed file on Domino/Notes).
As it is an internal call, you use it at your own risk; there is no guarantee it will work as expected in later versions.
The enableModifiedFlag is an XPage attribute that lets you mark the page as dirty and prevent the user from accidentally leaving it. There are more details about this on the Infocenter. If you still want to call the private function, see the sketch below.
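A minimal client-side sketch, assuming the internal signature stays stable (which, being a private API, is not guaranteed):
<script type="text/javascript">
  // XSP._isDirty() is private XPages client-side API; it may change without notice.
  if (XSP._isDirty()) {
      alert("This page has unsaved changes.");
  }
</script>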
Q2. Client-side debugging.
You can review client-side errors using the developer tools of most modern browsers, or something like the Firebug plugin. The XPages Extension Library comes with a Firebug Lite component you can use as well.
For SSJS and XSP engine issues, you can review these in the Notes client by reading the XPages logs in the IBM_TECHNICAL_SUPPORT folder inside the Notes data directory.
For a "live" method, modify the shortcut that launches Notes as follows:
Target: C:\Lotus\Notes\notes.exe -RPARAMS -console -debug -separateSysLogFiles -consoleLog
Start In: C:\Lotus\Notes\framework\
Change the paths to match your client's install.

How to check if my website is being accessed by a crawler?

How can I check whether a certain page is being accessed by a crawler, or by a script that fires continuous requests?
I need to make sure that the site is only being accessed from a web browser.
Thanks.
This question is a great place to start:
Detecting 'stealth' web-crawlers
Original post:
This would take a bit of engineering to solve.
I can think of three things to look for right off the bat:
One, the user agent. If the spider is Google or Bing or anything else legitimate, it will identify itself.
Two, if the spider is malicious, it will most likely emulate the headers of a normal browser. Fingerprint it: if it claims to be IE, use JavaScript to check for an ActiveX object (see the sketch after this list).
Three, take note of what it's accessing and how regularly. If the content takes the average human X seconds to view, you can use that as a starting point when judging whether it's humanly possible to consume the data that fast. This is tricky; you'll most likely have to rely on cookies, since an IP can be shared by multiple users.
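A minimal sketch of the ActiveX fingerprint check from point two (the logging endpoint is hypothetical):
<script type="text/javascript">
  // Real IE (6-10) exposes window.ActiveXObject; a bot that merely fakes an
  // IE user-agent string usually does not.
  var claimsIE = /MSIE/.test(navigator.userAgent);
  var hasActiveX = typeof window.ActiveXObject !== "undefined";
  if (claimsIE && !hasActiveX) {
      // Suspicious client: report it to the server for rate limiting.
      new Image().src = "/log-suspect?ua=" + encodeURIComponent(navigator.userAgent);
  }
</script>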
You can use the robots.txt file to block access to crawlers, or you can use JavaScript to detect the browser agent and switch based on that. If I understood correctly, the first option is more appropriate, so:
User-agent: *
Disallow: /
Save that as robots.txt at the site root, and no well-behaved automated system should crawl your site (robots.txt is advisory, so malicious bots can ignore it).
I had a similar issue in my web application: I created some bulky data in the database for each user that browsed the site, and crawlers were causing loads of useless data to be created. However, I didn't want to deny access to crawlers because I wanted my site indexed and found; I just wanted to avoid creating useless data and to reduce the crawl time.
I solved the problem in the following ways:
First, I used the HttpBrowserCapabilities.Crawler property from the .NET Framework (available since 2.0), which indicates whether the browser is a search-engine web crawler. You can access it from anywhere in the code:
ASP.NET C# code behind:
bool isCrawler = HttpContext.Current.Request.Browser.Crawler;
ASP.NET HTML:
Is crawler? = <%=HttpContext.Current.Request.Browser.Crawler %>
ASP.NET Javascript:
<script type="text/javascript">
var isCrawler = <%=HttpContext.Current.Request.Browser.Crawler.ToString().ToLower() %>
</script>
The problem with this approach is that it is not 100% reliable against unidentified or masked crawlers, but maybe that is acceptable in your case.
After that, I had to find a way to distinguish between automated robots (crawlers, screen scrapers, etc.) and humans, and I realised that the solution required some kind of interactivity, such as clicking a button. Some crawlers do process JavaScript, and they will happily trigger the onclick event of a button element, but not of a non-interactive element such as a div. The following is the HTML/JavaScript code I used in my web application www.so-much-to-do.com to implement this feature:
<div
class="all rndCorner"
style="cursor:pointer;border-width:3px;border-style:groove;text-align:center;font-size:medium;font-weight:bold"
onclick="$TodoApp.$AddSampleTree()">
Please click here to create your own set of sample tasks to do
</div>
This approach has worked impeccably until now, although crawlers could be changed to be even more clever, maybe after reading this article :D

How to open new windows from the XUL Browser?

I'm wondering, is it even possible to handle the XUL browser component's request to open a new window? I tried overriding the window.open function, but it looks like it's never called.
Links that should open in a new window do nothing in my application.
I found this page on the subject, but the provided solution shows no different behavior.
Any hint on this?
(By the way, I'm developing a stand-alone application, not a Firefox extension.)
I'm assuming you are in a XULRunner application, and that you are trying to load a chrome URL from a non-chrome source in a browser (e.g. HTTP or local file). While enabling UniversalXPConnect and UniversalBrowserWrite can be helpful, they are also a security risk (since any arbitrary script on the web could use them), so they tend to be disabled in browsers (for example, running that line in Firebug will give you an exception):
>>> netscape.security.PrivilegeManager.enablePrivilege("UniversalXPConnect UniversalBrowserWrite");
Error: A script from "http://stackoverflow.com" was denied UniversalXPConnect UniversalBrowserWrite privileges.
How about trying codebase security principals and seeing if that makes a difference? (http://www.mozilla.org/projects/security/components/signed-scripts.html#codebase) For me in Firebug it does allow me to get the additional permissions (after I OK it via a big, nasty-looking dialog), but it still doesn't let me open a chrome URL with window.open. The next step is probably to change your conf file to use contentaccessible so that the relevant parts of your content are accessible (see https://developer.mozilla.org/en/Chrome_Registration#contentaccessible).
To avoid the nasty message when elevating permissions, you could try setting permissions for the right files automatically as described at http://forums.mozillazine.org/viewtopic.php?f=38&t=1769555.
Also, make sure you check the browser type (https://developer.mozilla.org/en/XUL/Attribute/browser.type). If the browser type is not chrome, it might be worth making it chrome and seeing if that makes a difference.
If any of my assumptions are wrong get back to me and I will try something else.
Does normal JS not work?
window.open(url,windowname,flags);
There are two ways that I know of.
The first is to set the browser.chromeURL preference to a chrome URL that contains a <browser type="content-primary">. The page that the content window tried to open will load into the given browser.
The second is to set the property window.browserDOMWindow to an object that you define implementing the nsIBrowserDOMWindow interface. This lets you divert the open call into a tab, if you are using a tabbed interface. Note: the tabbed-browsing preferences must be set to allow windows to be diverted into tabs; otherwise XULRunner will fall back on browser.chromeURL. A rough sketch follows below.
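A rough sketch of the second approach, assuming a <tabbrowser id="content"/> in your chrome window (the element id and the exact interface details vary by Gecko version, so check the nsIBrowserDOMWindow IDL that ships with your XULRunner):
window.browserDOMWindow = {
  // Gecko calls this when content requests a new window.
  openURI: function (aURI, aOpener, aWhere, aContext) {
    var tabbrowser = document.getElementById("content"); // assumed <tabbrowser> id
    var tab = tabbrowser.addTab(aURI ? aURI.spec : "about:blank");
    tabbrowser.selectedTab = tab;
    return tabbrowser.getBrowserForTab(tab).contentWindow;
  },
  // Simplified; a real implementation would check whether aWindow
  // belongs to one of this browser's tabs.
  isTabContentWindow: function (aWindow) {
    return false;
  }
};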

Secure isolated iFrame? Alternative?

I am running into a problem. I want to host an external page securely; meaning, no JavaScript in the iframe, or only safe code, such as changing the text or the colors of its own page. And I want to keep CSS alive.
The pages should look the same as the source, but with no malicious code running behind them: no ActiveX, no Flash, no plug-ins. I want them to look correct without all the security compromises.
I have tried jQuery load(), but it only works for internal pages, not external ones, and the CSS in that div overrides my site's CSS, which is not what I wanted.
I am looking for an isolated frame like an iframe, but without the security problems. Is this possible?
HTML5 now has a sandbox attribute for iframes.
This allows you to block code running inside the iframe.
You can learn more at:
https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe
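A minimal sketch (the URL is a placeholder): an empty sandbox attribute applies the most restrictive settings, so scripts, plugins, and form submission are blocked while HTML and CSS still render.
<!-- Add tokens such as allow-forms or allow-scripts only for what you trust. -->
<iframe src="http://www.example.com/" sandbox></iframe>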
You can create a server-side proxy, like a PHP script that reads the remote page and strips whatever you don't like. Not a really simple thing to do, but I'm afraid there is no really easy way around it.
I mean, for instance, you create proxy.php:
<?php
// Incomplete sketch: validate $_GET['remote'] against a whitelist before fetching!
$remote = file_get_contents($_GET['remote']);
// ... filter whatever you don't like in $remote, then print it
And then link to a site using
<iframe src="proxy.php?remote=http://www.example.com"></iframe>
This is not a complete example, just a way of showing my idea.

Browser plugin which can register its own protocol

I need to implement a browser plugin that can register its own protocol (like someprotocol://someurl) and handle calls to this protocol (e.g. a user clicking a someprotocol link calls a function inside my plugin). As far as I understand, Skype does something similar, except that I need to handle links within the page context and not in a separate app. Any advice on how this can be done? Can it be done without installing my own plugin, with the help of Flash/Java?
Things are going to be slightly more complicated than you think.
You're going to have to create an entire application, not just a browser plugin (that plugin can be part of your application). The reason I consider it to be a complete application is that you're going to need to modify registry settings on the client machine to register your custom URL handler.
Here's an MSDN article describing exactly what you have to do to register the custom URL handler on a Windows client:
Registering an Application to a URL Protocol
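Following that article, the registration boils down to a registry layout like this sketch (the protocol name and executable path are illustrative):
Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\someprotocol]
@="URL:someprotocol Protocol"
"URL Protocol"=""

[HKEY_CLASSES_ROOT\someprotocol\shell\open\command]
@="\"C:\\Program Files\\MyApp\\handler.exe\" \"%1\""
Your application (or its installer) writes these keys; the browser then launches the command with the clicked URL substituted for %1.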
