Do web browsers store information in forms while processing is happening?

I'm creating an application and I want to know whether the browser will keep the information in the form fields if an error occurs, so that I can just send the browser back, or whether I should store everything in the session and then restore the fields from it. I am developing my application with JSP and Servlets.
Thanks in advance
Dean

Sometimes they do, but often they don't; you shouldn't rely on it. It is safer to re-render the form yourself and repopulate the fields from the submitted request parameters (or from the session).
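If you also want a client-side safety net, independent of what the server does, here is a minimal sketch that mirrors the field values into sessionStorage before submit and restores them on reload. The form id myForm is a placeholder, and the script is assumed to sit after the form in the page:

<script>
var form = document.getElementById('myForm'); // placeholder id
// Save every named field just before the form is submitted.
form.addEventListener('submit', function () {
  Array.prototype.forEach.call(form.elements, function (el) {
    if (el.name) sessionStorage.setItem('form.' + el.name, el.value);
  });
});
// Restore any saved values when the page is shown again after an error.
window.addEventListener('load', function () {
  Array.prototype.forEach.call(form.elements, function (el) {
    var saved = el.name && sessionStorage.getItem('form.' + el.name);
    if (saved) el.value = saved;
  });
});
</script>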

Node js Cross-domain session

Here I will describe the requirements for my project.
Basically, I want to build a chat application that I can embed in different websites, for example sites built with WordPress, Magento, Drupal, custom frameworks, etc. What I need is to embed JavaScript that handles the socket chat (using socket.io) on each of those websites, so what I finally have is a set of client-side JavaScript code and a server running Node.js (with socket.io).
The problem I face is managing the session for registered users after login. Since my code is embedded on different websites and the Node server resides on another server, on each page refresh I find it difficult to validate the user session. Could you please help me with the best way to manage sessions for this application?
If my requirement is hard to follow, I can explain in detail with examples.
Thanking you
If I understand your problem, you just need to handle user sessions, more specifically on the client side?
Based on the information you give, I will assume the server returns a unique string representing the session to the client. This can take the form of a cookie or a plain string/token.
With cookies you shouldn't have many problems, since the browser deals with them, although you may need to configure them correctly on the server (modern browsers require SameSite=None; Secure for cookies used across sites).
With a token/string that must accompany every request requiring authentication, you should store it in the browser's sessionStorage or localStorage, depending on your needs, attach it to every request back to the server, and validate it there.
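For example, here is a minimal sketch of the token approach with socket.io (v3 or later, where the client can pass an auth payload in the handshake). The server URL and the verifyToken helper are hypothetical:

// Client side, embedded on each website:
var token = localStorage.getItem('chatToken'); // saved earlier at login
var socket = io('https://chat.example.com', { auth: { token: token } });

// Server side (Node.js):
io.use(function (socket, next) {
  var token = socket.handshake.auth.token;
  verifyToken(token, function (err, user) {    // hypothetical helper
    if (err) return next(new Error('unauthorized'));
    socket.user = user;                        // attach the user to the socket
    next();
  });
});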

Is it possible to prevent a web browser from saving website credentials?

I have a website protected by basic auth, so when someone hits it, they get a standard username/password box. This website is accessed often from shared computers.
Is there any way to prevent the various "Remember my Credentials" functionality on browsers? I would like to prevent any browser from saving this username/password in any client-side repository.
I realize this is purely a function of the browser, but is there any commonly-accepted HTTP header or any other method of asking a website not to do this?
The answer is no. I'm sorry, but even if you could do this, how would you stop an independent add-in from scraping websites for username and password boxes and storing that information?
Even if such a directive existed, a browser could circumvent it by simply ignoring it, rendering any such header pointless.
As stated above, you cannot override a browser feature: the browser could save the credentials before it even sends the request, so nothing on the server can prevent them from being stored.

How to detect browser close at server side in asp.net?

I want to know, on the server side, when the browser is closed, in ASP.NET 2.0. How can I detect this in the code-behind?
Short answer: you can't do that directly, since HTTP is stateless. Perhaps you can use some AJAX heartbeat polling, session timeout detection, and other tricks.
Take a look at this question for more explanation and ideas. It is Java based, but the ideas are language agnostic.
client side script:
<body onbeforeunload="window.open('http://www.website.com/browserclosed.aspx','mywindow','width=1,height=1');">
server side script (browserclosed.aspx code-behind, C#):
// Page_Load
int userId = Convert.ToInt32(Session["userId"]); // id stored in the session at login
ReportBrowserClosed(userId);
// Do what you want in the ReportBrowserClosed() method
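On modern browsers, a more reliable variant of this trick (not available in the ASP.NET 2.0 era) is navigator.sendBeacon, which queues a small POST that is delivered even while the page is being torn down:

window.addEventListener('pagehide', function () {
  // fires on close, refresh and navigation; the session cookie
  // travels with the beacon, so the server knows which user it is
  navigator.sendBeacon('/browserclosed.aspx');
});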
The first thing that comes to mind is that you hook the unload event and just asynchronously post back that the browser navigated away from your site (closed the window). However, the way that HTTP is being used to build stateless web sites makes this infeasible. You just don't have a reliable way of tracking user connectivity.
Just consider: how would you handle multiple sessions? If I have the same site open in several tabs or windows and close all but one, how do you tell that I'm still connected? And for the fun of it, say that my browser crashes somewhere in between.
The thing is, I could design something that would sort of solve your problem. However, it's never going to be reliable because HTTP doesn't have a built-in control mechanism for connectivity.
I have to answer this question with a follow-up question: why do you need to know when the browser window closes?
If you need to do some resource clean-up, there are two server-side events provided by ASP.NET that you could use more reliably: Session_End and Application_End.
The obvious question is: why do you need this? If you want to store a logout or closing time, it's better to catch the session timeout. If you want to redirect to some other page, it's better to catch the unload event in JavaScript.

Can local storage be maliciously edited client-side?

Is a user able to edit localStorage (and sessionStorage) items? Specifically, would a malicious user be able to edit them the way cookies can be edited?
I am researching session handling for a web application I am writing, and I had the idea of using localStorage for some items. Yes, I have looked into session variables, and I am probably going to use them, but I was wondering about this and could not find an answer anywhere. My project is built with jQuery and PHP. The interface is completely driven by jQuery, and I am already using localStorage for some other info; that is why I thought of it.
Thanks!
Yes, he can. You should always assume that anything done on the client side can be altered, including, of course, the JavaScript itself.
If you want to make sure that something is not altered, you can put some kind of cryptographic signature on the data and validate it on the server side.
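For example, a minimal sketch of such a signature in Node.js using the built-in crypto module. SECRET is a server-side key that must never be shipped to the client:

var crypto = require('crypto');
var SECRET = process.env.SIGNING_SECRET; // server-side only

// Sign a value before handing it to the client to keep in localStorage.
function sign(value) {
  var mac = crypto.createHmac('sha256', SECRET).update(value).digest('hex');
  return value + '.' + mac;
}

// Verify a value the client sent back; returns null if it was tampered with.
function verify(signed) {
  var i = signed.lastIndexOf('.');
  if (i < 0) return null;
  var value = signed.slice(0, i);
  var mac = crypto.createHmac('sha256', SECRET).update(value).digest('hex');
  var a = Buffer.from(mac);
  var b = Buffer.from(signed.slice(i + 1));
  // constant-time comparison so the MAC can't be guessed via timing
  if (a.length !== b.length || !crypto.timingSafeEqual(a, b)) return null;
  return value;
}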

Programmatic Bot Detection

I need to write some code to analyze whether or not a given user on our site is a bot. If it's a bot, we'll take some specific action. Looking at the user agent only works for friendly bots, since a bot can claim any user agent it wants; I'm after the behaviors of unfriendly bots. Various ideas I've had so far are:
If you don't have a browser ID
If you don't have a session ID
Unable to write a cookie
Obviously, there are some cases where a legitimate user will look like a bot, but that's OK. Are there other programmatic ways to detect a bot, or at least something that looks like one?
User agents can be faked. Captchas have been cracked. Valid cookies can be sent back to your server with page requests. Legitimate programs, such as Adobe Acrobat Pro can go in and download your web site in one session. Users can disable JavaScript. Since there is no standard measure of "normal" user behaviour, it cannot be differentiated from a bot.
In other words: it can't be done short of pulling the user into some form of interactive chat and hope they pass the Turing Test, then again, they could be a really good bot too.
Clarify why you want to exclude bots, and how tolerant you are of mis-classification.
That is, do you have to exclude every single bot at the expense of treating real users like bots? Or is it okay if bots crawl your site as long as they don't have a performance impact?
The only way to exclude all bots is to shut down your web site. A malicious user can distribute their bot to enough machines that you would not be able to distinguish their traffic from real users. Tricks like JavaScript and CSS will not stop a determined attacker.
If a "happy medium" is satisfactory, one trick that might be helpful is to hide links with CSS so that they are not visible to users in a browser, but are still in the HTML. Any agent that follows one of these "poison" links is a bot.
A simple test is JavaScript:
<script type="text/javascript">
// write a hidden tracking image; clients that don't execute JS never request it
document.write('<img src="/not-a-bot.' + 'php" style="display: none;">');
</script>
The not-a-bot.php can add something into the session to flag that the user is not a bot, then return a single pixel gif.
The URL is broken up to disguise it from the bot.
Here's an idea:
Most bots don't download CSS, JavaScript or images; they just parse the HTML.
If you keep track in a user's session of whether they download any of the above, e.g. by routing the asset requests through a script that logs the attempts, you can quickly identify users that only download the raw HTML (very few normal users will do this). A sketch of such logging follows.
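For instance, in a Node.js/Express app (just one possible stack, shown for illustration) you could mark the session whenever any static asset is requested:

var express = require('express');
var session = require('express-session');
var app = express();

app.use(session({ secret: 'change-me', resave: false, saveUninitialized: true }));

// Any request under /assets marks this session as having fetched resources.
app.use('/assets', function (req, res, next) {
  req.session.fetchedAssets = true;
  next();
});
app.use('/assets', express.static('public'));

// Sessions that render pages but never touch /assets are likely bots.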
You say that it is okay if some users appear to be bots, therefore:
Most bots don't run JavaScript. Use JavaScript to make an Ajax-like call to the server that identifies this IP address as a non-bot. Store that for a set period of time to identify future connections from this IP as good clients and to prevent further wasteful JavaScript calls. A sketch follows.
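A minimal sketch of that ping; the /not-a-bot endpoint name is made up, and on the server you would record the IP with an expiry:

// runs only in real browsers, since most bots don't execute JS
if (!sessionStorage.getItem('pinged')) {   // avoid repeating the call
  fetch('/not-a-bot', { method: 'POST' }); // server whitelists this IP for a while
  sessionStorage.setItem('pinged', '1');
}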
For each session on the server you can determine if the user was at any point clicking or typing too fast. After a given number of repeats, set the "isRobot" flag to true and conserve resources within that session. Normally you don't tell the user that he's been robot-detected, since he'd just start a new session in that case.
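One way to implement that "too fast" check on the server, sketched in Node.js; the window and threshold are arbitrary illustration values:

// Keep a per-session record of recent request timestamps.
function looksLikeRobot(session) {
  var now = Date.now();
  session.hits = (session.hits || []).filter(function (t) {
    return now - t < 10000;          // keep only the last 10 seconds
  });
  session.hits.push(now);
  // more than 20 requests in 10 seconds is faster than a human clicks
  return session.hits.length > 20;
}

// in a request handler:
// if (looksLikeRobot(req.session)) req.session.isRobot = true;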
Well, this is really for one particular page of the site. We don't want a bot submitting the form because it messes up tracking. Honestly, the friendly bots (Google, Yahoo, etc.) aren't a problem, as they don't typically fill out the form to begin with. If we suspected someone of being a bot, we might show them a captcha image or something like that... if they passed, they're not a bot and the form submits...
I've heard of things like putting the form in Flash, or making the submit require JavaScript, but I'd prefer not to get in the way of real users unless I already suspect they are a bot...
I think your idea with checking the session id will already be quite useful.
Another idea: You could check whether embedded resources are downloaded as well.
A bot which does not load images (e.g. to save time and bandwidth) should be distinguishable from a browser which typically will load images embedded into a page.
Such a check, however, might not be suited to real time, because you would have to analyze some sort of server log, which might be time-consuming.
Hey, thanks for all the responses. I think that a combination of a few suggestions will work well: mainly the hidden form element that times how fast the form was filled out (roughly like the sketch below), and possibly the "poison link" idea. I think that will cover most bases. When you're talking about bots, you're not going to find them all, so there's no point thinking that you will... silly bots.
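A rough sketch of that timing idea; the field name and threshold are made up, and the rendered timestamp should really be signed (as in the localStorage answer above) so a bot can't forge it:

<!-- rendered into the form by the server -->
<input type="hidden" name="renderedAt" value="1700000000000">

// on submit, server side (Node.js sketch, assuming a parsed request body):
function filledTooFast(req) {
  var elapsed = Date.now() - Number(req.body.renderedAt);
  return elapsed < 3000; // a human rarely completes a form in under 3 seconds
}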
