Secure cookies aren't set when navigating in an iframe

I have an application where a container app loads the site <inner-site>.com in an iframe. All the ajax calls in the iframe fail with a bad request. This turned out to be because the ajax calls were missing the _csrf cookie; for some reason, the iframe doesn't have the identity cookies at all.
The problem is described here: https://gist.github.com/iansltx/18caf551baaa60b79206, and the proposed solution was to remove the same-site requirement for the identity cookies. That didn't sound like a good plan.
Then I found a more elegant way to share cookies between sites here: https://gist.github.com/iansltx/18caf551baaa60b79206
I liked that solution because the client and server work together, and I could limit who to trust from <inner-site>.
It also worked on localhost, which was awesome, since we had been trying to fix this issue for weeks.
However, after deploying the change to production it no longer worked. I'm not sure what is causing this, or whether the restrictions are simply more relaxed on localhost so that it worked fine there.
I'm looking for help figuring out either why the solution above didn't work on the real server, or whether there is another solution to this problem.
I'm using Yii2, but any solution should be applicable. I mention Yii2 in case it enforces some extra restriction, or has a utility that makes this issue easier to fix.
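For reference, if one did go down the SameSite route, the relevant Yii2 knobs look roughly like this (a sketch only: the sameSite option requires Yii 2.0.21+ and PHP 7.3+, and browsers require the Secure flag whenever SameSite is None, a requirement they relax on localhost, which may be one source of a localhost-vs-production difference):

    <?php
    // config/web.php -- sketch only. Relaxing SameSite lets the identity and
    // CSRF cookies be sent inside a cross-site iframe, but browsers then
    // require the Secure flag (i.e. HTTPS). Localhost is exempt from that
    // requirement, which can make a setup work locally and fail in production.
    return [
        'components' => [
            'request' => [
                'csrfCookie' => [
                    'httpOnly' => true,
                    'secure'   => true,   // mandatory once sameSite is None
                    'sameSite' => 'None', // send cookie in third-party contexts
                ],
            ],
            'session' => [
                'cookieParams' => [
                    'httponly' => true,
                    'secure'   => true,
                    'samesite' => 'None',
                ],
            ],
        ],
    ];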

Related

Sweet-Captcha Security Vulnerability

We've recently started using the user-friendly website captcha sweet-captcha on our websites. After a recent round of security audits we found a potential vulnerability.
This vulnerability allows an attacker to circumvent the captcha indefinitely after one successful solve.
I've tried contacting the captcha creators about this but have had no response. I am posting here primarily in the hope that my implementation is incorrect and that a more secure alternative is immediately available.
The captcha is included in our pages, as per the documentation:
<?php echo $sweetcaptcha->get_html() ?>
Most of our websites use DHTML to avoid full page reloads, which is what made us aware of the security issue, as follows:
Someone solves the captcha and submits an ajax request to our php web service which includes the necessary captcha keys.
The web service validates the captcha as per the documentation (see $sweetcaptcha->check below), performs some work and then returns its response to the front end.
As the front end is not refreshed, the captcha remains solved, and it has become apparent that the same captcha keys can be used again to make as many requests as desired after a single successful solve.
To solve this security problem we believe the following step should be occurring:
Invalidate the captcha response in the php web service to prevent an individual from using the same captcha tokens more than once, e.g. if a call to:
$sweetcaptcha->check(array('sckey' => $_POST['sckey'], 'scvalue' => $_POST['scvalue']))
returns true, it should return false on all subsequent evaluations with the same parameters. That is not happening. Even though we could implement our own backend solution to prevent duplicate validations, this would be best solved in the captcha's existing code if that is indeed the case.
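For example, a minimal sketch of the backend workaround we have in mind, assuming a shared server-side cache such as APCu is available (any store with an atomic add, like Redis or a database, works the same way):

    <?php
    // Reject any captcha token pair that has been seen before. apcu_add() is
    // atomic: it returns false if the key already exists, so a replayed token
    // is caught even under concurrent requests.
    function captchaAlreadyUsed($sckey, $scvalue)
    {
        $token = 'captcha_used_' . sha1($sckey . '|' . $scvalue);
        return !apcu_add($token, true, 3600); // remember tokens for an hour
    }

    if (captchaAlreadyUsed($_POST['sckey'], $_POST['scvalue'])
        || !$sweetcaptcha->check(array('sckey' => $_POST['sckey'], 'scvalue' => $_POST['scvalue']))) {
        http_response_code(403);
        exit('captcha validation failed');
    }
    // ... perform the actual work and return the response ...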
If anyone could advise on the above issue that would be greatly appreciated.

Is it possible to prevent a web browser from saving website credentials?

I have a website protected by basic auth, so when someone hits it, they get a standard username/password box. This website is accessed often from shared computers.
Is there any way to prevent the various "Remember my Credentials" functionality on browsers? I would like to prevent any browser from saving this username/password in any client-side repository.
I realize this is purely a function of the browser, but is there any commonly-accepted HTTP header or any other method of asking a website not to do this?
The answer is no. I'm really sorry, but even if you could do this, how would you stop an independent add-on from scraping web sites for username and password boxes and then storing that information?
Even if this were possible in code, it could be easily circumvented by a browser simply ignoring any directives you give, rendering such headers pointless.
As stated before, it is not possible to circumvent a browser feature: the browser could easily save the credentials before it even sent the POST, so nothing on the server could prevent them from being stored.

Multi-Domain Login

I'm working on a little node.js project, and despite googling a lot I've gotten a bit confused, but maybe some of you can point me towards the road again.
Several websites are generated by DocPad (excellent piece of software), and hosted on different domains.
All these websites shall now get a "login module" (also written in Node.js, using Passport). Visually, it will look similar to the excellent login slider from Web-Kreation (here's a demo). My plan was to use nginx and route all the /login requests to the login app, which is working fine.
The problem is rather with the multiple domains and the client-side implementation of it all. All logins use the same database.
Can I somehow use both together and create the session cookies from the login module (which could use the same domain all the time)?
I'm answering my own question for reference, in case someone else comes across the same problem.
In the end, I solved my problem with a somewhat different setup. Instead of a module running under each site's own DNS, I use a central login application for all sites. The sites themselves do not need access to any personal information, so that's not a problem.
DocPad is still being used to generate the different websites statically (it works excellently; I know I say this very often, but if there's a brilliant piece of software out there, there's no reason not to mention it once in a while), and all static content is delivered to the user via a CDN.
The login system is a node.js application using Redis as its only database; it runs on login.example.com and is integrated via a simple iframe on all pages rendered by DocPad.
After a successful login in the login app, you can create an encrypted string with info about the current user. You pass this string back in a GET/POST parameter with a redirect to the target domain. The encryption key is known only to the login app and your websites, so you can trust this encrypted data. It is necessary to make sure the encrypted string is different every time, even for the same user; for example, you can include the time of login or a random nonce. After decrypting the data you can set an authorization cookie for that particular domain.
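A minimal sketch of that pattern, in PHP for illustration (the login app above is Node.js, whose crypto module offers the same AES-GCM primitive; the payload fields, the 60-second window, and the 32-byte key are assumptions, and random_bytes() needs PHP 7+):

    <?php
    // Sketch of the token handoff. $sharedKey is a 32-byte secret known only
    // to the login app and the member sites.
    function createLoginToken(array $user, $sharedKey)
    {
        $payload = json_encode(array(
            'uid'   => $user['id'],
            'iat'   => time(),                   // time of login...
            'nonce' => bin2hex(random_bytes(8)), // ...plus a nonce, so the
        ));                                      // string differs every time
        $iv = random_bytes(12);
        $cipher = openssl_encrypt($payload, 'aes-256-gcm', $sharedKey,
            OPENSSL_RAW_DATA, $iv, $tag);
        return rtrim(strtr(base64_encode($iv . $tag . $cipher), '+/', '-_'), '=');
    }

    function readLoginToken($token, $sharedKey)
    {
        $raw = base64_decode(strtr($token, '-_', '+/'));
        $payload = openssl_decrypt(substr($raw, 28), 'aes-256-gcm', $sharedKey,
            OPENSSL_RAW_DATA, substr($raw, 0, 12), substr($raw, 12, 16));
        if ($payload === false) {
            return null; // tampered with, or encrypted under a different key
        }
        $data = json_decode($payload, true);
        // Reject stale tokens so an intercepted redirect cannot be replayed.
        return (time() - $data['iat'] < 60) ? $data : null;
    }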

Fully cached dynamic website

I would like to cache my website with memcache as much as possible. Modifications are rare (somewhat as in a forum), and I am perfectly OK with re-caching once a change is made. My only concern is login information (similar to the bar stackoverflow has at the top). This is how I am doing it right now:
$('div#user_bar').load('/login-info/');
(jQuery on a fully cached page loads up userinfo)
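The serving side is essentially the following sketch (simplified; the key scheme and the renderPage() helper are hypothetical stand-ins for my real code):

    <?php
    // Every page is looked up in Memcached first and only rendered on a miss;
    // an edit simply deletes the affected keys so they re-cache on next view.
    $mc = new Memcached();
    $mc->addServer('127.0.0.1', 11211);

    $key = 'page_' . sha1($_SERVER['REQUEST_URI']);
    $html = $mc->get($key);
    if ($html === false) {
        $html = renderPage($_SERVER['REQUEST_URI']); // hypothetical renderer
        $mc->set($key, $html, 0); // no TTL; invalidated explicitly on edits
    }
    echo $html; // user bar gets filled in client-side by the .load() above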
However, I think I can do without dynamic pages completely. My idea is this:
On login: create cookie `logged_in`:true
On each page: if JS finds cookie is set: show links to logout, settings, etc
if not: show link to login page
On logoff: delete cookie
No actual userinfo is stored in cookies, not even username.
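Concretely, the login handler would do something like this (a sketch; setcookie's options-array form needs PHP 7.3+):

    <?php
    // On login: the real session cookie stays HttpOnly as usual; this extra
    // flag cookie is deliberately readable by JavaScript and carries no
    // authority whatsoever.
    setcookie('logged_in', 'true', array(
        'path'     => '/',
        'secure'   => true,   // assuming the site runs on HTTPS
        'httponly' => false,  // JS must read it to toggle the UI
        'samesite' => 'Lax',
    ));
    // On logoff: expire it again.
    // setcookie('logged_in', '', array('expires' => time() - 3600, 'path' => '/'));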
How secure, reasonable, sane is this? Any ideas? Am I missing something? Thank you.
Disclaimer: This is more of an exercise than a production environment. But I am trying to keep security and performance in mind nonetheless.
About your main target: caching dynamic pages is reasonable. If you work on the ASP.NET platform, you might want to look at the output cache feature, which does exactly this, even including dynamic substitutions. 4 Guys from Rolla have a nice starter article with links to all the details.
Regarding the non-user-specific pages: I doubt this can work for anything but the most simple pages. Web applications usually allow different operations for different users, even if it's only changing your own password. You will probably have to send user-specific content to the client at some point, and that's where the dynamic substitutions of the ASP.NET output cache come into play.

Why is the http auth UI so poor in browsers?

Why isn't there a logout button? Why no list of "websites you're logged into"? Is it because of some issue with the HTTP specs?
Life would be much easier for web developers if they could actually rely on HTTP auth ...
As far as HTTP is concerned, it is stateless; that is one of the main reasons the Internet is scalable.
No technical reason. I suppose if anything, the auth UI is neglected because fewer and fewer web sites are still using HTTP Basic Authentication, trending more towards various cookie-related login schemes... precisely because the auth UI is so poor!
One could probably hack together a Firefox add-on to do it quite easily, which would be the quickest fix. (And the same goes for the other question with the poor file upload UI too.) I'd use it!
Have you filed a bug report with the major browsers (at least the ones with public bug trackers: Firefox, Chrome/Chromium, etc.)?
A list of open HTTP auth sessions would be useful.
Because it's not the browser that "knows" it's logged in; it's the server that authenticates the browser on every request. Every server can have a different authentication mechanism: different names and contents for the authentication cookies, basic authentication, etc.
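To make that concrete for Basic auth, the server re-verifies credentials on every request, along these lines (a sketch; checkCredentials() is a hypothetical lookup, and the PHP_AUTH_* variables depend on the server setup):

    <?php
    // Basic auth credentials arrive with every request and are re-verified
    // every time; the server keeps no session at all.
    if (!isset($_SERVER['PHP_AUTH_USER'])
        || !checkCredentials($_SERVER['PHP_AUTH_USER'], $_SERVER['PHP_AUTH_PW'])) {
        header('WWW-Authenticate: Basic realm="example"');
        http_response_code(401);
        exit('Unauthorized');
    }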
