Actually, this is not a code problem but a security problem.
I've got an email from a stranger describing what he has done to my website.
I don't know how he did it, but there is proof showing he can see the API and the parameters sent when clicking a button, using the Burp Suite app.
Is there any way to protect a website from this kind of inspection?
Please protect against cross-site scripting (XSS). You can follow this link: https://brightsec.com/blog/cross-site-scirpting-prevention/
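The core advice in articles like the one linked is output encoding: escape user input before echoing it into a page. A minimal sketch in Python (the function name and page template are illustrative, not from the linked article):

```python
import html

def render_comment(user_input):
    # Encode HTML metacharacters so user input is rendered as text,
    # never interpreted as markup or script.
    safe = html.escape(user_input, quote=True)
    return f'<p class="comment">{safe}</p>'

print(render_comment('<script>alert(1)</script>'))
# <p class="comment">&lt;script&gt;alert(1)&lt;/script&gt;</p>
```

Note that nothing can stop a proxy like Burp Suite from seeing the requests your own browser sends; the defense is to validate and authorize everything server-side, never to hide the API.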
We've recently started using the user-friendly website captcha sweet-captcha on our websites. After a recent round of security audits we found a potential vulnerability.
This vulnerability allows an attacker to circumvent the captcha indefinitely after one successful solve.
I've tried contacting the captcha creators regarding this but have not had any response. I am posting here primarily in the hope that my implementation is incorrect and as such a more secure alternative is immediately available.
The captcha is included in our pages, as per the documentation:
<?php echo $sweetcaptcha->get_html() ?>
Most of our websites use DHTML to avoid page reloads, which is what made us aware of the security issue, as follows:
Someone solves the captcha and submits an AJAX request to our PHP web service which includes the necessary captcha keys.
The web service validates the captcha as per the documentation (see $sweetcaptcha->check below), performs some work and then returns its response to the front end.
As the front end is not refreshed, and thus the captcha remains solved, it has become apparent that the same captcha keys can be reused to make as many requests as desired following an initial successful solve.
To solve this security problem we believe the following step should be occurring:
Invalidate the captcha response in the PHP web service to prevent an individual from using the same captcha tokens more than once, e.g. if a call to:
$sweetcaptcha->check(array('sckey' => $_POST['sckey'], 'scvalue' => $_POST['scvalue']))
returns true, it should return false on all subsequent evaluations using the same parameters. That is not happening. Even though we could implement our own backend solution to prevent duplicate validations, this would be best solved in the captcha's existing code if it should be the case.
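A backend workaround along those lines could look like the following sketch (written in Python rather than our PHP service; `check_fn` stands in for `$sweetcaptcha->check`, and the `used_tokens` store, here a plain dict, would be a session store or database in production):

```python
import hashlib
import time

def check_captcha_once(check_fn, sckey, scvalue, used_tokens):
    # Derive a stable id for this (sckey, scvalue) pair.
    token_id = hashlib.sha256(f"{sckey}|{scvalue}".encode()).hexdigest()
    if token_id in used_tokens:
        return False  # token already spent: reject the replay
    if not check_fn(sckey, scvalue):
        return False  # captcha itself failed
    used_tokens[token_id] = time.time()  # mark as spent on first success
    return True
```

The first call with valid keys succeeds; every later call with the same keys is rejected, which is exactly the invalidation behaviour described above.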
If anyone could advise on the above issue that would be greatly appreciated.
I have a website protected by basic auth, so when someone hits it, they get a standard username/password box. This website is accessed often from shared computers.
Is there any way to prevent the various "Remember my Credentials" functionality on browsers? I would like to prevent any browser from saving this username/password in any client-side repository.
I realize this is purely a function of the browser, but is there any commonly-accepted HTTP header or any other method of asking a website not to do this?
The answer is no. I'm really sorry, but even if you could do this, how would you stop an independent add-on from scraping websites for username and password boxes and then storing that information?
Even if this were possible in code, it could easily be circumvented by simply ignoring any directives you give, rendering such headers pointless.
As stated before, it is not possible to circumvent a browser feature: the browser could easily save the credentials before it sends the POST, so nothing on the server can prevent them from being cached.
I am trying to test a webpage using Nessus. I have tested all the stuff about the server. But now I want to proceed by logging in to the webpage and testing all possible pages behind the login form. I couldn't achieve it. I gave values for all the form fields (text, password and hidden), including the ticket generated by the Central Authentication Service, but nothing happens. Either there isn't any security issue behind the login page ( :P ), or I couldn't log in to the page (100% possibility :D ). For extra info:
These are login fields. ;)
username=
&password=
&lt=_c0C1F5872-F217-B20F-6D86-AA3AA1C1262E_kC7BEB4F7-5216-53EB-2F9A-7FDDFE01D145
&_eventId=submit
&submit=Login
Is there anyone who has used Nessus and knows how to solve this problem? And does anyone know how to import cookies into Nessus?
Thanks in advance. ;)
I had similar problems; I can't speak for you, but it sounds like you have about as much website knowledge as I do (which ain't much!) - no offense intended. In my case, I'm not sure I'm understanding the most basic structural elements of the website, such as what URL to point the scan at, and then concatenating that correctly with the login pages in the policy. I'm far better at network and infrastructure penetration testing :D
I did a search in a search engine for "Nessus HTTP cookie import", and found that Tenable discussed this on their podcast, episode 14:
http://blog.tenablesecurity.com/2009/11/tenable-network-security-podcast---episode-14.html
If you look at the "Stories" note on the above web page, there's a hint to use the "Export Cookies" Firefox add-on. The add-on has some guidance, but essentially:
Install the add-on to your browser (I'm using the OWASP Mantra browser; I urge you to look at it)
Restart your browser
Log in to the subject website and authenticate
From the Tools menu, go for "Export Cookies"
Save to file, and point your Nessus scan policy at that file
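For reference, the "Export Cookies" add-on writes the classic Netscape cookie file format: one tab-separated line per cookie, with fields for domain, include-subdomains flag, path, secure-only flag, expiry (Unix time), name and value. The values below are made up:

```
# Netscape HTTP Cookie File
.example.com	TRUE	/	TRUE	1735689600	JSESSIONID	0123456789ABCDEF
```

If your scan still fails, it is worth opening the exported file and checking that the session cookie for the subject site actually made it in, and that it has not already expired.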
NOTE: I'm still trying this now, but thought I'd post the possibility anyway in case I forget - I will update this thread with a confirm or deny shortly.
Best of luck!
UPDATE: Well, it didn't work for me on first attempt. I'm confirming I don't have any conflicting or superseding settings in the policy, but if that doesn't work it's on to Tenable Support, I fear...
According to the documentation, besides importing cookies, the other way to do it (currently at 7.0) is:
Create new scan
Web Application Tests
Credentials:
which are filled out like these (taken from documentation):
Username: Login user’s name.
Password: Password of the user specified.
Login page: The absolute path to the login page of the application, e.g., /login.html
Login submission page: The action parameter for the form method. For example, the login form for: <form method="POST" name="auth_form" action="/login.php"> would be: /login.php
Login parameters: Specify the authentication parameters (e.g., login=%USER%&password=%PASS%). If the keywords %USER% and %PASS% are used, they will be substituted with values supplied on the Login configurations drop-down menu. This field can be used to provide more than two parameters if required (e.g., a group name or some other piece of information is required for the authentication process).
Check authentication on page: The absolute path of a protected web page that requires authentication, to better assist Nessus in determining authentication status, e.g., /admin.html.
Regex to verify successful authentication: A regex pattern to look for on the login page. Simply receiving a 200 response code is not always sufficient to determine session state. Nessus can attempt to match a given string such as Authentication successful
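Filled out for a CAS-style form like the one in the earlier question, the settings might look like this (all values illustrative):

```
Username:                                   alice
Password:                                   s3cret
Login page:                                 /cas/login
Login submission page:                      /cas/login
Login parameters:                           username=%USER%&password=%PASS%&_eventId=submit&submit=Login
Check authentication on page:               /admin.html
Regex to verify successful authentication:  Log Out
```

One caveat: CAS login forms also include a per-request login ticket (the lt hidden field), which changes on every page load, so a static parameter string likely cannot supply it - which is one reason the cookie-import route exists.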
However, looking at the reports, in my case it couldn't authenticate for some reason.
I am using Jboss application server.
I have implemented the whole website over SSL (HTTPS). The site works fine in Internet Explorer, but it displays the information below in Mozilla/Konqueror, and only on one particular page.
Security Warning :
Although this page is encrypted, the information you have entered is to be sent over an unencrypted connection and could easily be read by a third party.
Are you sure you want to continue ?
continue cancel
Is this a JBoss feature?
If the whole site is running on HTTPS, then why does this information display on only one particular page?
What should I do to get rid of this problem?
Please do help me! My mail id is [redacted]
Thanks and regards,
AKhtar Bhat
Check for images or other resources that are being requested from non-SSL locations. This is usually the problem.
Posting your E-mail address here is probably not the best idea, it is likely to be harvested by spammers (although Gmail does have a fairly effective spam filter). It is also extremely unlikely anyone is going to answer you in E-mail since that would defeat the purpose of Stack Overflow, which is probably why your question was voted down.
To resolve your problem, you must find the page that is presenting the dialog or message you are seeing, then view the source of the webpage. Ensure the action attribute of any <form> tags are either relative to the current server (i.e. - action="/some/path/...") or are absolute but are directed to https (i.e. - action="https://some_server/some/path/...").
If you are using any AJAX calls, you must also ensure they are using https.
It seems unlikely the message is a result of resources being sent to you insecurely. It seems much more likely the message is a result of a <form> tag with an incorrect action attribute or an insecure AJAX call.
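One way to hunt for the culprit is to scan the page source mechanically for plain http:// URLs in form actions and resource references. A quick sketch using Python's standard html.parser (the tag/attribute list is illustrative, not exhaustive):

```python
from html.parser import HTMLParser

# Tags whose URL-bearing attribute can trigger a mixed-content warning.
INSECURE_ATTRS = {"form": "action", "img": "src", "script": "src",
                  "link": "href", "iframe": "src"}

class MixedContentScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        attr = INSECURE_ATTRS.get(tag)
        if attr is None:
            return
        for name, value in attrs:
            if name == attr and value and value.startswith("http://"):
                self.findings.append((tag, value))

scanner = MixedContentScanner()
scanner.feed('<form method="POST" action="http://example.com/login.php">'
             '<img src="https://example.com/logo.png"></form>')
print(scanner.findings)  # [('form', 'http://example.com/login.php')]
```

Here the form action is flagged while the https image is not - which matches the diagnosis above that an insecure form action is the most likely cause of the warning.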
Check the FORM tag for ACTION property.
The action should be an HTTPS URL (which it should be if relative but may not be if absolute.)
I think you'll find some information on this page that should help you:
http://blog.httpwatch.com/2009/04/23/fixing-the-ie-8-warning-do-you-want-to-view-only-the-webpage-content-that-was-delivered-securely/
Is it possible to secure only the Login.aspx page (and the postback) and not the whole site in IIS?
We are looking to do this specifically with a SharePoint site running Forms Based Authentication against our Active Directory.
Links to this will be helpful.
This is what we have done so far:
1. Setup SharePoint to use FBA against AD.
2. Moved Login Page to Secure/Login.aspx
3. Set the appropriate Login url in web.config as https://..../Secure/Login.aspx
This is not working and help is needed here.
However even if this works, how do we get the user back to http from https?
There's not a whole lot of point. If the only thing that's encrypted is the Login.aspx page, that would mean that someone could sniff all the traffic that was not sent through the login page.
Which might prevent people from getting user:pass, but all your other data is exposed.
Besides all the data which is exposed, and the user's operation which can be changed en route, the user's session id (or other authentication data) is sent in the clear. This means that an attacker can steal your cookie (...) and impersonate you to the system, even without getting your password. (If I remember correctly SPSv.3 also supports builtin password changing module...)
So I would say that this is not a great idea, unless you don't care much about that system anyway... But then, why bother with authentication at all? Just make it anonymous.
I agree with AviD and Dan Williams that securing only the login page isn't a great idea because it exposes other data after leaving the password page. However, you can require SSL for only the login.aspx page via the IIS Manager. If you navigate to the login.aspx page in IIS Manager (I believe it's under /_layouts), you can right-click on the individual file and select Properties. From there, go to the File Security tab and click on the Edit... button under Secure communications. There, you can check the Require secure channel (SSL) box, and SSL will be required for that page only.
I'm not positive about getting the user back to http from there, but I believe its default behavior is to send you to the requested page if the login is successful. If not, I would think you could customize where the login page sends you on a successful login.