Closed 9 years ago.
I am trying to reproduce the HTTP response splitting attack on my machine.
For that I wrote this PHP code:
<?php
// httpsplitting.php: redirect to whatever page the client supplies,
// with no sanitization of the "page" parameter
header("Location: " . $_GET['page']);
?>
And then I enter the following URL:
http://localhost/webgoat/httpsplitting.php?page=index%0aContent-Length:%200%0a%0aHTTP/1.1%20200%20OK%0aContent-Type:%20text/html%0aContent-Length:%2017%0a%0a<html>Hacked</html>
But even when I intercept the exchange using WebScarab, I see that the injected headers are not included in the web server's response.
Additionally, I saw in Wireshark that the LF sequence (%0a) is not decoded into an actual line feed; it is transmitted as a literal string.
So I came to the conclusion that modern web browsers are not susceptible to this attack. Am I correct?
The attack does not operate only at the browser level; it also targets caching and proxy servers. So even if browsers add protection, it might not be enough.
See the paper (search for proxy, for example):
http://www.securityfocus.com/archive/1/425593
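As for why the injected headers never reach the wire in the experiment above: one likely explanation (an assumption, since it depends on the PHP version in use) is that PHP itself blocks the injection. Since PHP 4.4.2 / 5.1.2, header() refuses any value that contains a newline. A minimal sketch of what that looks like:
<?php
// Minimal sketch, assuming PHP >= 5.1.2: header() rejects values that
// contain a raw CR or LF, so the injected headers are never sent.
$page = "index\nContent-Length: 0\n\nHTTP/1.1 200 OK"; // roughly what the %0a payload decodes to
header("Location: " . $page);
// PHP emits a warning along the lines of
// "Header may not contain more than a single header, new line detected"
// and drops the Location header instead of splitting the response.
?>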
Closed 6 years ago.
I did an interesting experiment today.
I opened Amazon.com in my browser, logged in, brought up Fiddler, and tried to add a brand new credit card.
I typed in my credit card number, expiration, and card holder name. When I submitted the request I didn't see any POST to Amazon in Fiddler. The UI said there was a problem submitting my information, and that I should try again.
I repeated it and got the identical response.
I shut down Fiddler and hit submit. My information was accepted instantly.
I'd like to know how Amazon accomplished this feat. Is it common knowledge? Is there an HTTP header involving certificates that makes it easy?
I think it is certificate pinning or something like it. The server certificate (or its public key) is pinned in the application, so the app accepts only that certificate and rejects any other, even if it is otherwise valid. Fiddler decrypts HTTPS by acting as a man-in-the-middle with its own certificate, so a pinned client refuses to send the request while Fiddler is running.
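For illustration, here is a minimal sketch of what public-key pinning looks like on the client side, using PHP's cURL binding. This is only an assumption about the general technique; Amazon's actual mechanism is unknown, and the URL and hash below are placeholders.
<?php
// Minimal pinning sketch, assuming PHP >= 7.0.7 with libcurl >= 7.39.
// The URL and the sha256 hash are placeholders, not real values.
$ch = curl_init("https://payments.example.com/add-card");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Accept only a server whose public key matches this pinned hash.
// An intercepting proxy such as Fiddler presents a different key,
// so the transfer fails before any card data is sent.
curl_setopt($ch, CURLOPT_PINNEDPUBLICKEY, "sha256//EXPECTED_PUBLIC_KEY_HASH=");
if (curl_exec($ch) === false) {
    echo "Pinning check failed: " . curl_error($ch) . "\n";
}
curl_close($ch);
?>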
Closed 8 years ago.
Is there any way you can be sure that a web application you want to register on is encrypting your passwords and not storing them in plain text (where an admin can read them or an attacker could easily get them)?
Unless you are able to read the source code of the script processing your password, there is no way to know what's happening to it behind the scenes.
But there are some things about security you can find out on the client side, just to get a feel for the level of security the web application adheres to.
Check if the website is using a valid SSL certificate. This already tells you something about how feasible it is for someone to sniff your traffic on the network.
Have a look in the HTML source, and see how the form submitting your data is built. Is it using a POST request and not a GET?
Register with a fake account and check your cookies. Do you see anything that looks like your session information saved in plain text or base64? And if something looks like base64 (for example, the string ends with = or ==), decode it and see what it really contains (a sketch of this follows).
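A quick sketch of that last base64 check in PHP; the cookie value here is a made-up example of what you might copy out of your browser:
<?php
// Decode a suspicious-looking cookie value and see what it contains.
// The value below is fabricated for illustration.
$value = "dXNlcj1hbGljZTtwYXNzPXNlY3JldA==";
$decoded = base64_decode($value, true); // strict mode: returns false if not valid base64
if ($decoded !== false) {
    echo "Decoded: " . $decoded . "\n"; // here: "user=alice;pass=secret"
} else {
    echo "Not valid base64.\n";
}
?>
If anything like your password or e-mail address shows up in the decoded value, that is a bad sign about how the application handles your data.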
Closed 8 years ago.
Let's take a WordPress site as an example. As you know, WordPress's admin panel can be accessed only through site/wp-admin. My question is: if I use an .htaccess file to deny access from all IPs except mine, will my site be safe from hackers?
Note: to keep things simple, let's assume that the site serves only static content, with that content retrieved from the database. IMHO, if there's no input for the hackers, then there's no way the site can be hacked with XSS, SQL injection, etc. Please correct me if I'm wrong; even a wise man like me can be wrong. :)
.htaccess is useful for security, but it does not guarantee invulnerability from hackers, even with all of your assumptions. http://www.cmswire.com/cms/web-cms/how-they-hack-your-website-overview-of-common-techniques-002339.php
Using .htaccess alone to restrict access to a URL by IP address is a "good enough" solution in that most of the time it will work just fine (a sketch of such a rule follows). There are certain pitfalls, however: if your IP address changes, you have to go into the console and update your .htaccess file, which isn't a huge deal but is inconvenient.
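A minimal sketch of such a rule, placed in an .htaccess file inside wp-admin/ (Apache 2.4 syntax; assumes your host allows overrides, and the IP address is a placeholder for your own):
# Allow only one IP address to reach the admin area; everyone else gets 403.
# 203.0.113.10 is a placeholder - replace it with your own address.
Require ip 203.0.113.10
# On Apache 2.2 the equivalent would be:
#   Order deny,allow
#   Deny from all
#   Allow from 203.0.113.10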
IMO you'd be better off focusing on using SSL (SSL certificates are cheaper than ever!) to encrypt all traffic to your site (or at least your admin panel), and on choosing a strong admin password, than on trying to restrict access to your admin panel by location.
Closed 12 years ago.
Why does every web browser interpret a web page differently? Is there a standard for interpreting HTML, CSS, and JavaScript, or does it depend on the company that develops the web browser?
The browser is what interprets the HTML. The browser engineers do have a standard to go by, but in the end they choose how their browser will interpret and display the HTML, CSS, etc., and how it will function.
There is a standard specification set by the World Wide Web Consortium. Most browsers follow it pretty well. Firefox, Opera, et al. follow it pretty much to the letter, but Internet Explorer does not in some cases.
Actually, yes: it depends on each browser's interpretation of CSS and, in turn, of many HTML tags.
This article provides some more insight on the matter.
Closed 8 years ago.
Well, the title pretty much states the question: can search engines crawl and index pages that are served over SSL/HTTPS?
SSL secures the communication; it does not provide any access-control mechanism for the content.
As long as there is no password/authentication restricting access to the pages, there's no reason a search engine would be unable to index them.
Yes.
They may choose not to spider over HTTPS, or they may choose to rate lower those sites that are available only over HTTPS, or they may choose to do any number of things. But they can certainly spider the Web over HTTPS just as easily as your browser can view a single Web page over HTTPS.
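To illustrate that last point, here is a minimal sketch of fetching a page over HTTPS in PHP; a crawler does essentially the same thing a browser does. The URL is a placeholder, and the openssl extension is assumed to be enabled.
<?php
// Minimal sketch: retrieving a page over HTTPS is the same call as over
// plain HTTP for any client, crawlers included. Assumes allow_url_fopen
// and the openssl extension are enabled; the URL is a placeholder.
$html = file_get_contents("https://example.com/");
if ($html !== false) {
    echo strlen($html) . " bytes retrieved over HTTPS\n";
}
?>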