Closed. This question is opinion-based. It is not currently accepting answers.
I have been looking into the pros and cons of browsers, specifically their security properties. Please share if you know which browser is more secure than the others and why.
Each browser has had different security features, vulnerabilities, and maybe even NSA backdoors at some point in time, but... http://www.infosecurity-magazine.com/view/33645/there-is-no-single-most-secure-browser/
You might want to look here for additional insight: http://slashdot.org/story/13/06/23/0317243/ask-slashdot-most-secure-browser-in-an-age-of-surveillance
There is no web browser that is more secure than the others by a big margin. The reason is that most of today's browsers follow roughly the same standards: for example, whether JavaScript is enabled or disabled by default, how tracking and sharing are handled, what happens with your IP address, and so on. Because this question does not have a single proper answer, here is an example of how to make a web browser as secure as possible, if needed:
In this example I will use Mozilla Firefox.
The first step is disabling JavaScript in the browser (either manually or by installing a plugin that does it for you, for example NoScript).
Disabling JavaScript will prevent many web pages from displaying or working properly, because almost every website today uses JavaScript. But we are talking about security here.
The second step should be disabling tracking and sharing, again either manually or with a plugin.
The third should be using a proxy server to hide your IP address.
There are too many different things that could be done to list them all. Note again that JavaScript, which is required for displaying page content properly and interacting with almost all modern websites, can be a big security hole: for example, session hijacking, forcing the browser to report your geolocation, and many other things...
My recommendation is to first decide exactly what you would like to protect, and then search on Google for how to do that.
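As a concrete illustration of the three steps above, here is a minimal sketch in Python using Selenium to launch Firefox with JavaScript disabled, tracking protection enabled, and traffic routed through a proxy. The preference names are real about:config keys, but the proxy address and the page visited are placeholders, and geckodriver must be installed for this to run.

    from selenium import webdriver
    from selenium.webdriver.firefox.options import Options

    options = Options()

    # Step 1: disable JavaScript entirely (this will break most modern sites).
    options.set_preference("javascript.enabled", False)

    # Step 2: enable tracking protection and send the Do Not Track header.
    options.set_preference("privacy.trackingprotection.enabled", True)
    options.set_preference("privacy.donottrackheader.enabled", True)

    # Step 3: route traffic through a proxy to hide your IP address.
    # 127.0.0.1:8080 is a placeholder; use your own proxy here.
    options.set_preference("network.proxy.type", 1)  # 1 = manual proxy configuration
    options.set_preference("network.proxy.http", "127.0.0.1")
    options.set_preference("network.proxy.http_port", 8080)
    options.set_preference("network.proxy.ssl", "127.0.0.1")
    options.set_preference("network.proxy.ssl_port", 8080)

    driver = webdriver.Firefox(options=options)
    driver.get("https://example.com")
    print(driver.title)
    driver.quit()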
Closed. This question is opinion-based. It is not currently accepting answers.
I'm now introducing CSP and other security-related HTTP headers to the website that I work on. They all feel like a walk in the park to introduce, so no problem there...
I quickly investigated which sites were using which HTTP headers. Surprisingly, extremely few sites were using CSP. I checked out some banks' login pages, some big websites, and some technology-driven websites (like Stack Overflow). Facebook was the only site I could find that used CSP. Gmail only runs it in report-only mode.
To me it feels like low-hanging fruit to just add these headers and get all the security benefits. I'm confused. Have I missed something? Why is hardly anyone using it? Is there some kind of drawback that I don't know about?
People from Google and Mozilla were editors of the W3C spec. So why aren't even they using it?
I don't want to provide a link-only answer, but I don't know a better way to answer than "Why is CSP Failing? Trends and Challenges in CSP Adoption". Maybe citing Section 3.4, Conclusions, will add some substance:
"While some sites use CSP as an additional layer of protection against content injection, CSP is not yet widely adopted. Furthermore, the rules observed in the wild do not leverage the full benefits of CSP. The majority of CSP-enabled websites were installations of phpMyAdmin, which ships with a weak default policy. Other recent security headers have gained far more traction than CSP, presumably due to their relative ease of deployment. That only one site in the Alexa Top 10K switched from report-only mode to enforcement during our measurement suggests that CSP rules cannot be easily derived from collected reports. It could potentially help adoption if policies could be generated in an automated, or semi-automated, fashion."
Unofficially (or maybe officially, since Neil Matatal is with the CSP working group), from Managing Content Security Policy:
CSP Level 1: 2 years of study, could not remove inline scripts. FAIL.
CSP Level 2: two weeks, managed risk with script nonces. SUCCESS.
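To make the nonce approach concrete, here is a minimal sketch assuming a Flask application (the app, the route, and the policy values are illustrative, not taken from any of the sites mentioned above). It sends an enforced Content-Security-Policy header with a fresh per-request script nonce, plus a report-only variant for trialling a stricter policy, with violation reports sent to a hypothetical /csp-report endpoint.

    import secrets
    from flask import Flask, g, render_template_string

    app = Flask(__name__)

    PAGE = """<!doctype html>
    <title>CSP demo</title>
    <script nonce="{{ nonce }}">console.log("inline script allowed by its nonce");</script>
    """

    @app.before_request
    def make_nonce():
        # A fresh random nonce per request; a static value would defeat the purpose.
        g.csp_nonce = secrets.token_urlsafe(16)

    @app.after_request
    def add_csp(response):
        nonce = g.csp_nonce
        # Enforced policy: same-origin resources plus nonce-tagged inline scripts.
        response.headers["Content-Security-Policy"] = (
            f"default-src 'self'; script-src 'self' 'nonce-{nonce}'"
        )
        # Trial a stricter policy without breaking anything; reports go to a
        # hypothetical /csp-report endpoint you would have to implement.
        response.headers["Content-Security-Policy-Report-Only"] = (
            "default-src 'self'; report-uri /csp-report"
        )
        return response

    @app.route("/")
    def index():
        return render_template_string(PAGE, nonce=g.csp_nonce)

    if __name__ == "__main__":
        app.run()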
Closed. This question needs to be more focused. It is not currently accepting answers.
I am working on a scraping project for a company. I used Python libraries such as Selenium, mechanize, and BeautifulSoup4, and I have been successful in putting the data into a MySQL database and generating the reports they wanted.
But I am curious: why is there no standardization of website structure? Every site has a different name/id for its username/password fields. I looked at the Facebook and Google login pages; even they name their username/password fields differently. Other elements are likewise named arbitrarily and placed anywhere.
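To illustrate the point, here is a rough sketch of the per-site guesswork a scraper ends up doing because login fields are not standardized. The HTML below is made up for illustration; the only fairly dependable landmark on an arbitrary login page is the input with type="password", and everything else is heuristic.

    from bs4 import BeautifulSoup

    html = """
    <form action="/session" method="post">
      <input type="text" name="usr_login_box" placeholder="Email">
      <input type="password" name="pwd_field_42">
      <button type="submit">Sign in</button>
    </form>
    """

    soup = BeautifulSoup(html, "html.parser")

    # The password field is the one reliable anchor on an unknown login page.
    password_input = soup.find("input", attrs={"type": "password"})

    # Heuristic: the username is usually the nearest text/email input in the same form.
    form = password_input.find_parent("form")
    candidates = form.find_all("input", attrs={"type": ["text", "email"]})
    username_input = candidates[0] if candidates else None

    print("username field:", username_input.get("name") if username_input else None)
    print("password field:", password_input.get("name"))
    print("submit to:", form.get("action"))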
One obvious reason I can see is that bots would eat up a lot of bandwidth, and websites are basically targeted at human users. A second reason may be that websites want to show advertisements. There may be other reasons too.
Would it not be better if websites didn't have to provide APIs and there were instead a single framework for bot/scraper access? For example, every website could have a scraper-friendly version that is structured and named according to a standard specification which is universally agreed upon, and also a page that provides a help-like feature for the scraper. To access this version of the website, the bot/scraper would have to register itself.
This would open up an entirely different kind of internet to programmers. For example, someone could write a scraper that monitors vulnerability and exploit listing websites and automatically closes the security holes on the user's system. (For this, those websites would have to provide a version with data that can be applied directly, such as patches and where they should be applied.)
And all of this could easily be done by an average programmer. On the dark side, one could write malware that updates itself with new attack strategies.
I know it is possible to use Facebook or Google login on other websites using Open Authentication (OAuth), but that is only a small part of scraping.
My question boils down to: why is there no such effort out in the community? And if there is one, kindly refer me to it.
I searched Stack Overflow but could not find a similar question, and I am not sure this kind of question is appropriate for Stack Overflow. If not, please refer me to the correct Stack Exchange site.
I will edit the question if something in it does not meet the community criteria, but it is a genuine question.
EDIT: I got the answer thanks to @b.j.g. There is such an effort by the W3C, called the Semantic Web. (Anyway, I am sure Google will hijack the whole internet one day and make it possible, within my lifetime.)
EDIT: I think what you are looking for is The Semantic Web
You are assuming people want their data to be scraped. In actuality, the data people scrape is usually proprietary to the publisher, and when it is scraped... they lose exclusivity on the data.
I had trouble scraping yoga schedules in the past, and I concluded that the developers were consciously making it difficult to scrape so that third parties couldn't easily use their data.
Closed. This question is opinion-based. It is not currently accepting answers.
Can I use XAMPP to serve a real site to the WWW, not just my localhost? I have seen warnings in articles on the internet not to do that, saying that XAMPP is for testing only and that hackers will screw it up... If so, what SPECIFIC security holes and problems does it have that make it insecure for serving for real?
I don't want loose answers. I want a SPECIFIC answer about the security holes or weaknesses of XAMPP. Thanks!
This is not an answer, more a long comment.
Here be Dragons:
The issue with the 'out of the box' XAMPP setup is that all the passwords are defaults and everyone knows them. You need to change every password. If you are not using certain services, disable them rather than bothering to change their passwords; I disabled DAV for this reason. I use XAMPP as an internet-facing server and have never had any bother. I am on version 1.7.7 and have been using it for years.
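As one concrete example of the default-credentials problem, here is a minimal sketch (the host is a placeholder, and the PyMySQL package is assumed to be installed) that checks whether the bundled MySQL/MariaDB server still accepts the root account with an empty password:

    import pymysql

    HOST = "127.0.0.1"  # placeholder: the machine running XAMPP

    try:
        conn = pymysql.connect(host=HOST, user="root", password="", connect_timeout=3)
    except pymysql.err.OperationalError:
        print("Good: root with an empty password was rejected (or the server is unreachable).")
    else:
        print("WARNING: MySQL still accepts root with an empty password - change it now.")
        conn.close()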
If you are using it on a 'home' network with a dynamic IP and you want a domain name, then you need to use a service that supports your IP address changing regularly. I use 'dyn', but there are others.
As @Braders has commented, security is a major issue! Get it wrong and your server will be used for all sorts of nasties, against both your PC and others on the internet. I would suggest an external scan for security issues before you leave it permanently connected to the internet.
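As a rough first pass before a proper external scan, something like the following sketch, run from a machine outside your network, shows which of the ports a default XAMPP install typically uses are reachable (the target hostname is a placeholder, and the port list is only a starting point):

    import socket

    TARGET = "your-public-ip-or-hostname"  # placeholder: replace with your own
    PORTS = {
        21: "FTP (FileZilla)",
        80: "HTTP (Apache)",
        443: "HTTPS (Apache)",
        3306: "MySQL/MariaDB",
        8080: "Tomcat (if enabled)",
    }

    for port, service in PORTS.items():
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(2)
        try:
            reachable = sock.connect_ex((TARGET, port)) == 0
        finally:
            sock.close()
        status = "REACHABLE - make sure this is intentional" if reachable else "closed/filtered"
        print(f"{port:5d}  {service:<22} {status}")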
I set my server up a few years ago, and I am starting to remember all the checks I made at the time. It took many days before I could 'trust' it, with lots of time spent looking at the access logs, etc.
If you are not sure then do not do it. It is very easy to get the setup wrong.
The major issue with running any server is that you are making 'holes' in the firewall, and it can be 'interesting' to see what comes in.
As was also mentioned by Braders, you really do need to check with your internet provider to ensure it is allowed by your agreement.
Closed. This question is opinion-based. It is not currently accepting answers.
I was asking myself whether it would be okay to enforce HTTPS over normal HTTP by 301-redirecting every HTTP request to its HTTPS counterpart.
Are there backwards compatibility issues (IE, I'm looking at you) or any other drawbacks? How do search engines handle this? Do you already have experience with this? What are your opinions?
Google themselves also enforce HTTPS, but not always: if you're sending an IE6/7 User-Agent header, you won't be redirected. Should I allow my users to use HTTP if they want to?
The Electronic Frontier Foundation understandably advises users to always use HTTPS. Can I make that decision for my users and enforce HTTPS? Is there a reason to not use HTTP at all?
Enforcing HTTPS is becoming more and more common. We started using HTTPS where I worked previously (a site with millions of hits per week) because Firefox was assuming HTTPS when no protocol was specified, meaning users could type "websitename.com" and not find our website at all, as we only served over HTTP.
I'm sure there were SEO implications behind redirects, but I seem to recall that 301 was the suggested route. Definitely not 302.
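For what it's worth, the redirect itself is a one-liner in most stacks. Here is a minimal sketch assuming a Flask application served directly (behind a proxy or load balancer you would check the X-Forwarded-Proto header instead of request.scheme): every plain-HTTP request gets a 301 to the same URL over HTTPS, preserving path and query string.

    from flask import Flask, redirect, request

    app = Flask(__name__)

    @app.before_request
    def force_https():
        if request.scheme == "http":
            # Same URL with the scheme swapped; 301 so search engines carry the ranking over.
            return redirect(request.url.replace("http://", "https://", 1), code=301)

    @app.route("/")
    def index():
        return "Served over HTTPS"

    if __name__ == "__main__":
        app.run()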
Internet Explorer didn't give us any issues in versions 8, 9, or 10; for prior versions I couldn't say. Hopefully someone else here will know more regarding IE7. There is a link here which explains a few issues, though: http://msdn.microsoft.com/en-us/library/bb250503(v=vs.85).aspx
Honestly, in this day and age, the number of people using browsers that cannot handle HTTPS is likely to be small; it's such a standard now. My opinion is that we need to try to move things forward rather than build everything around the tiny minority who refuse to get with the times. Technology is about progress, after all.
Closed. This question is opinion-based. It is not currently accepting answers.
I've tested my page in Chrome (Mac), Firefox (all), Safari (all), IE9 (Win 7), and Opera (Mac). I'm also planning to test Chrome for Windows, Chromium for Linux, and Opera for Windows. Are there any other desktop browsers with wide enough usage that I should download and test with them?
I've always found the best policy is to test in IE, Chrome, Firefox, and Safari (latest versions), then install a service like Google Analytics to find out which other browsers are popular with your visitors. Testing across all the different operating systems can be very time-consuming, and you might be surprised at the traffic breakdown your site has.
Use this link to get a good feel for browser usage trends:
http://www.w3schools.com/browsers/browsers_stats.asp
Keep in mind that these statistics are based on that particular site's users, who are suspected to be a bit more 'tech-savvy' than normal users. If you click a specific browser you will see the statistics for its different versions; for example, you can see that around 8% of the site's users use IE8, which is not in your list.
Also, use this site:
http://caniuse.com/
to check for support of specific HTML/CSS features across the different browsers.
Don't forget to provide your website's users with proper fallback content for every feature that's not supported in older browser versions.