I have an independent site and I am using WordPress as the content management system for it. My site has been hacked twice now. Pardon me, I am a newbie, but can anyone guide me on how to protect it from being hacked? I would be really thankful.
Here are some links; maybe they will be helpful for you:
http://www.mastermindblogger.com/2011/14-ways-to-prevent-your-wordpress-blog-from-being-hacked/
http://semlabs.co.uk/journal/how-to-stop-your-wordpress-blog-getting-hacked
http://getinternetmarketingstrategies.com/2010/04/how-to-secure-wordpress-blogs-prevent-the-hacking-of-your-blog/
http://blogcritics.org/scitech/article/10-tips-to-make-wordpress-hack/
http://tek3d.org/how-to-protect-wordpress-blog-from-hacks
There is also a plugin that backs up your WordPress data to your Dropbox account!
But could you specify what you mean by "hacked"? Was the site deleted, or did you get spam comments?
Here are some more links; check them out.
http://wordpress.org/extend/plugins/wp-security-scan/
http://www.1stwebdesigner.com/wordpress/security-plugins-wordpress-bulletproof/
http://designmodo.com/wordpress-security-plugins/
Also keep the Akismet plugin activated; it protects your WordPress site from spam.
Good luck.
This is a new kid on the block, but they are getting some impressive reviews.
cloudsafe365.com - a free WP plugin that helps prevent hacking and content scraping.
Apparently they even clean your dirty dishes.
Ensure correct file and directory permissions.
No 'admin' user
Refresh the auth key and salt values in wp-config.php (see the sketch after this list)
Use complex passwords
If you did not completely remove (or rename) your old site directories, you may be leaving the hacker's backdoor intact.
Completely delete any unused plugin and theme directories.
Check your web access logs for hackers fishing for exploits.
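For the keys-and-salts item above, here is a minimal sketch of the relevant block in wp-config.php; the values shown are placeholders, and WordPress can generate fresh ones for you at https://api.wordpress.org/secret-key/1.1/salt/:

// Security keys and salts in wp-config.php -- replace with freshly generated
// values from https://api.wordpress.org/secret-key/1.1/salt/ (placeholders here).
define('AUTH_KEY',         'put a long unique phrase here');
define('SECURE_AUTH_KEY',  'put a long unique phrase here');
define('LOGGED_IN_KEY',    'put a long unique phrase here');
define('NONCE_KEY',        'put a long unique phrase here');
define('AUTH_SALT',        'put a long unique phrase here');
define('SECURE_AUTH_SALT', 'put a long unique phrase here');
define('LOGGED_IN_SALT',   'put a long unique phrase here');
define('NONCE_SALT',       'put a long unique phrase here');

Changing these values invalidates existing login cookies, which forces any session a hacker may still hold to expire.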
Cheers
Security is mandatory for every website; you can try the following for stronger protection:
Disable directory browsing via the .htaccess file (see the sketch after this list).
Use a plugin like Limit Login Attempts to throttle brute-force logins. You can shut out most brute-force attempts entirely by changing the WordPress login URL.
Always keep WordPress and its plugins up to date.
Don't use poorly coded plugins or themes.
Use a plugin like Sucuri to monitor the whole site for malware.
Don't use pirated themes or plugins.
Tons of security plugins are available in the WordPress plugin repository for most common security problems.
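For the directory-browsing point above, here is a minimal .htaccess sketch, assuming Apache 2.2; the IP address is a placeholder, and restricting wp-login.php by IP only makes sense if you always log in from known addresses:

# Disable directory listings
Options -Indexes

# Optionally restrict the login page to known IP addresses (Apache 2.2 syntax)
<Files wp-login.php>
    Order Deny,Allow
    Deny from all
    Allow from 203.0.113.10
</Files>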
I'm just after a bit of advice, please.
I have a company that has rebranded recently. There are currently two separate domains set up for them, where the content is fundamentally the same apart from where the company name is mentioned.
The newer site has therefore been created with the new branding, with the intention of taking down the old site at some point. However, I've always been reluctant to do this as the old site does very well for particular keywords (probably because of its age).
I've read a few things but just wanted to ask: what is the best way to go about decommissioning the old site? Is it a case of setting up 301 redirects? If the original domain ceases to exist, will these still be read?
Thanks
I think 301 redirects are definitely the best place to start – they are an easy way of letting the Google spiders know that they should travel to your new site instead, which means you will still have the benefits of the old site for keywords, but they will move to the new content you are setting up.
The downside of this, of course, is that if you completely take down the old site you get nothing, and the same goes if you don't keep up its SEO maintenance.
This ends up being a lot of hassle, so what we tend to do is go through an overlap period of a few months so the new site can become better established, and then remove the old one.
While you are doing that, you want to be moving your links over too – so contact webmasters and get them on board with the move so that you can keep all that 'link juice' flowing.
Ultimately, the age of your website does have a bit of an impact on your SEO, but if you are starting from scratch with the new one, you can craft it with SEO in mind and make it better attuned to search right from the outset.
If you fail to implement proper redirects when migrating a website from one domain to another, you will instantly lose any traffic your website currently enjoys.
It is also important to convey your old domain’s search engine rankings to the new web address – or your website will effectively be starting from zero visibility, in search engines at any rate.
Buying old domains and redirecting them to similar content shouldn’t be a problem. Even when redirecting pages, I usually make sure page titles and content are very similar on both pages if I want to keep particular rankings.
301 redirects are an excellent first step. When search spiders crawl your old site, they will see and remember the redirect. Because of this, the relevance of your old site to certain search terms will be applied to your new site.
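For reference, here is a minimal sketch of a site-wide 301 redirect in the old domain's .htaccess, assuming Apache with mod_rewrite enabled; the domain names are placeholders:

RewriteEngine On
# Send every request on the old domain to the same path on the new one
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]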
Great. Now that you've handled traffic coming in from search engines, what do you do about other sources of traffic? Remember that 301 Redirects will only work on non-search-engine traffic for as long as you maintain the old site...
Next you'll want to contact the webmasters of any sites that link to your old site and inform them of the change. This way, when you retire the old site, their links don't go dead, losing you traffic. Keep an eye on the "Referrer" field in your logs to see who is currently linking to you.
Lastly, you'll want to keep the old site doing redirects for a while longer so that folks who have bookmarked your old site will have the redirect cached by their browser. "How long?" you ask... Well, I'd keep an eye on the web logs of the old site. When the non-spider traffic drops off to near zero, you'll know you've done your job right...
A 301 redirect is the best way I can think of.
Tumblr allows users to edit the HTML and CSS of their blogs through a templating system. In fact, it even allows users to include their own external scripts in their pages. Doing this obviously opens a lot of security holes for Tumblr; however, it's obviously doing fine. What's even more interesting is how Tumblr's team managed to do all this on a LAMP stack before.
One thing I found out by pinging different Tumblr sites is that the blogs are dispersed across multiple servers.
Nonetheless, hacking the root of just one of those servers could compromise a ton of data. Furthermore, while some could argue that Tumblr may just be doing manual checks on each of its blogging sites, that still seems pretty risky and impractical for Tumblr's team given the amount of data Tumblr has. Because of this, I think there are still some aspects that manual checking hasn't covered, especially in terms of how Tumblr's team filters user input before it enters their database.
My main question: How does Tumblr (or any other similar site) filter its user input and thereby, prevent hacking and exploits from happening?
What is Tumblr?
Tumblr is a microblogging service that lets its users post multimedia and short text entries on their blogs.
Formatting and styling a blog
Every blog service lets its users edit and share content. At the same time, they also let users style their blog depending on what type of service they are providing.
For instance, a company blog would rarely have a garden image as its background, and a shopkeeper would rarely show a beach image, unless they are actually in such a place or such objects are part of their work.
What Tumblr does
Well, they just keep checking the uploaded files for anything wrong!
As a general blogging platform, it is necessary to let users upload content and style their blogs. At the same time, it is the company's job to keep control of how its service is used.
So Tumblr keeps a close watch on these things. They do not allow uploads of files that could infect the system, and are known to delete accounts if anything fishy is caught.
Tumblr lets users upload files and multimedia that are used to style the blog. They use a separate platform to store all such files, so when you upload something, it does not get executed on their main system. They access it from the server or storage where these files are saved and then serve you the blog with those files included.
What I would do
I would do the same: I would first upload and save the files in a separate place where, even if executed, they cannot harm my main system should they be infected with a virus. Not all users upload viruses, but when one does, you should use an antivirus system to detect and remove it, and block that account at the same time.
I would let the users use my service; it is the user's job to upload content and my job to prevent hacking.
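To make the input-filtering part concrete, here is a minimal PHP sketch of one such layer: whitelisting user-supplied template HTML with the HTML Purifier library. This is purely an illustration under those assumptions; Tumblr's actual pipeline is not public, and the form field name is made up.

<?php
// Sketch: sanitise untrusted template HTML before storing it.
// Assumes HTML Purifier is installed (e.g. via Composer).
require_once 'vendor/autoload.php';

$config = HTMLPurifier_Config::createDefault();
// Allow only a small whitelist of tags and attributes.
$config->set('HTML.Allowed', 'p,b,i,a[href],img[src],blockquote');

$purifier = new HTMLPurifier($config);

$userTemplate = isset($_POST['template']) ? $_POST['template'] : ''; // untrusted input
$clean = $purifier->purify($userTemplate); // scripts, event handlers, etc. are stripped
// $clean can now be stored and later rendered on the blog page.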
All this stuff (HTML/CSS/external scripts) does not run on Tumblr's machines, so to them it does not matter much. You are responsible for the stuff that runs on your own PC. As for JavaScript, it lives in a sandbox.
I am new to PHP and trying to learn whether there is a way to catch websites that install programs onto your computer without your authorization. For example, when you visit some websites, your computer might catch a virus just by going to that web page. Just by looking at its HTML code, is there a way I can see if a web page is trying to install something onto my computer? Any help would be greatly appreciated.
You are fundamentally mistaken about the concept of "infecting a computer" via a website.
Usually an attacker will use an exploit to target certain browsers; this loads a "payload" and from there the computer is owned. The "exploit" could be anything from crafted JavaScript to malicious Flash files. This is a direct manner of infecting a computer; note that it is not effective unless you lack an antivirus or an up-to-date browser and software, or the attacker is using a 0-day exploit.
The more effective way an attacker can infect visitors is by getting them to download something and infecting them directly. Note that a website can't just install something on your computer unless the user downloads it and manually installs it.
It sounds like an antivirus program is the solution, but how do they detect malicious code? One of the techniques they use is scanning for certain "signatures" in a program's code. The AV has a database of those signatures and scans against it.
To answer your question: it may be possible to do this with PHP, but it's like using a fork to dig a cave. You would need to develop a method of detecting malicious code, for example by comparing hex signatures, and you would need a full database of them. And the most fun part is, the attacker could just slightly change his code and your scanner would fail. Obfuscated code will also defeat your scanner.
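Just to illustrate how crude that approach is, here is a toy PHP sketch of signature matching against a fetched page; the URL and the handful of patterns are examples only, and trivial obfuscation defeats all of them:

<?php
// Toy example: flag a few well-known malicious patterns in a page's HTML.
$html = file_get_contents('http://example.com/page-to-check');

$signatures = array(
    '/eval\s*\(\s*base64_decode/i',        // common PHP/JS obfuscation pattern
    '/document\.write\s*\(\s*unescape/i',  // classic injected-script pattern
    '/<iframe[^>]*display\s*:\s*none/i',   // hidden iframe injection
);

foreach ($signatures as $pattern) {
    if (preg_match($pattern, $html)) {
        echo "Suspicious pattern found: $pattern\n";
    }
}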
That's why one should never even think about building a virus scanner in PHP. Use an antivirus: they are smarter and faster, and the people behind them are hackers themselves. Just one technique off the top of my head: they use heuristic analysis.
To run code without your consent (or install malicious software) in the context of the whole system (not just the web application / browser), attackers use known or unknown bugs in browsers. For an example of a JavaScript exploit, see "Help me understand this JavaScript exploit". My antivirus tries not to let me onto that page ;)
To check with PHP whether a given page contains malicious code, you'd need to use a PHP-based antivirus, or one that has PHP bindings / lets you scan files on demand from the command line and works against web-based (HTML/CSS/JS-based) malware.
Not really; antivirus, antispyware and that sort of software does that for you.
I am looking to hide a site so that it doesn't show up in any search engine. I'm just wondering how I would go about this?
Use a robots.txt file: http://www.google.com/support/webmasters/bin/answer.py?answer=156449
Apart from password-protecting your site, you could add these lines to robots.txt:
User-agent: *
Disallow: /
This doesn't hide the site but rather instructs bots not to spider the content.
You can reduce the chances of your site being listed by using a robots.txt. Note that this depends on the "goodwill" of the crawler, though (some spambots will explicitly look at locations that you disallow).
The only safe and reliable way of not having a site listed, sadly, is not putting it on the internet.
Simply not linking to your site will not work. Crawlers get their info from many sources, including browser referrers and domain registrars. So, in order to be "invisible", you would have to not visit your site and not register a domain (only access it via IP address).
And then, if you run your webserver based on IP address, you still have all the spambots probing random addresses. It will take a while, but they will find you.
Password-protecting your site should work, effectively making it inaccessible. Though (and it is beyond my comprehension how that happens), there are, for example, literally thousands of ACM papers listed in Google which you cannot see without an account and logging in. Yet they are there.
Use a robots.txt and disallow all search engines.
They don't all respect robots.txt, so check your server logs regularly and deny access from ranges of suspected robots/crawlers:
http://httpd.apache.org/docs/2.2/howto/access.html
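For example, here is a sketch using the Apache 2.2 syntax from that page, placed in an .htaccess file; the address range shown is a documentation placeholder, not a real crawler:

Order Allow,Deny
Allow from all
# Block a range you have identified as a misbehaving crawler
Deny from 198.51.100.0/24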
You use a robots.txt file. Place the file in the root of the site with this content:
User-agent: *
Disallow: /
Most proper search engines use bots or crawlers to visit websites and index them. You could use the robots.txt file method.
Have a look at nofollow on Wikipedia.
You need to read about the robots.txt file, which you are supposed to place in your site's webroot – http://www.robotstxt.org/robotstxt.html.
I have a WordPress site and the following link is accessible: www.domain.com/wp-admin/ (obviously not the real domain name). Someone told me that this is a security risk. Any truth to this?
In essence, the more information an attacker has about your setup, the worse off you are.
That being said, however, the information gained by knowing your admin login page is pretty trivial - as it's the default login location for all WordPress sites. Therefore, once an attacker figured out your site was a WordPress site, he/she would naturally try that link.
As long as you keep your WordPress files up to date, the only thing you're really vulnerable to (that you would be protected from if that page were inaccessible) is a 0-day on that specific page...
So, really, it doesn't matter much either way. Personally, I would deny access to it as much as is convenient - but, on the other hand, you may like having that link always open so you can log in and administer your site from anywhere. I dare say you'll be fine either way, so long as you have sufficiently strong passwords.
Update: Another thing to consider: the login pages of (well-written, tested) open-source software are rarely the point of failure in authentication attacks. Usually, compromising a system involves disclosure of credentials via another vulnerable page, and then using the login page as it was intended to be used. The WordPress devs have combed over the code in your login page because they know it's going to be the first place anybody looks for an exploit. I would be more concerned about any extensions you're running than about leaving the login page viewable by the public.
That's simply WordPress. Nothing is inherently wrong with it. But if you are concerned about security overall, see http://codex.wordpress.org/Hardening_WordPress and http://www.reaper-x.com/2007/09/01/hardening-wordpress-with-mod-rewrite-and-htaccess/ and http://www.whoishostingthis.com/blog/2010/05/24/hardening-wordpress/ etc., on protecting the admin area with .htaccess, removing some identifiable WP clues, changing the database prefix, SSL access, and so on. Some things are more worthwhile than others, and some are more obscurity than security, but it's all a learning experience.
Well, a lot of sites have an open wp-admin; however, you can put in an .htaccess file and password-protect the directory, provided you are on Apache.
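A minimal sketch of that approach: an .htaccess placed in the wp-admin directory, assuming Apache and an .htpasswd file you have already created with the htpasswd utility (the path is a placeholder):

AuthType Basic
AuthName "Restricted Area"
AuthUserFile /path/to/.htpasswd
Require valid-user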
It's not a big deal... there is a lot you can do to avoid it being there... you could even have your whole WP install in a subdirectory of the server.
Not sure about WordPress, but I know at least two e-commerce packages (Zen Cart and PrestaShop) that recommend renaming the admin directory to some other name (and not printing the URL in orders...).
Perhaps there are some known exploits using this information...