Hi everyone.
I need to protect my website against being downloaded with Windows tools and wget.
The site consists of JS, HTML and PHP files.
I googled about security resource sharing, but it was not helpful for me.
Thank you.
As long as your website needs to be publicly available to everybody, this is not possible. If someone visits your site, the browser needs to access all the files, in other words download them. You might be able to apply a few hacks to make it more difficult, but you cannot prevent it completely.
If you want to restrict it to a defined audience, you can implement a login using, for example, HTTP Basic Auth. How this can be achieved depends on your hosting: it might be doable using an .htaccess file in your web root, or maybe through your host's admin interface.
Your PHP files should be safe, by the way; the above applies to the public parts of your site (HTML/CSS/JavaScript/images/...).
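If you go the HTTP Auth route and your host runs Apache, a minimal .htaccess sketch would look like the following; the realm name and the path to the .htpasswd file are placeholders you would need to adapt, and some hosts disable these directives:

    # Require a username/password for everything in this directory and below
    AuthType Basic
    AuthName "Restricted area"
    # Full server path to the password file (placeholder; keep it outside the web root)
    AuthUserFile /home/youraccount/.htpasswd
    Require valid-user

The .htpasswd file itself is usually created with the htpasswd utility or with a generator your host provides.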
I have a website that is infected with some malware. At first I noticed some strange JavaScript code on the site's pages, so I deleted it and everything was fine for a few days, but now Google lists the website as dangerous. I have checked the site code again for anything strange, but I could not find anything.
I have tried Sucuri SiteCheck and it detects redirections to a malicious site. At first I thought it might be an .htaccess file doing the redirection, but I checked the files on the shared server and there is no .htaccess file.
Any ideas on how to solve this?
Your hosting account has been hacked. Change your password on your hosting service. Go through your site code once more (every file) and look for things that don't belong. Clear your browser cache and then try again. If your account is hacked again, find a new hosting service. Once you're sure that your site is clean and your account has been secured, let Google know about the problems and request removal from their suspect list:
Google support
Check your .htaccess file for the redirection, or check whether the files contain any unwanted malicious JavaScript.
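For what it's worth, infected .htaccess files often contain a conditional redirect along these lines (the target URL here is invented for illustration); any rule resembling this that you did not write yourself is a red flag:

    RewriteEngine On
    # Redirect visitors arriving from search engines to the attacker's page
    RewriteCond %{HTTP_REFERER} (google|bing|yahoo) [NC]
    RewriteRule .* http://malicious-example.com/landing.php [R=302,L]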
Right, I'll try to explain my situation as thoroughly as possible while also keeping it brief...
I'm just starting out as a web designer/developer, so I bought the unlimited hosting package with 123-reg. I set up a couple of websites, my main domain being designedbyross.co.uk. I have learnt how to map other domains to a folder within this directory. At the minute, one of my domains, scene63.com, is mapped to designedbyross.co.uk/blog63, which is working fine for the home page. However, when clicking another link on scene63.com, for example page 2, the URL changes to designedbyross.co.uk/blog63/page2...
I have been advised by someone at 123-reg that I need to write an .htaccess file and use the RewriteBase directive (whatever that is?!). I have looked at a few websites to try and help me understand this, including http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html, but it isn't making much sense at the moment.
Finally, scene63.com is a WordPress site; whether that makes any difference to how the .htaccess file is structured, I'm not sure...
Any help will be REALLY appreciated - Thanks.
I run my personal public website on Webfusion, which is another branded service offering by the same company on the same infrastructure, and my blog contains a bunch of articles (tagged Webfusion) on how to do this. You really need to do some reading and research -- the Apache docs, articles and HowTos like mine -- to help you get started and then come back with specific Qs, plus the supporting info that we need to answer them.
It sounds like you are using a 123 redirector service, or an equivalent, for scene63.com, which hides the redirection in an iframe. The issue is that if the links on your site are site-relative, then because the URI has been redirected to http://designedbyross.co.uk/blog63/..., any new pages will be homed under designedbyross.co.uk. (I had the same problem with my wife's business site, which mapped the same way to one of my subdirectories.)
What you need to do is to configure the blog so that its site base is http://scene63.com/ and to force explicit site-based links so that any hrefs in the pages are of the form http://scene63.com/page2, etc. How you do this depends on the blog engine, but most support this as an option.
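For reference, here is a sketch of the standard WordPress rewrite block using the RewriteBase directive your host mentioned, assuming WordPress is meant to answer at the root of scene63.com (the exact base depends on how the mapping ends up being configured):

    # Standard WordPress permalink rules
    RewriteEngine On
    RewriteBase /
    RewriteRule ^index\.php$ - [L]
    # Hand requests for non-existent files/directories to WordPress
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /index.php [L]

On top of that, WordPress itself needs its site/home URL set to http://scene63.com/ so that the links it generates are absolute.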
It turned out to be a 123-reg problem at the time: they were not correctly applying changes to the DNS.
We have our application stored on our server; it is an .exe file. The download page is only accessible from our site, using cookie authentication in PHP. I know there are better methods, but there is a long story behind this... so I'm moving on. The issue is that the actual URL of the .exe has been leaked and is appearing on other websites. What is the best method to protect a link to a file, rather than the page itself? That is where I'm having issues. I can make it difficult to get to the download page (with the link), but I don't know where to begin to make sure the link itself is only accessible from our site... Is .htaccess (preventing hotlinking) the best way to go?
Yes, .htaccess is probably best. Find any online post about protecting images from hotlinking; the first result in my Google search looks like a nice and easy auto-generator you can use. Just change the image extensions to .exe, or keep them if you want the images protected too.
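A typical generated rule set looks roughly like this, assuming Apache with mod_rewrite and with yoursite.com standing in for your real domain:

    RewriteEngine On
    # Let requests with an empty Referer through (direct visits, privacy tools)
    RewriteCond %{HTTP_REFERER} !^$
    # Let requests coming from your own pages through
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yoursite\.com/ [NC]
    # Everything else asking for an .exe gets a 403 Forbidden
    RewriteRule \.exe$ - [F,NC]

Bear in mind that an empty or forged Referer header will still get through, so this raises the bar rather than locking the file down completely.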
If I buy a hosting (+ domain) package for a friend's website, and then decide to use the remaining web space and MySQL databases for my own development and testing...
will Google index my development websites (in other folders and sub-URLs) under his website?
What is the downside of developing on a server that already hosts a production website? I was thinking of creating a TinyURL linking to www.myfriendwebsite.com/mydevelopmentSite in order to hide the real URL.
Thanks
If you don't link to it, don't submit it to Google, and don't list it in a sitemap, Google won't find it.
But you could also just use a robots.txt file to tell Google not to index it.
http://en.wikipedia.org/wiki/Robots_exclusion_standard
Update: to stop Google and malicious bots:
Disallow a directory in robots.txt for all user agents (*), and then put your site in a hard-to-guess subdirectory of that directory. Also, don't keep directory browsing on.
And don't link to it anywhere. You perhaps can't stop others from linking; in that case, only robots.txt will keep you out of Google, while malicious bots can still find the site from the link.
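A sketch of that setup, with the directory names invented for the example: robots.txt disallows a parent directory for every crawler, and the actual dev site lives in an unguessable folder beneath it that is never listed or linked anywhere:

    # robots.txt -- keep well-behaved crawlers out of /dev/ entirely
    User-agent: *
    Disallow: /dev/

The development site itself would then sit in something like /dev/x7q2krt/, which does not appear in robots.txt, so crawlers that obey the standard stay out and the full path is hard to guess.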
Your hosting provider may have forbidden that in his Terms of Service (mine has). Other than that, I'd go for a subdomain instead of a subdirectory (like mydevelopmentsite.myfriendswebsite.com).
Now, I didn't do the website design, but a couple of months ago I ported an existing website over to WordPress for a client of mine.
I got a call from the client today regarding their website and some sort of security problem.
The website's homepage loads up fine, but if you try to navigate to any other page it brings you to http://secure.wheelerairservice.com/main.php.
The nav still appears to link to the appropriate pages (when you roll over Contact Us, the link displays in the status bar as /contact-us), but it redirects to the above URL.
Just wondering if anyone knows what the problem is, and who or what might have done this and how.
Any suggestions on how I could fix this?
thanks!
OK, I've looked into the problem some more and found that the .htaccess file had been replaced somehow. I'm just wondering how someone might have done this: via FTP access, a WordPress admin account, or some hole in WordPress? Any thoughts?
Typically, when it's the .htaccess files that have been infected, it's the result of stolen (compromised) FTP credentials.
This usually happens through a virus on a PC that has FTP access to the infected website. The virus works in a variety of ways, but usually one of two.
First, the virus knows where free FTP programs store their saved login credentials. For instance, with FileZilla on a Windows XP PC, look in:
C:\Documents and Settings\(current user)\Application Data\FileZilla\sitemanager.xml
In there you'll find, in plain text, all the websites, usernames and passwords that user has accessed via FTP with FileZilla.
The virus finds these files, reads the information and sends it to a server, which then uses the credentials to log in to the website(s), downloads specific files, in this case the .htaccess files, infects them and then uploads them back to the website. Oftentimes we've seen the server also copy backdoors (shell scripts) to the website. This gives the hacker remote access to the website even after the FTP passwords have been changed.
Second, the virus works by sniffing the outgoing FTP traffic. Since FTP transmits all data, including username and password, in plain text, it's easy for the virus to see and steal the login information that way as well.
Change all FTP passwords immediately
Remove the infection from the .htaccess files
Perform a full virus scan on all PCs used to FTP files to the infected website
If the website has been listed as suspicious by Google, request a review from Google's webmaster tools.
If the hosting provider supports it, switch to SFTP which encrypts the traffic making it more difficult to sniff.
Also, look at all files for anything that doesn't belong there. It's difficult to find backdoors because there are so many different ones. You can't go by the datetime stamp either, because these backdoors modify the datetime stamps of files. We've seen infected files with exactly the same datetime as other files in the same folder. Sometimes the hackers will set the datetime stamp to some random earlier date.
You can search files for the following strings:
base64_decode
exec
fopen
fsock
passthru (for .php files)
socket
These are somewhat common strings in backdoors.
Change your passwords. See Hardening WordPress, FAQ: My site was hacked (WordPress Codex), and How to completely clean your hacked WordPress installation.
If FTP has been used to access/modify the files on this WordPress site, then it is more than possible that someone has got hold of the FTP username and password and modified your .htaccess file. FTP is not secure at all; I would suggest using SFTP as a minimum.
WordPress is not perfect (not many things are), but I highly doubt there would be a flaw like this; it's possible, but I very much doubt it.
I suggest you first change your FTP username/password, upgrade WordPress to the latest version, change the default admin username to something else, and change the password for the administrator user, ensuring that all passwords are at least 8-10 characters in length.
We were also getting the same problem with a WordPress website; once the virus was removed it re-attacked again. So, as said above, you first have to back up all files, then change the FTP, administrator and cPanel passwords, then upload the website back. I did the above steps for our website.