.htaccess denied vs out of wwwroot - linux

I need to upload a highly sensitive data file to my server, to be used by PHP scripts. Please tell me which is the most secure approach, and why:
Putting it in a folder not under www root
Putting it under www root but denying access with .htaccess rule
Thank you very much

If you have ftp/sftp/ssh access to this server, there's no reason not to put it outside of the www document root. If you have a hosting service that only grants you access to the www document root, then you'll have to go with the second solution, but the first is much more secure.
An .htaccess rule can be bypassed if there is a vulnerable script on your site. There are tools that, if an attacker manages to place them inside the document root, let files there be uploaded or even replaced remotely. For example, there are PHP "remote file managers" that allow a remote attacker to change permissions, edit, or replace existing files, including your .htaccess file. If you're running something like WordPress or another CMS, which aren't exactly super secure by themselves and have a lot of third-party plugins, those plugins could be vulnerable to attack, and if you happen to be using one that is, your .htaccess access restrictions could be bypassed.
When the sensitive information is outside of the document root, an attacker who has access to the document root won't be able to reach those files, and vulnerabilities in your scripts are a lot less likely to affect access to them. It would most likely take a system-level exploit to gain access to files outside of the document root.
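If you're forced into the second option, a minimal .htaccess sketch (the filename secret-data.ini is a placeholder; use your own) that refuses direct HTTP requests to the file would be:
# Refuse every direct HTTP request for the sensitive file (placeholder name)
<Files "secret-data.ini">
Order allow,deny
Deny from all
</Files>
Your PHP scripts can still read the file through its filesystem path; only requests arriving through the web server are blocked. As explained above, though, this protection only holds as long as the .htaccess file itself can't be replaced.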

Related

How can I make my domain secure, and invisible?

My problem is that if I type my domain without any slashes, it shows the complete folder and file structure, so we can say it is not really secure. I've managed to encrypt the folders, but not the main domain. In cPanel I don't find where I can make it secure and invisible. Please don't downvote me for this question, I'm new.
If you want to prevent users from viewing the directory listing, you should select the no indexing option in cpanel. See this link for more information: http://www.inmotionhosting.com/support/edu/cpanel/remove-index-listing.
You can also password protect your domain using .htaccess and .htpasswd
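If your host allows .htaccess overrides, a rough sketch of both measures looks like this (the .htpasswd path is a placeholder and should ideally point outside the document root):
# Turn off automatic directory listings
Options -Indexes
# Ask for a username and password before serving anything
AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/youruser/.htpasswd
Require valid-user
Some shared hosts don't allow the Options override in .htaccess, in which case the cPanel setting mentioned above is the way to go. The .htpasswd file itself is created with the htpasswd utility or via cPanel's password-protection tool.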

Change Joomla Administrator URL

Update:
Since this question was asked, a Joomla Stack Exchange site has been set up and the same question exists there; please add any answers or comments to that question.
Original:
I am using Joomla 3.0.3 for a fairly big new client, and security is a must. I therefore decided to try to change the administrator URL, normally
example.com/administrator
changed to
example.com/newadminurl
The reasoning is that if the folders aren't where potential hackers expect, that is the first hurdle before they can even try anything else.
However, that now means that whenever I go to the new URL it brings up a 403 error. I have tried searching for a global config setting I need to change, but can't find anything on the web or the Joomla site. Does anyone know how to change this deep down in the source code?
Step 1. Create a new directory in your root directory (e.g. "newadminurl")
Step 2. Create an index.php file in your "newadminurl" directory:
<?php
// A long random value known only to you (the value here is just an example).
$admin_cookie_code = "3429020892";
// Set a session cookie (expires when the browser closes), valid site-wide.
setcookie("JoomlaAdminSession", $admin_cookie_code, 0, "/");
// Then send the browser on to the real administrator login.
header("Location: /administrator/index.php");
?>
Step 3. Add this to .htaccess of your real Joomla administrator directory
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/administrator
RewriteCond %{HTTP_COOKIE} !JoomlaAdminSession=3429020892
RewriteRule .* - [L,F]
Explanation:
Now you need to open "http://yoursite.com/newadminurl/" before you open your "administrator" path. The script above sets a cookie that expires at the end of the session and redirects you to the actual administration page, and the .htaccess rules make the real "administrator" path return 403 Forbidden unless that cookie is present. In other words, the admin area stays inaccessible until you first visit your secret link.
I hope this is what you were looking for.
While there are hacks around that do this, they introduce new security issues as the Joomla! core isn't built to work this way.
In fact, it is common practice both in the core and in third-party extensions and templates to load models, controllers and other assets from /administrator.
The best practice to secure your site is:
Keep your Joomla! installation up to date (outdated installs are the most common cause of compromise)
Don't hack core files; if you need extra functionality, duplicate the core component and extend the copy, not the core.
Add a realm password on /administrator
Require a secret word on the /administrator URL, e.g. /administrator/?s3cr3tpa55w0rd (a sketch follows this list)
Add an IP whitelist that allows only select IP addresses to access /administrator
Use unique and strong passwords
Don't share passwords even with your significant other...
Enact a password policy on your site.
Keep a tested and regular site backup in an off-server storage location.
Run a file scanner to help you detect a hack, so that you know when your last good backup was taken.
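As a rough illustration of the secret-word idea from the list above (the key s3cr3tpa55w0rd, the cookie name and the domain are placeholders; this is a sketch, not a drop-in solution), an .htaccess in /administrator could set a cookie when the key appears in the query string and refuse every request that carries neither the key nor the cookie, similar in spirit to the cookie trick in the earlier answer:
RewriteEngine On
# When the secret key is in the query string, set a session cookie and allow the request
RewriteCond %{QUERY_STRING} s3cr3tpa55w0rd
RewriteRule .* - [CO=JoomlaAdminKey:1:.yoursite.com:0:/,L]
# Otherwise refuse requests that don't carry the cookie
RewriteCond %{QUERY_STRING} !s3cr3tpa55w0rd
RewriteCond %{HTTP_COOKIE} !JoomlaAdminKey=1
RewriteRule .* - [F]
Anyone (or any script) that knows neither the key nor the cookie gets a 403, while a bookmark of /administrator/?s3cr3tpa55w0rd keeps working for you.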
You can find extensions that do one or several of these things for you in the Access & Security section of the Joomla! Extension Directory (JED), and for integrated backup to cloud or other storage you can't go past Akeeba Backup (and personally for the tiny fee compared to the cost of my time we always go with the Pro versions).
In fact, Akeeba's Admin Tools Pro (included in any of their subscriptions) also provides most of the features on that list through its WAF (web application firewall). The only area not covered is password management, for which there are several solutions available.
There might be all sorts of dependencies in the core and in third-party extensions that hard-code the admin path, even though there are platform variables to assist with this.
I would recommend that you instead configure your .htaccess to prevent public viewing of your administrator folder and restrict access to approved IP addresses only. This will prevent outsiders from reaching the admin folder directly, but of course will not protect against attacks which do not require direct access (e.g., a third-party extension that calls code in a component's admin folder from the front end).
Note: This goes in the .htaccess file in your administrator folder not the .htaccess in the site root, i.e. [siteroot]/administrator/.htaccess
Here is an example of the .htaccess you may configure:
ErrorDocument 403 http://www.your-ip-is-not-allowed-to-access-this-section.com
Order deny,allow
Deny from all
Allow from X.X.X.X
Where X.X.X.X is the IP address you want to allow into the admin section. You can specify multiple addresses with multiple Allow from X.X.X.X lines.
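Note that Order/Deny/Allow is the Apache 2.2 syntax. If the server runs Apache 2.4 or later (and AllowOverride permits it), the rough equivalent uses the Require directive instead:
ErrorDocument 403 http://www.your-ip-is-not-allowed-to-access-this-section.com
# Only the listed address may reach /administrator; everyone else gets a 403
Require ip X.X.X.X
Additional addresses can be listed on the same line, separated by spaces.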

My .htaccess is changed over and over and over

I hope you can help me. I have a website whose .htaccess file is constantly hacked to redirect to another page; every time I delete that file and check again five minutes later, it has been rewritten to redirect to a page with malware. I have changed the SFTP, site and database passwords several times, from different computers running Windows and Linux, but the .htaccess on the main page keeps being changed and hacked .htaccess files keep appearing in the subdirectories. Why does this keep happening? HELP
The web page is hosted on Dreamhost.
If the permissions on your .htaccess file are set so that only you can modify it, then you will find one of the following:
An entry in your FTP access log showing .htaccess being uploaded
An entry in the control panel access log showing the .htaccess being edited
An entry in your HTTP access log at the time that the modification happens (often a POST, but not necessarily). This is often to a generalised backdoor process of some sort.
A crontab entry that makes this modification
Additionally, you will find that your site was hacked somehow - e.g. an insecure version of the JCE editor, poor passwords, NoNumber extensions, a Flash uploader, failure to update for known security problems, or similar. It's all in the logs. You will also find a stack of little PHP files or an extra admin account that will let the attacker back in once you sort out the obvious part of the problem.

Dynamic .htaccess subdomain security

I have yet another .htaccess question, simple for you, not so much for me.
Let's say my main site is found at http://domain.com. I do all of my pre-release testing at sandbox.domain.com and sandbox.domain.co. I just realized that Google has gone ahead and indexed my sandbox sites... Ugggh!
The document root folder on my Apache server for the live site is always called ALIVE, and in order to make my sandbox contents live I quickly rename the folders, i.e. ALIVE->OLDx, SANDBOX->ALIVE.
My goal is to prevent indexers and users from accessing my sandbox pages. I am trying to design a .htaccess file for document root that only allows my ip address when accessed from a sandbox subdomain (sandbox.domain.com), otherwise it allows everyone when accessed from the main domain (domain.com). This would eliminate the process of remembering to update the .htaccess file each time I release a new site.
This doesn't seem too difficult, but I haven't been able to find the right combination. Any pointers in the right direction will be much appreciated!
Create a .htaccess inside each folder (like sandbox):
order deny,allow
deny from all
allow from YOUR IP HERE
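That works, but when you rename SANDBOX to ALIVE you have to remember to remove or edit that file again. If you'd rather keep a single .htaccess in the document root that only restricts requests arriving via the sandbox hostname, a rough mod_rewrite sketch (X.X.X.X is a placeholder for your own IP address) would be:
RewriteEngine On
# Only apply the restriction when the folder is reached via a sandbox hostname
RewriteCond %{HTTP_HOST} ^sandbox\. [NC]
# Let your own IP address through
RewriteCond %{REMOTE_ADDR} !^X\.X\.X\.X$
# Everyone else gets 403 Forbidden on the sandbox hostnames
RewriteRule .* - [F]
Because the rule only fires when the Host header starts with sandbox., the same file can stay in place when the folder is renamed to become the live document root, so there is nothing to remember at release time.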

Secure file downloads in dotnetnuke

I'm relatively new to dotnetnuke and am trying to set up a simple site which will have multiple user groups with their own set of files and then another user that has access to all files.
I'm currently playing with doing this with the "documents" module and hiding the module from all but the everything user and the specific company user. This works fine but the security seems to be just security by obscurity.
Say I log in as user A, get access to file A, and copy its URL. I then log out and log in as user B, who can't see that file. If I then put the file URL into the browser, it seems to download fine.
Can anybody tell me if I am doing something wrong, or is there no actual user-based security on file downloads? I've tried going to the actual file manager and making the directories explicitly not viewable to user B (they are secure directories too) but still it persists. Am I missing a permissions option at the file level somewhere, or is the security designed just to prevent you from finding the right links to the files? I'll admit the links aren't guessable (no sequential IDs in the URL or anything silly like that) but I'm still a little uncomfortable with the security working like this...
DNN FileManager Module
Hi Chris,
Please check out the FileManager module at the link above. You are correct that the current FileManager module does not allow access per user role. You might check Snowcovered for possible substitutes.
It seems that I was doing something wrong. I was referencing a different version of the file, which didn't have any permissions attached to it. It also seems that I don't need multiple Documents modules, since a file without read permission is simply hidden in the list.
So to summarise: the DNN Documents module will do role-based security to prevent unauthorised users from downloading a file or from seeing it in the documents view.
The Documents module provides security for LinkClick.aspx URLs, which are routed through ASP.NET.
If the actual files reside in the file system under the site's root folder, direct URLs to those files are served, and secured, by IIS.
To prevent unauthorized access to direct URLs you can disable anonymous authentication and set up Basic authentication with NTFS permissions, for example.
If you don't want to touch IIS and administer Windows accounts, you can't store the files directly under any publicly accessible IIS folder. Security at the ASP.NET application level is then implemented by encrypting the files or by storing them outside the public IIS folders, for example in the database. The DNN File Manager offers both of these options: secure folders in the file system and secure folders in the database.
There are also 3rd party modules to manage file security and sharing, like NukeTransfer.
