I am using NextCloud to store some personal files. For security purposes, I decided to limit access to POST and a couple of other HTTP methods for my visitors.
The issue is that if I share a file and make it password protected, a visitor also needs POST access besides GET access.
A shared, password-protected URL in NextCloud looks something like this:
https://MyDomainName.ltd/index.php/s/KwDGEW42xNfExA/authenticate/showShare
Here is my code:
SetEnvIf Request_URI ^/index.php/s/$1 allow-it
<Limit POST >
Order allow,deny
allow from MyIP [OR]
allow from MySecondIp [OR]
allow from MyThirdIP [OR]
Allow from env=allow-it
</Limit>
I want to allow the POST method for everything that comes after /index.php/s/, besides full access for my own IP addresses that I currently have.
The /index.php/s/ part of my URL is not a real directory, so I can't go inside the directory to add my rules.
Thanks.
I resolved my own issue by changing the code to:
SetEnvIfNoCase Request_URI ^/index\.php/s/ allow-it
<Limit POST>
order deny,allow
deny from all
allow from MyFirstIP
allow from MySecondIP
allow from env=allow-it
</Limit>
If someone else has the same concern: you only need to allow HEAD and GET access for downloading a file. Everything else can be limited using the same method.
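For example, a minimal sketch of that idea (reusing the placeholder IPs and the allow-it environment variable from above) which restricts every method other than GET and HEAD:
# Sketch: deny all methods except GET and HEAD,
# but keep access for the listed IPs and the shared-link URLs flagged by allow-it
<LimitExcept GET HEAD>
order deny,allow
deny from all
allow from MyFirstIP
allow from MySecondIP
allow from env=allow-it
</LimitExcept>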
My website name is: cabinets.ga
I want to deny access to all of the folders and files on my website, so this is my code to do that:
Order deny,allow
Deny from all
<FilesMatch "index\.php">
Allow from all
</FilesMatch>
This code works fine if the user writes the link with index.php, like cabinets.ga/index.php, but if he writes only the domain name without index.php, like cabinets.ga, he gets a Forbidden error.
So I want the website to display index.php, without a Forbidden error, whether he enters the domain name with index.php or without it. Any help please?
You can actually just make the filename optional in the regex (you don't need to use mod_rewrite). For example:
Order deny,allow
Deny from all
<FilesMatch "^(index\.php)?$">
Allow from all
</FilesMatch>
This will allow direct requests to index.php and also requests for the directory (no filename) ...which results in index.php (the DirectoryIndex) being served by mod_dir via an internal subrequest (which occurs later).
Note that you can't simply permit an empty filename (ie. "^$"). Whilst this allows the initial request for the bare directory, the internal subrequest for the DirectoryIndex (ie. index.php) is then blocked - so ultimately the request is blocked.
Note also that this allows access to all index.php files in all subdirectories and all directories that contain an index.php index document.
However, if you are on Apache 2.4 then you should be using Require instead, since Order, Deny and Allow are all deprecated on Apache 2.4.
Require all denied
<FilesMatch "^(index\.php)?$">
Require all granted
</FilesMatch>
UPDATE: My website is a single page application: just one index.php page controls my whole website with jQuery AJAX requests. So I want the .htaccess to redirect the user to the domain name whenever they enter any link other than the domain name itself.
It sounds like you need to implement a front-controller pattern. The simplest form is using the FallbackResource directive. For example:
FallbackResource /index.php
Any requests that would otherwise result in a 404 are routed to /index.php. Any static resources (CSS, JS, images etc.) remain accessible and are not routed to /index.php.
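If FallbackResource is not available (it requires Apache 2.2.16 or later), a rough mod_rewrite sketch of the same front-controller pattern (assuming index.php is in the document root) would be:
RewriteEngine On
# Send requests for files/directories that don't exist to the front controller
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]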
I have an XML file that I want users to access only via my WordPress deployment - but not if they try to access the file directly.
Is that possible?
I tried the following but it doesn't appear to work:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^https?://my-domain.com/ [NC]
RewriteRule \.(xml)$ - [NC,L,F]
Any suggestions? Thanks in advance, love this community.
There are two ways:
Put the sensitive file outside of the web root. The httpd follows the local file system hierarchy and privileges, not its own virtual structure.
If your host does not allow this, you can block any HTTP requests to the file using .htaccess. Put the file in a separate directory with a .htaccess file that blocks everybody (or allows only a list of IPs, etc.):
<Files important.xml>
Order allow,deny
Deny from all
</Files>
Since the web server can still read the file, PHP can access and process it too.
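On Apache 2.4, where Order/Deny/Allow are deprecated (as noted in an earlier answer), the equivalent block should be:
<Files important.xml>
# Block all HTTP requests for this file (Apache 2.4 syntax)
Require all denied
</Files>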
I have a contact form on my website with file attachment as well, which has been restricted to pictures only. However, if I type in example.com/uploads/, all the files are accessible by anyone. Is .htaccess the best way to hide it? Also, how could I do that in a safe manner, without messing up the contact form?
I have tried this, but it blocks the whole website:
deny from all
<Files ~ "^w+.(gif|jpe?g|png)$">
order deny,allow
allow from all
</Files>
if I type in example.com/uploads/ all the files are accessible
You mean you get a directory listing? This can be disabled in .htaccess:
Options -Indexes
To actively block all HTTP requests for files in the /uploads directory (since you state in the comments that these are only ever accessed over FTP), all you need is the following in your root .htaccess file:
RewriteEngine On
RewriteRule ^uploads - [F]
This will respond with a 403 Forbidden for all requests that start with /uploads.
To block access to just example.com/uploads/ itself, you can place this rule in /uploads/.htaccess:
RewriteEngine On
RewriteRule ^/?$ - [F]
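Alternatively, if no HTTP access to anything inside /uploads is needed at all, a simpler sketch (Apache 2.4 syntax, placed in /uploads/.htaccess, assuming your AllowOverride settings permit it) would be:
# Deny all HTTP requests for everything in this directory
Require all denied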
I have recently installed Solr on my server and I want to use .htaccess to restrict it so that only local users can access it:
site.com:8983/solr/admin [restrict all users]
And below is the .htaccess code:
RewirteRule on
<FilesMatch "127.0.0.1:8983/solr/admin">
Order Deny, Allow
Deny form all
Allow 127.0.0.1
</FilesMatch>
Or is there any other method to protect/restrict the Solr Admin on site.com:8983/solr/admin from being accessed by other users?
Only local IP users should be able to use it.
I tried the code above, but it's not working.
Your <FilesMatch "127.0.0.1:8983/solr/admin"> line will never match anything because you've stuck the hostname and port in the regular expression. Try using the Location container instead:
<Location "/solr/admin">
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
</Location>
Or better yet, Directory:
<Directory "/path/to/your/document/root/solr/admin">
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
</Directory>
You'll need to fill in the full path to the solr/admin directory.
Get rid of the RewirteRule on line; it doesn't do anything, it's not even spelled right, and it will cause a 500 error.
However, neither of these containers can be used in an htaccess file. You need to use them in either the server or vhost config. If you must use an htaccess file, then create an htaccess file in your solr/admin directory and simply put these directives in it:
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
Or, in the htaccess file in your document root:
RewriteEngine On
RewriteCond %{REMOTE_ADDR} !127.0.0.1
RewriteRule ^/?solr/admin - [L,F]
Check the following links. Hope they will help you.
Restrict Solr Admin Access
Solr Security
Securing Solr administrative console
How to protect Apache Solr admin console
I was told that this is the right way to redirect anyone who is trying to open:
/users/username/something.txt
But I can't seem to get it to work.
RewriteEngine on
RewriteRule \.txt$ /notallowed.html [F,L,NC]
Is this wrong?
The simplest way to deny users access to all TXT files would be to use something like:
<FilesMatch "\.(txt)$">
Order Allow,Deny
Deny from all
</FilesMatch>
That said, the code you have there should work for all intents and purposes. Depending on your server configuration, you may need to add "Options +FollowSymLinks".
If you decide to go the FilesMatch route, you can use ErrorDocument to control what page the user is taken to.
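For example, a minimal sketch that reuses the /notallowed.html page from the question for blocked requests:
# Serve a custom page whenever access is forbidden (403)
ErrorDocument 403 /notallowed.html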