Payment gateway blocked by mod_security when trying to request a WooCommerce endpoint - .htaccess

My payment gateway is blocked by mod_security when trying to access a WooCommerce endpoint.
I receive a 403 permission denied error when trying to access the "/wc-api/my_gateway_payment_callback" endpoint.
I'm on a LiteSpeed shared host.
When I disable mod_security from .htaccess:
<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>
it solves the issue but exposes the WordPress admin to attacks, so I want to be more specific.
I tried to add a LocationMatch:
<LocationMatch "/wc-api/my_gateway_payment_callback">
<IfModule mod_security.c>
SecRule REQUEST_URI "#beginsWith /wc-api/my_gateway_payment_callback/" \"phase:2,id:1000,nolog,pass, allow, msg:'Update URI accessed'"
</IfModule>
</LocationMatch>
or
<IfModule mod_security.c>
SecRule REQUEST_URI "#beginsWith /my_gateway_payment_callback" \"phase:2,id:1000,nolog,pass, allow, msg:'Update URI accessed'"
</IfModule>
but they don't work and I'm still getting the 403 error.

I can spot multiple problems here:
<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>
Are you really using ModSecurity v1? That is VERY old and suggests you are using Apache 1, as ModSecurity v1 is not compatible with Apache 2. If not, this should be:
<IfModule mod_security2.c>
SecRuleEngine Off
</IfModule>
Next you say:
it solves the issue but exposes Wordpress admin to attacks
I don't see how it can solve the issue unless you are on REALLY old software, so I suspect this is a red herring.
so I want to be more specific. I tried to add a LocationMatch
Good idea to be more specific. However, LocationMatch runs quite late in Apache's request processing, after the ModSecurity rules will already have run, so this will not work. You don't really need LocationMatch anyway, since your rule already scopes itself to that location. So let's look at the next two pieces:
SecRule REQUEST_URI "#beginsWith /wc-api/my_gateway_payment_callback/" \"phase:2,id:1000,nolog,pass, allow, msg:'Update URI accessed'"
SecRuleRemoveById 3000
You shouldn't need to remove the rule if you allow it on the previous lines. Typically you would only do one or the other.
or
<IfModule mod_security.c>
SecRule REQUEST_URI "#beginsWith /my_gateway_payment_callback" > \
"phase:2,id:1000,nolog,pass, allow, msg:'Update URI accessed'"
</IfModule>
but they don't work and I'm still getting the 403 error.
You have pass (which means continue on to the next rule) and allow (which means skip all future rules). It seems to me you only want the latter and not the former. As these are conflicting, I suspect ModSecurity will action the former first, which is why it is not working.
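For illustration, a minimal sketch of the allow-only variant (assuming ModSecurity v2, that the host honours SecRule in .htaccess, and reusing the rule id 1000 from the question) could look like this:
<IfModule mod_security2.c>
# allow without pass stops rule processing for matching requests;
# phase:1 makes it take effect before rules in later request phases
SecRule REQUEST_URI "@beginsWith /wc-api/my_gateway_payment_callback" \
    "phase:1,id:1000,t:none,nolog,allow,msg:'Payment callback allowed'"
</IfModule>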
However, the better way is to look at the Apache error logs to see which rule it's failing on (is it rule 3000, as per your other LocationMatch workaround?) and just disable that one rule rather than disabling all rules for that route.
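As a sketch of that more targeted approach (the id 3000 is only the placeholder from the workaround above, and 10001 is an arbitrary unused rule id; substitute whatever id the error/audit log actually reports):
<IfModule mod_security2.c>
# Remove only the offending rule, and only for the callback URL
SecRule REQUEST_URI "@beginsWith /wc-api/my_gateway_payment_callback" \
    "phase:1,id:10001,t:none,nolog,pass,ctl:ruleRemoveById=3000"
</IfModule>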
So, all in all, I'm pretty confused by your question, as there seem to be a lot of inconsistencies and things that are just wrong in there...

Related

Is this .htaccess rule valid?

I was doing a CTF on HackTheBox and came across an .htaccess rule that is puzzling me:
SetEnvIfNoCase Special-Dev "only4dev" Required-Header
Is it valid? And if so, what is it supposed to do? I am most puzzled by the Required-Header at the end of the line.
The above rule is the first in the .htaccess file. For reference, here is the rest of it:
Order Deny,Allow
Deny from All
Allow from env=Required-Header

Apache2.4 htaccess - negative ifModule executes regardless

Consider the following Apache conf directives inside .htaccess:
ErrorDocument 403 /dbug.html
<IfModule !mod_php5.c>
Require all denied
</IfModule>
It denies access even though mod_php5 is active, but ignores the ErrorDocument.
If I remove the ! it triggers the ErrorDocument 403, which it should, but this is backwards and wrong.
Any answer/advice would be appreciated, thanks.
In some shared-hosting environments, live module version switching (hot switching) is available. As a result, the main module name (the module handler) may be different from the target module name, and the target module will only show up as present when it is called by the module handler.
The solution is to track down the name of the module handler and reference that instead; contact the hosting provider to find it. In this case, the module handler is mod_php_null (Hetzner), so <IfModule !mod_php_null.c> will work as expected. But to set directives for the target module itself, use the target module name; then <IfModule !mod_php7.c> will work as expected.
If there is no module handler for such a module, then referring to the target module directly in both conditions should work, as it gets loaded when the server daemon starts up.
The question was also posted on Server Fault; to avoid duplication, the .htaccess code is posted there.

What exactly does the MultiViews option in .htaccess do?

I've been struggling a lot with an access rule that needed to rewrite one piece of the URL by adding a path.
RewriteRule ^(configuration/.+)$ application-server/$1 [L,NC,R=301,NE]
This rule caused just a blank page on my Joomla site, with no error log or messages.
The curious thing is that all other rules I had worked perfectly:
RewriteRule ^(log/.+)$ application-server/$1 [L,NC,R=301,NE]
RewriteRule ^(monitor/.+)$ application-server/$1 [L,NC,R=301,NE]
In the end, I found a suggestion in a forum to use the following option:
Options -Multiviews
That actually solved the issue; however, I wonder if there can be any side effects on other rules when using this option.
This is about Apache content negotiation.
A MultiViews search is where the server does an implicit filename pattern match, and chooses from amongst the results.
For example, if you have a file called configuration.php (or with another extension) in the root folder and you set up a rule in your .htaccess for a virtual folder called configuration/, then you'll have a problem with your rule, because the server will choose configuration.php automatically (if MultiViews is enabled, which is the case most of the time).
If you want to disable that behaviour, you simply have to add this in your .htaccess:
Options -MultiViews
This way, your rule will now be evaluated, because content negotiation is disabled.
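Putting that together with the rule from the question, a minimal sketch of the relevant .htaccess section would be:
# Disable content negotiation so /configuration/... is no longer mapped to
# configuration.php before mod_rewrite sees it
Options -MultiViews

RewriteEngine On
RewriteRule ^(configuration/.+)$ application-server/$1 [L,NC,R=301,NE]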
Edit
On some shared hosts, the negotiation module might not be enabled, which would then give you a 500 error. To avoid this error, you can wrap the directive in an IfModule block:
<IfModule mod_negotiation.c>
Options -MultiViews
</IfModule>

X-Robots noindex specific page in .htaccess

Can I 'noindex, follow' a specific page using X-Robots-Tag in .htaccess?
I've found some instructions for noindexing types of files, but I can't find instructions for noindexing a single page, and what I have tried so far hasn't worked.
This is the page I'm looking to noindex:
http://www.examplesite.com.au/index.php?route=news/headlines
This is what I have tried so far:
<FilesMatch "/index.php?route=news/headlines$">
Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
Thanks for your time.
It seems to be impossible to match the request parameters from within a .htaccess file. Here is a list of what you can match against: http://httpd.apache.org/docs/2.2/sections.html
It will be much easier to do it in your script. If you are running on PHP try:
header('X-Robots-Tag: noindex, follow');
You can easily build conditions on $_GET, REQUEST_URI and so on.
RewriteEngine on
RewriteBase /
#set env variable if url matches
RewriteCond %{QUERY_STRING} ^route=news/headlines$
RewriteRule ^index\.php$ - [env=NOINDEXFOLLOW:true]
#only sent header if env variable set
Header set X-Robots-Tag "noindex, follow" env=NOINDEXFOLLOW
FilesMatch works on (local) files, not URLs, so it would try to match only the /index.php part of the URL. <Location> would be more appropriate, but as far as I can tell from the documentation, query strings are not allowed there. So I ended up with the solution above (I really liked this challenge). PHP would be the more obvious place to put this, but that is up to you.
The solution requires mod_rewrite, and mod_headers of course.
Note that you'll need the mod_headers module enabled to set the headers.
Though, like others have said, it seems better to use the PHP header. Does that not work?
According to Google the syntax would be a little different:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag

Htaccess rewrite does not work

I was told that this is the right way to redirect anyone who is trying to open:
/users/username/something.txt
But I can't seem to get it to work.
RewriteEngine on
RewriteRule \.txt$ /notallowed.html [F,L,NC]
Is this wrong?
The simplest way to deny users from all TXT files would be to use something like:
<FilesMatch "\.(txt)$">
Order Allow,Deny
Deny from all
</FilesMatch>
However, the code you have there should work for all intents and purposes. Depending on your server configuration, you may need to add "Options +FollowSymLinks".
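If FollowSymLinks does turn out to be needed, a sketch of the rewrite approach from the question with that option added would be:
# Some hosts disable FollowSymLinks by default, which stops mod_rewrite
# rules in .htaccess from being applied
Options +FollowSymLinks

RewriteEngine on
RewriteRule \.txt$ /notallowed.html [F,L,NC]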
If you decide to go the FilesMatch route, you can use ErrorDocument to control what page the user is taken to.
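And a sketch of the FilesMatch route combined with an ErrorDocument, reusing the /notallowed.html page from the question:
# Serve a custom page for the 403 produced by the deny
ErrorDocument 403 /notallowed.html

<FilesMatch "\.txt$">
Order Allow,Deny
Deny from all
</FilesMatch>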
