Is my server-side content safe? - security

I am getting this in my apache error_log and I am using AWS
[Mon Oct 31 08:24:47.120132 2016] [:error] [pid 8216] [client 95.213.177.126:34294] script '/var/www/html/azenv.php' not found or unable to stat, referer: https://proxyradar.com/
I wanted to know what that means.

It looks like a request was made for /var/www/html/azenv.php, which was not found. That's a standard 404: a request for a file that doesn't exist.
As to WHY the request was made, only the sender knows.
See Apache access log full of unauthorized and suspicious requests, how to take action, which suggests that it is automated scripts looking for vulnerabilities of unpatched servers.
The fact that the file was not found should give you some comfort, because that's one vulnerability to which you were not vulnerable.
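If you want a rough sense of how often these scanners are probing you, a quick check against the access log counts the requests per client IP (this assumes the default Amazon Linux / Apache location of /var/log/httpd/access_log; adjust the path for your setup):
grep 'azenv.php' /var/log/httpd/access_log | awk '{print $1}' | sort | uniq -c | sort -rn
Anything that returns a 404 for these probes is already being handled safely; the count is just to gauge the noise level.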

Related

Apache 403 everywhere with XAMPP Linux

I'm fairly new to using Apache; we use XAMPP here, so I need to use that for simplicity's sake. I'm having a problem getting Apache to serve my files. I get this error in the logs for httpd:
[Thu Sep 17 16:16:46.944172 2020] [core:error] [pid 10036] (13)Permission denied: [client ::1:39318] AH00035: access to / denied (filesystem path '/home/mrblob/Documents') because search permissions are missing on a component of the path
[Thu Sep 17 16:16:47.170688 2020] [core:error] [pid 10036] (13)Permission denied: [client ::1:39318] AH00035: access to /favicon.ico denied (filesystem path '/home/mrblob/Documents') because search permissions are missing on a component of the path, referer: http://localhost:81/
My files for my website are in /home/mrblob/Documents/web/
I've got this in my httpd.conf file:
<Directory "/home/mrblob/Documents/web/htdocs">
Options Indexes FollowSymLinks Includes ExecCGI
Require all granted
Order allow,deny
AllowOverride None
Allow from all
</Directory>
I've also tried chmod with different permissions... I've tried a lot of different things... I've yet to get it to work. FYI, phpMyAdmin as well as XAMPP's dashboard work fine; any other page I request throws a 403.
Thanks.
So I worked out I just needed to give Apache access. Well, I knew I needed to do this, but I'd already tried everything I could think of. What finally worked was:
sudo chmod ugo+rwx
which I know is not the most secure way of doing it, but I'm only hosting locally for now, so it's not a huge deal.
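If you want something tighter than ugo+rwx, a sketch of the minimal change (assuming the site really lives under /home/mrblob/Documents/web/htdocs) is to give Apache just the search (execute) bit on every directory in the path, which is exactly what the AH00035 message says is missing, plus read access on the web tree:
chmod o+x /home/mrblob /home/mrblob/Documents /home/mrblob/Documents/web
chmod -R o+rX /home/mrblob/Documents/web/htdocs
The capital X adds execute only to directories (and files already executable), so regular files stay non-executable. Note too that the Directory block above mixes Apache 2.2 directives (Order allow,deny / Allow from all) with the 2.4-style Require all granted; on a 2.4 build the Require line alone is enough.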

Failing miserably with .htaccess rewrite, unsure how to troubleshoot

I have been trying to figure out URL rewriting on my local development site all day with no luck. Initially the .htaccess files were ignored. Now they are being read but not working. I don't know how to troubleshoot an .htaccess file though. From what I have read, it seems Apache 2.4 got rid of specifying your own RewriteLog. The only help I am getting is from /var/log/apache2/error.log which is all Greek to me.
The rewrite I am attempting is simply:
local.domain.com/users/index.php?id=1 -> local.domain.com/users/1/
My .htaccess looks like:
RewriteEngine On
RewriteRule ^users/([0-9]+)/?$ users/index.php?id=$1 [NC,L]
When requesting local.domain.com/users/index.php?id=1 , the URL remains unchanged. The error.log for apache gives the following 3 lines:
[Tue Jun 17 15:20:04.705939 2014] [rewrite:trace3] [pid 6569] mod_rewrite.c(468): [client 127.0.0.1:46208] 127.0.0.1 - - [local.domain.com/sid#b63f02c0][rid#b6b12058/initial] [perdir /var/www/vhosts/domain.com/] strip per-dir prefix: /var/www/vhosts/domain.com/users/index.php -> users/index.php
[Tue Jun 17 15:20:04.705979 2014] [rewrite:trace3] [pid 6569] mod_rewrite.c(468): [client 127.0.0.1:46208] 127.0.0.1 - - [local.domain.com/sid#b63f02c0][rid#b6b12058/initial] [perdir /var/www/vhosts/domain.com/] applying pattern '^users/([0-9]+)$' to uri 'users/index.php'
[Tue Jun 17 15:20:04.705990 2014] [rewrite:trace1] [pid 6569] mod_rewrite.c(468): [client 127.0.0.1:46208] 127.0.0.1 - - [local.domain.com/sid#b63f02c0][rid#b6b12058/initial] [perdir /var/www/vhosts/domain.com/] pass through /var/www/vhosts/domain.com/users/index.php
The location of the .htaccess is /var/www/vhosts/domain.com/ . Is there a way to get better/more log info? Is the /var/log/apache2/error.log really the log I should be using for this? Is the problem really with my .htaccess code or is there some sort of configuration I am missing or something? I know there are similar questions but so far I haven't found one that was both understandable and a solution to my problem.
Thanks in advance!
You are rewriting an incoming URI /users/1/ to /users/index.php?id=1 (SEO form to dynamic form). Your .htaccess looks correct for that (I assume it's in the root).
Are you sure mod_rewrite is enabled on your Apache build? Are you overlooking an error message? And is the URI you actually typed /users/1/?
Your regex is looking for [0-9]+, which means it expects digits after /users/. In your input URL (assuming that's your input URL), /var/www/vhosts/domain.com/users/index.php, there are no digits after '/users'.
Put a number after users like /users/55555/ and see what happens.
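For more trace output than those three lines, Apache 2.4 replaces the old RewriteLog with a per-module log level; a sketch, to be placed in the virtual host or server config (it is not allowed in .htaccess):
LogLevel alert rewrite:trace6
Then request the SEO-style URL directly, e.g. curl -I http://local.domain.com/users/55555/, and watch error.log; the trace will show whether the pattern matched. Requesting index.php?id=1 directly will never match, because, as noted above, there are no digits after /users in that URI.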

max REST URI supported by Apache httpd

I'm chasing a problem where my REST client (curl) program is sending a very long URI (5000 chars) to my Apache httpd 2.2.15 (RHEL 6), which is being denied. From the Apache docs, I read that the default max URI length supported is 8190 (via LimitRequestLine), but when I give a URI of 5000 chars (like somehost/dir1/dir2/dir3/.../dir700/), I get this error in the ssl_error_log file:
[Tue Apr 02 17:29:16 2013] [error] [client 10.0.100.1] (36)File name too long: Cannot map GET <<long URI>> HTTP/1.1 to file
From the Apache code, this looks to be hitting the PATH_MAX (4096) limit of Linux. If that's the case, how can I make sure that URIs are honoured up to 8190 chars? Or is there some other limit which restricts the path to 4096 chars?
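For reference, the request-line cap mentioned above is set in the main server config; a minimal sketch if you wanted to state it explicitly (8190 is already the default):
LimitRequestLine 8190
That directive only governs how long a request line Apache will accept, though; if the URI is then mapped onto the filesystem as a path, the resulting path is still subject to the operating system's PATH_MAX (4096 on Linux), which is what the "File name too long" error above points to.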

Using .htaccess to block referrer spam

Our forum gets targeted a lot by automated bots that try to register automatically.
We can see an example here from the error log
[Sun Apr 03 14:04:46 2011] [error] [client 70.183.110.133] File does not exist: /home/spoilert/public_html/forum/++++++++++++++++++++++++++++++++++++Result:+captcha+decoded+(23+attempts);+registered+(registering+only+mode+is+ON);, referer: http://forum.spoilertv.co.uk/++++++++++++++++++++++++++++++++++++Result:+captcha+decoded+%2823+attempts%29;+registered+%28registering+only+mode+is+ON%29;
[Sun Apr 03 13:45:54 2011] [error] [client 70.183.110.133] File does not exist: /home/spoilert/public_html/2008, referer:
I've updated my htaccess with this code
SetEnvIfNoCase Referer "^http://(W)decoded.*$" banned
Deny from env=banned
It "should" deny any referrer link with the word decoded in it but it seems that it's not working. I still seem to be getting a few of these robots getting through with the same URL so it seems that it's still happening.
What happens if you change it to
SetEnvIfNoCase Referer "\+decoded\+" banned
Deny from env=banned
The plus signs need escaping: unescaped, + is a regex quantifier rather than the literal + that appears in the spam referrer.
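An alternative sketch using mod_rewrite (assuming it is enabled) sends a 403 to any request whose referrer contains the word, without worrying about the surrounding plus signs:
RewriteEngine On
RewriteCond %{HTTP_REFERER} decoded [NC]
RewriteRule .* - [F,L]
The [NC] makes the match case-insensitive, and [F] returns 403 Forbidden for the request.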

Internal Server Error

The error message I get when I try to access "192.168.50.29/cgi-bin/tinyPL.cgi" on the server looks like this:
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please contact the server administrator, root#localhost and inform them of the time the error occurred, and anything you might have done that may have caused the error.
More information about this error may be available in the server error log.
Apache/2.2.11 (Fedora) Server at 192.168.50.29 Port 80
Error_log :
[Sat Oct 24 21:30:47 2009] [notice] suEXEC mechanism enabled (wrapper: /usr/sbin/suexec)
[Sat Oct 24 21:30:47 2009] [notice] Digest: generating secret for digest authentication ...
[Sat Oct 24 21:30:47 2009] [notice] Digest: done
[Sat Oct 24 21:30:48 2009] [notice] Apache/2.2.11 (Unix) DAV/2 PHP/5.2.9 mod_ssl/2.2.11 OpenSSL/0.9.8g mod_perl/2.0.4 Perl/v5.10.0 configured -- resuming normal operations
[Sat Oct 24 21:30:50 2009] [error] [client 192.168.50.69] (13)Permission denied: exec of '/var/www/cgi-bin/tinyPL.cgi' failed
[Sat Oct 24 21:30:50 2009] [error] [client 192.168.50.69] Premature end of script headers: tinyPL.cgi
Could anyone help me with this?
Your log file will have more details regarding the error, but an Internal Server Error on a CGI script usually means that when the server tried to execute your CGI program, the expected headers were not present.
In a perl script, that would be (for example):
use CGI qw(:standard);
print header();
Which will print out something like:
Content-type: text/html
Try running your CGI script from the command line and see if it prints out those lines. The other possible problem is access permissions: Apache might not be able to execute your script.
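Given the (13)Permission denied: exec of '/var/www/cgi-bin/tinyPL.cgi' failed line in your log, the permissions explanation looks likely. A sketch of the usual checks, assuming the path from the log:
chmod 755 /var/www/cgi-bin/tinyPL.cgi
head -1 /var/www/cgi-bin/tinyPL.cgi
restorecon -v /var/www/cgi-bin/tinyPL.cgi
The first makes the script executable by the Apache user, the second should show a valid shebang line such as #!/usr/bin/perl, and the third resets the SELinux context, which on Fedora can block execution even when the file permissions look right.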
