Configure Squid on CentOS to only allow one IP

I set up Squid on CentOS 6.4 using a guide I found through Google. I am using a VPS and connect to it from my home computer to browse anonymously and to reach an FTP server for work. It is working fine; however, right now anyone can connect to the proxy. How do I limit it so that only my home IP is allowed?
Here is my config:
#
# Recommended minimum configuration:
#
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager
http_access allow all
http_access allow localnet
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
# And finally deny all other access to this proxy
http_access deny all
# Squid normally listens to port 3128
http_port 0.0.0.0:3128
# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?
# Uncomment and adjust the following to add a disk cache directory.
#cache_dir ufs /var/spool/squid 100 16 256
# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid
# Add any of your own refresh_pattern entries above these.
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
via off
forwarded_for off
request_header_access Allow allow all
request_header_access Authorization allow all
request_header_access WWW-Authenticate allow all
request_header_access Proxy-Authorization allow all
request_header_access Proxy-Authenticate allow all
request_header_access Cache-Control allow all
request_header_access Content-Encoding allow all
request_header_access Content-Length allow all
request_header_access Content-Type allow all
request_header_access Date allow all
request_header_access Expires allow all
request_header_access Host allow all
request_header_access If-Modified-Since allow all
request_header_access Last-Modified allow all
request_header_access Location allow all
request_header_access Pragma allow all
request_header_access Accept allow all
request_header_access Accept-Charset allow all
request_header_access Accept-Encoding allow all
request_header_access Accept-Language allow all
request_header_access Content-Language allow all
request_header_access Mime-Version allow all
request_header_access Retry-After allow all
request_header_access Title allow all
request_header_access Connection allow all
request_header_access Proxy-Connection allow all
request_header_access User-Agent allow all
request_header_access Cookie allow all
request_header_access All deny all
tcp_outgoing_address public_ip

In the default configuration, which your config is partly based on, there is an ACL called localnet. You could modify the localnet ACL to contain only the source address you connect from instead of the defaults, or you could create your own ACL for your source address. Either way, you need to use it in the http_access rules that grant access.
I see the following http_access rules used in your configuration:
http_access allow manager localhost #1
http_access deny manager #2
http_access allow all #3
http_access allow localnet #4
http_access deny !Safe_ports #5
http_access deny CONNECT !SSL_ports #6
http_access allow localnet #7
http_access allow localhost #8
http_access deny all #9
Rules 4 and 7 are redundant. Rule 3 could be removed once your source address matches localnet, and it also defeats the protections in rules 5 and 6 because it allows everything before they are evaluated. I propose the following, where the localnet ACL has been modified to contain only your source address:
acl localnet src <source_ip>
http_access allow manager localhost #1
http_access deny manager #2
http_access deny !Safe_ports #3
http_access deny CONNECT !SSL_ports #4
http_access allow localnet #5
http_access allow localhost #6
http_access deny all #7
I think that would do the trick.
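One practical note: Squid only reads squid.conf at startup, so the change has to be applied to the running process. On CentOS 6 with the stock squid package (a sketch, assuming the standard init script), something like:
squid -k parse          # check the edited squid.conf for syntax errors
service squid reload    # make the running Squid re-read its configuration
should pick up the new rules without a full restart.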

Related

Allow certain IP ranges and googlebot, deny the rest, for certain URLs only

I need to modify and combine some .htaccess rules, in order to meet 3 separate conditions:
Allow access from certain IP ranges, deny the rest
Allow googlebot, no matter what IP
All this applies only to any URL containing 'theword'
<Directory "containing *theword*">
Order allow,deny
Deny from all
Allow from 192.168.0.2
Allow from 10.0.0.1
</Directory>
<Files "containing *theword*">
Order allow,deny
Deny from all
Allow from 192.168.0.2
Allow from 10.0.0.1
</Files>
I need help formatting the rule so it applies to files/directories containing theword, and adding an exception for Googlebot.
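One possible approach (only a sketch, assuming the Apache 2.2 Order/Allow/Deny style used above, and trusting the Googlebot User-Agent string, which can be spoofed) is to flag Googlebot requests with SetEnvIf and allow them alongside the IPs, with FilesMatch handling the theword part:
# flag requests whose User-Agent claims to be Googlebot
SetEnvIf User-Agent "Googlebot" allowbot
<FilesMatch "theword">
Order Allow,Deny
Allow from 192.168.0.2
Allow from 10.0.0.1
Allow from env=allowbot
</FilesMatch>
With Order Allow,Deny anything that matches no Allow line is denied by default, so no explicit Deny from all is needed. This only covers file names containing theword; matching arbitrary URLs would need mod_rewrite instead.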

How do I block IPs using htaccess

How can I stop these IP addresses from hitting my site (185.*.*.* and 93.127.*.*)?
I have tried these in .htaccess:
Order Allow,Deny
Deny from 185.162.0.0/15
Deny from 185.164.0.0/14
Deny from 185.168.0.0/13
Deny from 185.176.0.0/12
Deny from 185.192.0.0/11
Deny from 93.127.0.0/16
Allow from all
and the bot rules from http://tab-studio.com/en/blocking-robots-on-your-page/, but I continue to be hit.
Edit (or create) the .htaccess file in the root of your server and put the following code in it:
Order Allow,Deny
deny from 210.140.98.160
deny from 69.197.132.70
deny from 74.14.13.236
allow from all
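If the server runs Apache 2.4 or later, Order/Allow/Deny only still works through mod_access_compat; the native equivalent of the block above would be roughly (a sketch using the same addresses):
<RequireAll>
# allow everyone except the listed addresses
Require all granted
Require not ip 210.140.98.160
Require not ip 69.197.132.70
Require not ip 74.14.13.236
</RequireAll>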

How to block access by IP using .htaccess

I need to block access to my site by IP address, but for some reason when I use the following I get a 500 error.
Here is what I am trying to use:
#Deny Access to Adsense SPAM
Order Deny,Allow
deny from 209.51.197.0/24; # XLHOST IP
deny from 209.190.121.32/27; #XLHOST IP
deny from 209.190.0.0/17; # XLHOST IP
deny from 173.45.64.0/18; # XLHOST IP
deny from 64.79.64.0/19; # XLHOST IP
deny from 64.79.89.0/19; # XLHOST IP
deny from 64.79.85.0/19; # XLHOST IP
allow from all
Please help
Try it without the semicolons:
#Deny Access to Adsense SPAM
Order Deny,Allow
deny from 209.51.197.0/24 # XLHOST IP
deny from 209.190.121.32/27 #XLHOST IP
deny from 209.190.0.0/17 # XLHOST IP
deny from 173.45.64.0/18 # XLHOST IP
deny from 64.79.64.0/19 # XLHOST IP
deny from 64.79.89.0/19 # XLHOST IP
deny from 64.79.85.0/19 # XLHOST IP
allow from all
Requests from XLHOST (xlhost.com) identify themselves as Firefox 27.
Put this code in your .htaccess:
RewriteEngine On
# return 403 Forbidden to any request whose User-Agent claims Firefox 27.0
RewriteCond %{HTTP_USER_AGENT} Firefox/27\.0 [NC]
RewriteRule .* - [F,L]
Source webmaster.net

Squid + squidGuard not enforcing safe search on duckduckgo.com

The purpose of this project is to force safe search on major search engines.
I managed to install Squid (version 3.3) and SquidGuard, and configured Squid as a transparent proxy with SSL interception.
I managed to enforce safe search on Google, Yahoo and Bing, but I can't with DuckDuckGo, and I can't find any reasonable explanation (either on my own or on the web).
My squid.conf is:
acl localnet src 192.168.1.0/24 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
acl engines dstdomain .yahoo.com
acl engines dstdomain .duckduckgo.com
acl engines dstdomain .google.com
acl engines dstdomain .bing.com
cache deny all
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost manager
http_access deny manager
log_access allow all
url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
url_rewrite_children 500
http_access allow localnet
http_access allow localhost
http_access deny all
http_port 3129
http_port 3128 intercept
https_port 3130 intercept ssl-bump connection-auth=off generate-host-certificates=on cert=/etc/squid/control.com.au.pem key=/etc/squid/control.com.au.pem cipher=ECDHE-RSA-RC4-SHA:ECDHE-RSA-AES128-SHA:DHE-RSA-AES128-SHA:DHE-RSA-CAMELLIA128-SHA:RC4-SHA:HIGH:!aNull:!MD5:!ADH
ssl_bump none localhost
ssl_bump server-first engines
#ssl_bump server-first all
ssl_bump none all
always_direct allow all
sslproxy_cert_error deny all
sslproxy_flags DONT_VERIFY_PEER
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
And the rewrite rule in SquidGuard is:
rewrite engines {
s#.*bing.com/search.*#&\&adlt=strict#i
s#.*bing.com/images.*#&\&adlt=strict#i
s#.*bing.com/videos.*#&\&adlt=strict#i
s#.*au.search.yahoo.com.*#&\&vm=r#i
s#.*duckduckgo.com.*#&\&kp=1#i
s#.*google.com.au.*#1&safe=strict#i
s#.*google.com.*#1&safe=strict#i
s#.*bing.com.*#&\&adlt=strict#i
}
I am pretty sure the squidGuard rewrite rule is fine because if I change the Squid configuration to intercept ALL SSL communication then duckduckgo.com gets enforced.
The question is: what should I enter instead of the following?
acl engines dstdomain .duckduckgo.com
Thanks in advance
I bet the above does not work with SquidGuard after June 23, 2015
"On 23 June 2015 the Google search services will move all search results behind SSL encryption. This means that all search results will then be served using 'https', with the secure padlock shown in web browsers."
Many schools and businesses are so pissed off that they are now using:
"'SSL interception' functionality that can intercept and filter Google search results after Google implement their change. This also allows to subsequently address existing issues with other Google services like YouTube that have already moved to SSL."
You can force transparent safe search for Google (HTTP and HTTPS) by setting:
configure
set service dns forwarding options address=/.google.com/216.239.38.120
commit
save
DONE !!!! It works.
Extra bonus: if you want to block all access to Ask, Bing, DuckDuckGo and other domains, use:
configure
set service dns forwarding options address=/.bing.com/216.239.38.120
set service dns forwarding options address=/.ask.com/216.239.38.120
set service dns forwarding options address=/.duckduckgo.com/216.239.38.120
commit
save
This blocks the Bing, Ask and DuckDuckGo domains over both HTTP and HTTPS.
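(Side note: the set service dns forwarding options syntax above is EdgeOS/VyOS, which simply passes the option through to dnsmasq. On a box running plain dnsmasq the same effect should be possible with lines like these in dnsmasq.conf; 216.239.38.120 is the address Google publishes for forcesafesearch.google.com.)
# force Google SafeSearch by answering google.com lookups with the forcesafesearch address
address=/google.com/216.239.38.120
# as in the extra bonus above, pointing another engine at the same address effectively blocks it
address=/duckduckgo.com/216.239.38.120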
This is a little over a year after the fact, but I found this thread trying to solve this exact problem myself, so here goes.
In your squid config, you have:
acl engines dstdomain .yahoo.com
acl engines dstdomain .duckduckgo.com
acl engines dstdomain .google.com
acl engines dstdomain .bing.com
But that matches any subdomain beneath duckduckgo.com (e.g. www.duckduckgo.com, search.duckduckgo.com), not duckduckgo.com itself.
When I do a DDG search, it's just using https://duckduckgo.com/$search_string, like so:
[screenshot: example duckduckgo search]
So in short, your explicit ssl-bump engines ACL is not matching duckduckgo.com because it expects subdomains rather than the bare domain. When you change your config to bump everything, it obviously catches it, because it catches everything.
If you exchange this line:
acl engines dstdomain .duckduckgo.com
for this line:
acl engines dstdomain duckduckgo.com
it'll work.
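A quick way to confirm that the engines ACL now matches (assuming the default log location) is to run a DuckDuckGo search from a client and watch Squid's access log:
tail -f /var/log/squid/access.log | grep duckduckgo
If the bump is working you should see full duckduckgo.com URLs in the log rather than only opaque CONNECT/tunnel entries.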

Restrict Squid access to only one site

How do I restrict access to only one website through my Squid proxy?
The following doesn't work...
acl amazon_ireland src 79.125.0.0/17
acl some_site url_regex google
http_access allow amazon_ireland
http_access allow some_site
http_access deny all
Have a look at the Squid FAQ; there is a perfect example for your setup.
acl GOOD dst 10.0.0.1
http_access allow GOOD
http_access deny all
If you want to match by domain name:
acl GOOD dstdomain .amazon.ie
http_access allow GOOD
http_access deny all
Here is another way, where you can specify the source IP address and block specific sites such as YouTube and Facebook:
acl liberaip src 130.44.0.215
acl block url_regex -i youtube facebook
http_access allow liberaip !block
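For completeness, a minimal self-contained sketch of the one-client, one-site case (203.0.113.10 and example.com are placeholders). The explicit deny at the end matters, because when nothing matches Squid applies the opposite of the last http_access rule:
acl myclient src 203.0.113.10
acl onesite dstdomain .example.com
http_access allow myclient onesite
http_access deny all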
