Squid + squidGuard not enforcing safe search on duckduckgo.com - search

The purpose of this project is to force safe search on major search engines.
I managed to install Squid (version 3.3) and SquidGuard, and configured Squid as a transparent proxy with SSL interception.
I managed to enforce safe search on Google, Yahoo and Bing, but I can't with DuckDuckGo, and I can't find any reasonable explanation (either on my own or on the web).
My squid.conf is:
acl localnet src 192.168.1.0/24 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
acl engines dstdomain .yahoo.com
acl engines dstdomain .duckduckgo.com
acl engines dstdomain .google.com
acl engines dstdomain .bing.com
cache deny all
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost manager
http_access deny manager
log_access allow all
url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
url_rewrite_children 500
http_access allow localnet
http_access allow localhost
http_access deny all
http_port 3129
http_port 3128 intercept
https_port 3130 intercept ssl-bump connection-auth=off generate-host-certificates=on cert=/etc/squid/control.com.au.pem key=/etc/squid/control.com.au.pem cipher=ECDHE-RSA-RC4-SHA:ECDHE-RSA-AES128-SHA:DHE-RSA-AES128-SHA:DHE-RSA-CAMELLIA128-SHA:RC4-SHA:HIGH:!aNull:!MD5:!ADH
ssl_bump none localhost
ssl_bump server-first engines
#ssl_bump server-first all
ssl_bump none all
always_direct allow all
sslproxy_cert_error deny all
sslproxy_flags DONT_VERIFY_PEER
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
And the rewrite rules in SquidGuard are:
rewrite engines {
s#.*bing.com/search.*#&\&adlt=strict#i
s#.*bing.com/images.*#&\&adlt=strict#i
s#.*bing.com/videos.*#&\&adlt=strict#i
s#.*au.search.yahoo.com.*#&\&vm=r#i
s#.*duckduckgo.com.*#&\&kp=1#i
s#.*google.com.au.*#1&safe=strict#i
s#.*google.com.*#1&safe=strict#i
s#.*bing.com.*#&\&adlt=strict#i
}
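To illustrate what these rules do (using a plain query URL as an example), the DuckDuckGo rule appends DDG's strict safe-search parameter, turning
https://duckduckgo.com/?q=test
into
https://duckduckgo.com/?q=test&kp=1
(kp=1 is DuckDuckGo's "safe search on" setting).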
I am pretty sure the squidGuard rewrite rules are fine, because if I change the Squid configuration to intercept ALL SSL communication, safe search does get enforced on duckduckgo.com.
The question is: what should I enter instead of the following?
acl engines dstdomain .duckduckgo.com
Thanks in advance

I bet the above does not work with SquidGuard after June 23, 2015:
"On 23 June 2015 the Google search services will move all search results behind SSL encryption. This means that all search results will then be served using 'https', with the secure padlock shown in web browsers."
Many schools and businesses are so pissed off that they are now using:
"'SSL interception' functionality that can intercept and filter Google search results after Google implement their change. This also allows to subsequently address existing issues with other Google services like YouTube that have already moved to SSL."

You can force transparent safe search for Google (HTTP and HTTPS) by setting:
configure
set service dns forwarding options address=/.google.com/216.239.38.120
commit
save
DONE !!!! It works.
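For what it's worth, this looks like EdgeOS/VyOS syntax; its DNS forwarding service is dnsmasq under the hood, and 216.239.38.120 is the address of Google's forcesafesearch host. On a plain dnsmasq installation the equivalent would be a one-line drop-in (a sketch; the file name is just an assumption):
# /etc/dnsmasq.d/safesearch.conf (hypothetical file name)
# answer all *.google.com queries with the forcesafesearch.google.com address
address=/google.com/216.239.38.120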
Extra bonus:
If you want to block ALL access to Ask, Bing, DuckDuckGo and other domains, use:
configure
set service dns forwarding options address=/.bing.com/216.239.38.120
set service dns forwarding options address=/.ask.com/216.239.38.120
set service dns forwarding options address=/.duckduckgo.com/216.239.38.120
commit
save
This blocks the Bing, Ask and DuckDuckGo domains on both HTTP and HTTPS.

This is a little over a year after the fact, but I found this thread while trying to solve this exact problem myself, so here goes.
In your squid config, you have:
acl engines dstdomain .yahoo.com
acl engines dstdomain .duckduckgo.com
acl engines dstdomain .google.com
acl engines dstdomain .bing.com
But that matches any subdomain beneath duckduckgo.com (e.g. www.duckduckgo.com, search.duckduckgo.com), not duckduckgo.com itself.
When I do a DDG search, it just uses https://duckduckgo.com/$search_string on the bare domain, with no subdomain in front.
So in short, your explicit ssl_bump engines ACL is not matching DuckDuckGo because it expects subdomains, not the bare domain itself. When you change your config to bump everything, it obviously catches it then, because it catches everything.
If you replace this line:
acl engines dstdomain .duckduckgo.com
with this line:
acl engines dstdomain duckduckgo.com
it'll work.
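Putting it together (only the DuckDuckGo line changes), the engines ACL from the original config would then read:
acl engines dstdomain .yahoo.com
acl engines dstdomain duckduckgo.com
acl engines dstdomain .google.com
acl engines dstdomain .bing.com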

Related

subdomain redirect to same backend in haproxy

How can I configure host/path redirection of all subdomains to the same backend server?
For example:
My domain is example.com,
and its subdomains are *.example.com.
I need to redirect *.example.com/abc/ to another backend server.
My frontend ACLs are:
acl host_star hdr(host) -i *.example.com
use_backend back_live if host_star
acl is_node path_beg -i /abc/
use_backend backend_node if host_star is_node
I need abc.example.com/abc/ and xyz.example.com/abc/ to go to the same backend server.
I solved it by using hdr_end(host):
acl host_star hdr_end(host) -i .example.com
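For reference, a minimal sketch of how that ACL slots into a frontend (the backend names are the ones from the question; the frontend name and everything else is assumed). Note that the more specific use_backend rule has to come first, since HAProxy uses the first matching rule:
frontend fe_main
    bind *:80
    # match any subdomain of example.com by host-header suffix
    acl host_star hdr_end(host) -i .example.com
    # match requests whose path starts with /abc/
    acl is_node path_beg -i /abc/
    # more specific rule first: subdomain plus /abc/ path
    use_backend backend_node if host_star is_node
    use_backend back_live if host_star
    default_backend back_live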

How can I block all IPs but allow one server IP in .htaccess

I'm trying to deny all requests sent to a website, but allow only two IP addresses.
I've learned this should be done with .htaccess.
Basically there are three machines involved: the Website Server, the Form-handling Server and my own network IP.
Let's assign the following IP addresses:
Website Server: 1.1.1.1
Form-handling Server: 2.2.2.2
Own Network: 3.3.3.3
The .htaccess is placed in the public_html directory of the form-handling server (2.2.2.2).
Now, this works:
order deny,allow
deny from all
allow from 3.3.3.3
The Form-handling Server is accessible from my own browser, but the form POST request sent from the website is blocked (which is good, in this case).
But when I change the .htaccess to the following, the form POST request is still blocked:
order deny,allow
deny from all
allow from 1.1.1.1
allow from 3.3.3.3
To make sure this .htaccess is functional I tried:
order deny,allow
deny from all
allow from 1.1.1.1
Now I cannot reach the Form-handling Server, which proves the .htaccess is 'running' (also, the Website Server still cannot access it, though).
How can I make sure the Website Server (and preferably me as well) has access to the Form-handling Server, but no other visitor or server does?
Worth knowing: When I delete these lines from my .htaccess, the connection between the Website and Form-handling server works beautifully.
I am pretty sure your .htaccess is OK. Most likely your web server connects to the form server from a different IP, i.e. the source address on the internal LAN between your web server and your form server is not the public 1.1.1.1.
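A quick way to confirm this is to check the form server's access log for the blocked POST and see which source address actually shows up, then allow that address as well. A sketch in the same Apache 2.2 syntax as above (10.0.0.1 is a purely hypothetical internal address, not something from the question):
order deny,allow
deny from all
# your own network
allow from 3.3.3.3
# public address of the Website Server
allow from 1.1.1.1
# hypothetical internal LAN address the Website Server actually connects from
allow from 10.0.0.1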

Web Server Setup Troubleshooting

I am attempting to set up a web server and I'm not sure how to proceed with troubleshooting. Here's where I'm at:
I installed Apache.
I pointed my domain to afraid.org (a Dynamic DNS service) and installed their software on the server.
Enabled port forwarding on my router for HTTP (80), VNC (5500) and SSH (22), and enabled DMZ host for the server. My router is a Westell 7500.
The site is reachable on the local network using both the server's IP and my domain, which I believe indicates the Dynamic DNS is working. However, I cannot access the website from another network.
Here are the contents of my ports.conf file:
NameVirtualHost *:80
Listen 80
<IfModule mod_ssl.c>
# If you add NameVirtualHost *:443 here, you will also have to change
# the VirtualHost statement in /etc/apache2/sites-available/default-ssl
# to <VirtualHost *:443>
# Server Name Indication for SSL named virtual hosts is currently not
# supported by MSIE on Windows XP.
Listen 443
</IfModule>
<IfModule mod_gnutls.c>
Listen 443
</IfModule>
Check the configuration of iptables, and also check whether the virtual server is properly configured to accept connections from outside.
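As a concrete starting point for the iptables side, a minimal sketch (assuming the default filter table and that nothing else is managing the rules):
# list the current INPUT rules with packet counters to see what is dropping traffic
iptables -L INPUT -n -v
# temporarily allow inbound HTTP while testing
iptables -I INPUT -p tcp --dport 80 -j ACCEPT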

Configure Squid on CentOS to only allow one IP

I set up Squid on CentOS 6.4 using a guide I found through Google. I am using a VPS and connect to it from my home computer to browse anonymously and to connect to an FTP server for work. It is working fine; however, as of right now anyone can connect to the proxy. How do I limit it to only allow my home IP?
Here is my config:
#
# Recommended minimum configuration:
#
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager
http_access allow all
http_access allow localnet
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
# And finally deny all other access to this proxy
http_access deny all
# Squid normally listens to port 3128
http_port 0.0.0.0:3128
# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?
# Uncomment and adjust the following to add a disk cache directory.
#cache_dir ufs /var/spool/squid 100 16 256
# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid
# Add any of your own refresh_pattern entries above these.
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
via off
forwarded_for off
request_header_access Allow allow all
request_header_access Authorization allow all
request_header_access WWW-Authenticate allow all
request_header_access Proxy-Authorization allow all
request_header_access Proxy-Authenticate allow all
request_header_access Cache-Control allow all
request_header_access Content-Encoding allow all
request_header_access Content-Length allow all
request_header_access Content-Type allow all
request_header_access Date allow all
request_header_access Expires allow all
request_header_access Host allow all
request_header_access If-Modified-Since allow all
request_header_access Last-Modified allow all
request_header_access Location allow all
request_header_access Pragma allow all
request_header_access Accept allow all
request_header_access Accept-Charset allow all
request_header_access Accept-Encoding allow all
request_header_access Accept-Language allow all
request_header_access Content-Language allow all
request_header_access Mime-Version allow all
request_header_access Retry-After allow all
request_header_access Title allow all
request_header_access Connection allow all
request_header_access Proxy-Connection allow all
request_header_access User-Agent allow all
request_header_access Cookie allow all
request_header_access All deny all
tcp_outgoing_address public_ip
In the default configuration, which yours is partly based on, there is an ACL called localnet. You could modify the localnet ACL to include only the source address you are connecting from instead of the defaults, or you could create your own ACL for your source address. You then need to use it in the http_access rules that grant access.
I see the following http_access rules in your configuration:
http_access allow manager localhost #1
http_access deny manager #2
http_access allow all #3
http_access allow localnet #4
http_access deny !Safe_ports #5
http_access deny CONNECT !SSL_ports #6
http_access allow localnet #7
http_access allow localhost #8
http_access deny all #9
Rules 4 and 7 are redundant. Rule 3 could be removed if your source address matched localnet; it also defeats the protections in 5 and 6, because an "allow all" placed before them means they are never reached. I propose the following, where the localnet ACL has been modified to contain only your source address:
acl localnet src <source_ip>
http_access allow manager localhost #1
http_access deny manager #2
http_access deny !Safe_ports #3
http_access deny CONNECT !SSL_ports #4
http_access allow localnet #5
http_access allow localhost #6
http_access deny all #7
I think that would do the trick.

Restrict Squid access to only one site

How do I restrict access to only one website through my Squid proxy?
The following doesn't work:
acl amazon_ireland src 79.125.0.0/17
acl some_site url_regex google
http_access allow amazon_ireland
http_access allow some_site
http_access deny all
Have a look at the Squid FAQ; there is a perfect example for your setup there:
acl GOOD dst 10.0.0.1
http_access allow GOOD
http_access deny all
If you want to match by domain name:
acl GOOD dstdomain .amazon.ie
http_access allow GOOD
http_access deny all
Here is another way, where you can specify the source IP address and block specific sites such as YouTube and Facebook:
acl liberaip src 130.44.0.215
acl block url_regex -i youtube facebook
http_access allow liberaip !block
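One thing worth adding to this last variant: Squid's implicit default is the opposite of the last http_access rule, so it is good practice to finish with an explicit catch-all deny. A sketch:
acl liberaip src 130.44.0.215
acl block url_regex -i youtube facebook
http_access allow liberaip !block
http_access deny all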
