Bug #585 (open)

Feature #145: Improve Epiphany user's experience

DansGuardian cannot filter Https sites

Added by Jean-Michel Philippe over 11 years ago. Updated over 11 years ago.

Status: New
Priority: Urgent
Assignee:
Category: System
Target version:
Start date: 08/28/2012
Due date: 09/30/2012 (over 11 years late)
% Done: 20%
Estimated time: 3:00 h
Spent time:

Description

Since communications over https are encrypted, it is not possible to analyze the traffic of https sites with DansGuardian in the current configuration. The online documentation mentions this:

http://contentfilter.futuragts.com/wiki/doku.php?id=faq#installation_and_problem_faq

It may be possible to configure Squid as a transparent https proxy, but this is not certain. A possible solution would then be to configure Epiphany to use DansGuardian directly, while the web traffic redirections currently in place would prevent smarter kids from bypassing controls simply by removing the web browser's proxy settings.
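
A rough sketch of that Epiphany-side setting (Epiphany follows the GNOME proxy configuration; localhost:8080 is an assumption for DansGuardian's default listen port, not a value confirmed in this ticket):

# Point the GNOME/Epiphany proxy directly at DansGuardian
gsettings set org.gnome.system.proxy mode 'manual'
gsettings set org.gnome.system.proxy.http host 'localhost'
gsettings set org.gnome.system.proxy.http port 8080
gsettings set org.gnome.system.proxy.https host 'localhost'
gsettings set org.gnome.system.proxy.https port 8080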

Actions #1

Updated by Gérald Kerma over 11 years ago

Installation#6. Can DansGuardian filter the content of https: (encrypted/443/ssl/tls) traffic? Can it even control https: access to websites?
Access control depends on which configuration family you've chosen.

In explicit-proxy environments, DansGuardian uses its configured lists of sites (bannedsitelist, exceptionsitelist, blacklists) to vet connections for both http: and https: traffic (provided the https: traffic goes through DansGuardian). However, the URL path and the content are encrypted, so they cannot be analyzed (or even logged). In other words, …urllist, …regexpurllist, and weighted… do not apply to https: traffic, not even in these environments.

In transparent-intercepting environments, https: traffic doesn't go through DansGuardian at all; DansGuardian doesn't even have access to the website name. (This is a fundamental restriction, not something that can be “fixed” - redirecting port 443 traffic into DansGuardian won't work.) DansGuardian options that control https: filtering (for example “Blanket SSL Block”) have no effect at all in these environments.

(Inability to look inside encrypted traffic is a generic restriction, not something specific to DansGuardian. After all, if some man-in-the-middle could intercept and analyze the traffic [and see your credit card number], it wouldn't really be “secure”, would it? Currently although there are a few commercial products that begin to address this issue, no open source software can scan encrypted content.)

From: http://contentfilter.futuragts.com/wiki/doku.php?id=faq
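
For illustration, the transparent interception mentioned in that quote is typically done with an iptables REDIRECT rule; the eth1 interface and port 8080 below are assumptions, not this project's actual values:

# Typical transparent-intercept rule for plain http: (port 80) traffic.
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-ports 8080
# The equivalent rule for port 443 gives no filtering: the redirected stream
# is already TLS-encrypted, so DansGuardian/Squid cannot parse it.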

Actions #2

Updated by Jean-Michel Philippe over 11 years ago

The next topic on this page seems to be the answer we're looking for:

Installation#6b. Why in techspeak can't https: traffic be transparently redirected through DansGuardian?
Sometimes it doesn't seem to make sense that https: traffic can't be transparently redirected through DansGuardian/Squid. Why should it make any difference whether traffic was sent to some port by the browser or was redirected to that port by an IPtables rule? In fact it does make a difference; here's a detailed inside explanation of why:

The browser behaves in a fundamentally different way when contacting an https: site through a local proxy such as DansGuardian/Squid, because it makes a small, unencrypted request to the local proxy itself, asking the proxy to open up a “tunnel” to the site for it to send encrypted traffic through. But in order to do this, the browser has to know that a local proxy is in use, which means the proxy settings have to be configured directly in the browser. Redirecting port 443 at the firewall level won't work, because DansGuardian/Squid won't understand the already-encrypted traffic which gets redirected to them.
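
To make that tunnel request concrete, this is roughly what the browser sends in the clear to the proxy before any encryption starts (the proxy address 192.168.1.1:8080 is hypothetical):

# Reproduce the browser's unencrypted CONNECT request by hand with netcat;
# everything after the proxy's "200 Connection established" reply is TLS.
printf 'CONNECT www.example.com:443 HTTP/1.1\r\nHost: www.example.com:443\r\n\r\n' | nc 192.168.1.1 8080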

Actions #3

Updated by Jean-Michel Philippe over 11 years ago

It seems https content cannot be analyzed regardless of DansGuardian's configuration. The following page:

http://contentfilter.futuragts.com/wiki/doku.php?id=two_configuration_families

says:

End user configuration may not be entirely automatic [although clever use of proxy auto-detect (PAC, WPAD, etc.) might ameliorate this issue]. Https filtering, while improved, is still quite limited: https: hosts will be vetted by domain name, but the rest of the URL is still opaque and the encrypted content still cannot be scanned. This configuration family enables one of the tightest filtering environments possible at this time with open source software.
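
As an aside on the PAC/WPAD remark in that quote, a minimal proxy auto-config file served as wpad.dat can push the proxy setting to browsers automatically; the web root path and proxy address below are assumptions for illustration only:

# Serve this as http://wpad.<domain>/wpad.dat (web root path is an assumption)
cat > /var/www/wpad.dat <<'EOF'
function FindProxyForURL(url, host) {
    // send everything through the filtering proxy
    return "PROXY 192.168.1.1:8080";
}
EOF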

Actions #4

Updated by Gérald Kerma over 11 years ago

http://wiki.squid-cache.org/Features/SslBump

Squid-in-the-middle decryption and encryption of straight CONNECT and transparently redirected SSL traffic, using configurable CA certificates. While decrypted, the traffic can be analyzed, blocked, or adapted using regular Squid features such as ICAP and eCAP.

Warning: by default, most user agents will warn end-users about a possible man-in-the-middle attack.

WARNING: HTTPS was designed to give users an expectation of privacy and security. Decrypting HTTPS tunnels without user consent or knowledge may violate ethical norms and may be illegal in your jurisdiction. Squid decryption features described here and elsewhere are designed for deployment with user consent or, at the very least, in environments where decryption without consent is legal. These features also illustrate why users should be careful with trusting HTTPS connections and why the weakest link in the chain of HTTPS protections is rather fragile. Decrypting HTTPS tunnels constitutes a man-in-the-middle attack from the overall network security point of view. Attack tools are an equivalent of an atomic bomb in real world: Make sure you understand what you are doing and that your decision makers have enough information to make wise choices.

Actions #5

Updated by Jean-Michel Philippe over 11 years ago

Man-in-the-middle, the only true solution in our modern Internet world :p !!!!

Actions #6

Updated by Gérald Kerma over 11 years ago

  • Assignee set to Gérald Kerma
  • % Done changed from 0 to 20

First we need to create a root CA (myCA.pem) in /usr/local/squid/ssl_cert with:
mkdir -p /usr/local/squid/ssl_cert
cd /usr/local/squid/ssl_cert
# Self-signed CA: key and certificate together in myCA.pem (valid 365 days, no passphrase)
openssl req -new -newkey rsa:1024 -days 365 -nodes -x509 -keyout myCA.pem -out myCA.pem

-> For Firefox: export a DER-encoded copy of the CA to import into the browser's certificate store
openssl x509 -in myCA.pem -outform DER -out myCA.der

http://wiki.squid-cache.org/Features/SslBump

-> For squid3.conf:
http_port 3128 sslBump cert=/usr/local/squid/ssl_cert/myCA.pem

# Bumped requests have relative URLs so Squid has to use reverse proxy or
# accelerator code. By default, that code denies direct forwarding.
# The need for this option may disappear in the future.
always_direct allow all

# Ignore certain certificate errors (very dangerous!)
acl BadSite ssl_error SQUID_X509_V_ERR_DOMAIN_MISMATCH
sslproxy_cert_error allow BadSite
sslproxy_cert_error deny all
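
A possible follow-up (a sketch, not confirmed in this ticket): point DansGuardian's upstream proxy settings at this bumping Squid instance in dansguardian.conf.

# /etc/dansguardian/dansguardian.conf (path may vary by distribution)
proxyip = 127.0.0.1
proxyport = 3128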

---
Then build a custom squid package (with the --enable-ssl flag added to debian/rules) and replace squid with squid3 in the network scripts.
Disabling the squid service and using squid3 instead lets the SSL traffic pass through this local MITM alternative.
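
A rough sketch of that rebuild, assuming a Debian/Ubuntu build host (package names and paths may differ on the actual system):

# Fetch the source and build dependencies, enable SSL support, rebuild.
apt-get source squid3
sudo apt-get build-dep squid3
cd squid3-*/
# add --enable-ssl to the ./configure options in debian/rules, then:
dpkg-buildpackage -us -uc -b
sudo dpkg -i ../squid3_*.deb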

Still need to look into whether SSL filtering can be activated in DansGuardian.
