Firewall Wizards mailing list archives

RE: recommendations for URL filtering


From: "O'Shea, Dave" <dave.oshea () wilcom com>
Date: Tue, 25 Jan 2000 21:13:36 -0600

Way Back When, I wrote a hack to the CERN httpd proxy that searched for
various keywords within a URL and blocked access to any that matched. I
seem to recall that it didn't take a whole lot of work, just a bit of
jiggering with the httpd.config file.
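
A sketch of what that kind of rule-file jiggering might look like, assuming
the CERN/W3C httpd `Fail` and `Pass` rule directives; the keyword patterns
themselves are made-up examples, not the ones from the original hack:

```
# Refuse any URL containing an unwanted keyword (patterns are
# hypothetical examples, matched against the full request URL):
Fail    *forbidden-word*
Fail    *banned-site.example*

# Proxy everything else as usual:
Pass    http:*
```

Since `Fail` rules are checked in order before the `Pass` rule, any URL
matching a keyword pattern is refused before it ever reaches the proxying
step.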
 

 -----Original Message-----
From:   Antonomasia [mailto:ant () notatla demon co uk] 
Sent:   Sunday, January 23, 2000 6:22 PM
To:     firewall-wizards () nfr net
Subject:        recommendations for URL filtering


I have been asked to look at ways of filtering URLs for people browsing
from a business site.  (FW-1 is used there, but they seem to have
discounted that option.  I see from a post on this list from 1997 that
one reader was using it for this with partial success.  I know Raptor has
scope for this, but it is not used here.)

To my mind this looks like an easy extension to squid, particularly if
you are using an external redirector program.  The intended form of blocking
has not yet been described to me - whether it is to be based on a whitelist
of domain names, on filename extensions, on MIME types, or on something else.
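
The redirector route can be sketched in a few lines. The following assumes
squid's classic redirector protocol (one request per line on stdin, formatted
`URL client_ip/fqdn ident method`; the reply is a replacement URL, or a blank
line to leave the request alone). The keyword list and block-page URL are
hypothetical placeholders:

```python
#!/usr/bin/env python3
"""Minimal sketch of a squid redirector that blocks URLs by keyword."""
import sys

BLOCKED_WORDS = ("gambling", "warez")                 # hypothetical keyword list
BLOCK_PAGE = "http://intranet.example/blocked.html"   # hypothetical block page

def rewrite(request_line: str) -> str:
    """Return the redirector's reply for one request line from squid."""
    url = request_line.split()[0]          # first field is the requested URL
    if any(word in url.lower() for word in BLOCKED_WORDS):
        return BLOCK_PAGE                  # redirect the browser to the block page
    return ""                              # blank reply: pass the URL unchanged

def main() -> None:
    # Squid keeps the redirector running and feeds it one request per line;
    # replies must be flushed promptly or squid is left waiting.
    for line in sys.stdin:
        sys.stdout.write(rewrite(line) + "\n")
        sys.stdout.flush()

# Under squid, main() would be the entry point; it is left uncalled here.
```

Hooking it up would then be a matter of pointing squid's redirect_program
directive at the script. Logging falls out almost for free, since squid's
access log records every request whether or not it was redirected.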

It is also said there will be requirements for logging and for identifying
human browsers on NT.  What does NT have as rough "ident"/"rusers"
equivalents?  Or a means of authenticating users to a proxy in advance?

What can people suggest?  I'll summarise any useful stuff sent off list.

--
##############################################################
# Antonomasia   ant () notatla demon co uk                      #
# See http://www.notatla.demon.co.uk/                        #
##############################################################


