Firewall Wizards mailing list archives

Re: recommendations for URL filtering


From: Antonomasia <ant () notatla demon co uk>
Date: Wed, 26 Jan 2000 12:01:28 GMT

I asked for suggestions on restricting and logging web access and
identifying users.  I speculated that the redirection feature of
squid would be a good hook to hang this on.  Thanks to all.
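
For anyone following along, the redirector hook I had in mind is the
squid 2.x redirect_program interface: squid feeds the helper one request
per line on stdin (URL, client IP/FQDN, ident name, method) and expects
back either a replacement URL or a blank line meaning "leave it alone".
A rough sketch in Python - the block page, the patterns and the logging
are only illustrative, not anyone's actual policy:

#!/usr/bin/env python
# Rough sketch of a squid redirect_program helper (squid 2.x interface).
# Input, one request per line:   URL client_ip/fqdn ident method
# Output, one line per request:  a replacement URL, or blank for "no change".
# The block page and patterns below are made up for illustration.
import re
import sys

BLOCK_PAGE = "http://intranet.example/blocked.html"
PATTERNS = [re.compile(p, re.I) for p in (r"doubleclick\.net", r"/adverts?/")]

for line in sys.stdin:
    parts = line.split()
    if not parts:
        sys.stdout.write("\n")                 # odd input: pass through
        sys.stdout.flush()
        continue
    url = parts[0]
    ident = parts[2] if len(parts) > 2 else "-"
    if any(p.search(url) for p in PATTERNS):
        sys.stderr.write("DENY %s %s\n" % (ident, url))  # crude audit trail
        sys.stdout.write(BLOCK_PAGE + "\n")
    else:
        sys.stdout.write("\n")                 # blank line = original URL
    sys.stdout.flush()                         # squid expects prompt answers

The helper gets wired in with a redirect_program line in squid.conf.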


Philip Plane <philip ra.tepapa.govt.nz>
Squid lets you do this easily, even without an external redirector program.
It lets you filter URLs based on regular expressions.
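
For the record, the built-in route Philip mentions is a url_regex ACL;
a minimal squid.conf fragment along these lines (the ACL names, the
pattern file and the internal network range are assumptions, not his
actual config):

    acl ournet  src       10.0.0.0/255.0.0.0
    acl badurls url_regex -i "/etc/squid/banned.regex"
    http_access deny  badurls
    http_access allow ournet
    http_access deny  all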


"Crumrine, Gary L" <CrumrineGL state.gov> wants other people maintaining
his blacklist and writes:
                     ... Your only option once they decide to implement is
to use a service.  AXENT's Raptor uses a product called WebNOT.  There are
others out there as well. ...


spiff <spiff bway.net>
try SquidGuard
http://info.ost.eltele.no/freeware/squidGuard/

This product does what I had in mind - it looks pretty comprehensive
from the docs, which also mention "our free ident daemon for M$ Windoze"
and recommend squid-2.1.PATCH2 if RFC931 is used.
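
Since RFC931 keeps coming up: ident is a small TCP protocol on port 113 -
the proxy connects back to the client machine, names the port pair of the
existing connection, and gets a user name in return.  A rough Python sketch
of the lookup (the addresses, ports and timeout are illustrative):

import socket

def ident_lookup(client_ip, client_port, local_port, timeout=5.0):
    # Ask the client's identd who owns the TCP connection
    # client_ip:client_port -> us:local_port.  Returns a name or None.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    try:
        s.connect((client_ip, 113))
        s.send(("%d, %d\r\n" % (client_port, local_port)).encode())
        reply = s.recv(1024).decode("ascii", "replace")
    finally:
        s.close()
    fields = [f.strip() for f in reply.split(":")]
    # Expected reply: "ports : USERID : opsys : username"
    if len(fields) >= 4 and fields[1].upper() == "USERID":
        return fields[3]
    return None

# e.g. ident_lookup("10.1.2.3", 34712, 3128) might return "jbloggs"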


Karl Greenwood <Karl pt-services.co.uk>
If you use MS Proxy before hitting the firewall, you can use NT security
to allow access to the various protocols; the proxy will also deny access to
any sites you specify...

Tie down the Internet browser by using a central config file, so you lock out
scripts, Java or whatever on their browsers and force them to go through the
proxy.

The proxy will also log all Internet accesses by the internal workstation's
IP address; if auditing of successful logons is enabled, you can then work out
which user was using that machine at that time.

It turns out that they want real-time user identification, with the rule
list depending on which user it is.
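
If the redirector route is taken, the per-user part could hang off the
ident field squid passes as the third item on each stdin line, so the
helper picks a rule set per user.  A rough sketch - the groups, the
membership table and the block page are invented for illustration:

import re
import sys

BLOCK_PAGE = "http://intranet.example/blocked.html"
RULES = {                                   # group -> forbidden URL patterns
    "default":  [re.compile(r".")],         # unknown users: deny everything
    "staff":    [re.compile(r"doubleclick\.net", re.I)],
    "managers": [],                         # no restrictions
}
GROUP_OF = {"jbloggs": "staff", "boss": "managers"}   # ident name -> group

def decide(url, ident):
    rules = RULES.get(GROUP_OF.get(ident, ""), RULES["default"])
    return BLOCK_PAGE if any(p.search(url) for p in rules) else ""

for line in sys.stdin:
    parts = line.split()
    answer = decide(parts[0], parts[2]) if len(parts) >= 3 else ""
    sys.stdout.write(answer + "\n")         # blank line = pass through
    sys.stdout.flush()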


"Moore, James" <James.Moore msfc.nasa.gov>
You might take a look at Netscape's Proxy Server... and Raptor's solution
was good the last time I used it.

You didn't say what kind of filtering they wanted (i.e. deny all except ...)

That's because my colleagues hadn't yet returned from their visit to extract
some actual description from the customer.  James also suggests an external
service for maintaining a blacklist.


"O'Shea, Dave" <dave.oshea wilcom.com>
Way Back When, I wrote a hack to the CERN httpd proxy that searched for
various keywords within a URL, and blocked access to them. I seem to recall
that it didn't take a whole lot of work, just a bit of jiggering with the
httpd.config file. 

But this job might be more of a challenge, as different users are to
get different blocking rules.

--
##############################################################
# Antonomasia   ant () notatla demon co uk                      #
# See http://www.notatla.demon.co.uk/                        #
##############################################################


