Security Basics mailing list archives

Re: CISCO ACLs.. Are there lists already out there to protect me from trojans and known bad sites?


From: Chris Davis <davisfactor () gmail com>
Date: Tue, 15 Nov 2005 23:10:38 -0600

On 11/10/05, Dave Bush <hockeystatman () gmail com> wrote:
On 11/9/05, Christopher Carpenter <ccarpenter () dswa net> wrote:
Look at it the other way.  You want to DENY ALL, then ALLOW SOME.  Block
all ports and IPs, and then grant access to the ones you need.

If you ALLOW ALL, DENY SOME you will end up fighting a losing battle
creating ACL after ACL.

I concur with Chris. Cisco best practices are to always deny all and
only allow what you absolutely need in. Won't replace a firewall, but
will at least help.

If you're already blocking everything and only letting in what you
need via your ACL rule set, I'd think the next step would be a
network-based IDS/IPS behind the router to catch and block worm and
virus traffic.

--
Dave Bush <hockeystatman () gmail com>

There are two seasons in my world - Hockey and Construction


First, forgive me for replying to this specific email; I can't seem
to locate any earlier messages in this thread.  Without them, I can
only offer what I think the original author is asking for.

I can't speak to blocking worms or trojans, but you're not going to be
able to block specific websites with router/firewall ACLs alone, since
ACLs match IP addresses and ports, not URLs.  One solution you might
research is setting up an HTTP proxy that lets you filter on specific
URLs.  You could then use one of the ready-made block lists a few
websites offer, along with a collection of your own blocked URLs.

This is one of the resources I use: http://www.mvps.org/winhelp2002/hosts.htm
They provide a HOSTS file designed to be installed on client
workstations to "block" malicious sites.  What I did on my network was
set up a Linux server, install Squid as a non-caching web proxy, block
all outbound ports on the firewall, and allow only the Squid server
outbound to ports 80, 443, and a few others.  Then I wrote a small
script that downloads the HOSTS file from that site, converts it to a
format Squid will recognize, and runs from cron every morning.  It
also sends me an email afterwards with the total number of sites in
the list; at last count it was up to a little over 8,900.  As I said
earlier, I also keep a separate block list for sites I want to block
manually, which isn't affected by the automatic script.
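For anyone curious what that kind of conversion looks like, here is a
minimal sketch of the download-and-convert step, not my actual script.
The file paths, the exact download URL, and the commented-out mail and
reload steps are illustrative assumptions; the demo input is inlined so
the sketch runs standalone:

```shell
#!/bin/sh
# Sketch: turn an MVPS-style HOSTS file into a Squid dstdomain block
# list and report the entry count. Paths and URLs are hypothetical.

HOSTS_FILE="/tmp/mvps-hosts-demo.txt"
BLOCKLIST="/tmp/blocked_domains.txt"

# In production, first fetch the published HOSTS file, e.g.:
#   wget -q -O "$HOSTS_FILE" "http://winhelp2002.mvps.org/hosts.txt"
# Demo input so this sketch runs on its own:
cat > "$HOSTS_FILE" <<'EOF'
127.0.0.1 localhost
0.0.0.0 ads.example.com # banner ads
0.0.0.0 malware.example.net
EOF

# Keep entries that map a hostname to 0.0.0.0 or 127.0.0.1, take the
# hostname field, drop localhost itself, and de-duplicate.
grep -E '^(0\.0\.0\.0|127\.0\.0\.1)[[:space:]]' "$HOSTS_FILE" \
  | awk '{print $2}' \
  | grep -vx 'localhost' \
  | sort -u > "$BLOCKLIST"

# Report the count; in production this would be mailed and Squid told
# to reload, e.g.:
#   echo "..." | mail -s "Block list updated" admin@example.com
#   squid -k reconfigure
COUNT=$(wc -l < "$BLOCKLIST")
echo "Squid block list updated: $COUNT sites"
```

A crontab entry along the lines of `0 6 * * * /usr/local/bin/update-blocklist.sh`
(time and path hypothetical) would handle the daily morning run.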

I even wrote my own custom denied page that matches my company's
Intranet design.
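Wiring the list and the custom page into Squid takes only a few
directives.  This squid.conf fragment is a sketch, with assumed file
and error-page names, not my actual configuration:

```
# Block any request whose destination domain is in the generated list.
acl blocked_sites dstdomain "/etc/squid/blocked_domains.txt"
http_access deny blocked_sites

# Serve a custom error page (placed in Squid's errors directory)
# instead of the stock denied page.
deny_info ERR_BLOCKED_SITE blocked_sites
```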

That might sound like a big hassle, but it's extremely effective.  My
Squid box isn't very beefy and currently proxies roughly 150
workstations and servers.  I use Sarg (linked from the Squid site) to
generate statistics from my Squid logs.

I would be more than happy to share my scripts with anyone if interested.

