Educause Security Discussion mailing list archives

Re: Vulnerability Scanning Problem


From: Mike Wiseman <mike.wiseman () UTORONTO CA>
Date: Wed, 13 Dec 2006 16:13:31 -0500

We've been using NAC and vulnerability scanning (Nessus, nmap) together for about two
years - scanning is done on all subnets, while NAC is used for residence and wireless users
only. The perception now is that NAC has done its job - network operations reports more
exploited desktops in the administrative and academic (IT-managed) subnets than in the
NAC-covered environments. Part of the problem on the managed networks is the large
number of Win2K desktops still in service: patch maintenance falls behind for some reason,
and one day we find two to three hundred machines scanning the environment. This is a
situation where the vulnerability scan helps - we get a quick list of vulnerable Win2K IPs
and pass it on to the department admin.
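For what it's worth, pulling that list out of a scan can be as simple as filtering nmap's grepable output on the OS fingerprint. A rough sketch (the exact OS string depends on nmap's fingerprint database, and the scan itself would be something like nmap -O -oG scan.gnmap against the subnet):

```shell
# Sketch: given nmap grepable output (nmap -O -oG scan.gnmap <subnet>),
# print the IPs of hosts fingerprinted as Windows 2000.
# In -oG output each host line starts with "Host: <ip> (<name>)".
win2k_ips() {
    grep 'OS: Microsoft Windows 2000' "$1" | awk '{ print $2 }'
}
```

The resulting IP list is what goes to the department admin.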

We also believe that the threat of exploits triggered by careless web
browsing as an unrestricted user is significant enough to make the patch/AV status
checking by NAC important. This may also contribute to the higher exploit rate on the
admin subnets. We continue to be surprised by the number of users subject to the NAC test
who have fallen far behind in patching - even in these days of Security Center
notifications, Auto Updates, etc.

A quick plug for our NAC - we developed it in-house around the Microsoft MBSA tool,
NetReg, and the WMI tools. Have a look at:

http://www.utoronto.ca/security/UTORprotect/ESP/index.htm
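The core of the patch-status check is nothing exotic - conceptually it's just diffing the hotfix list a host reports (e.g. via wmic qfe get HotFixID, or what MBSA enumerates) against the list you require. A crude sketch of that comparison, not our actual ESP code:

```shell
# Sketch: flag required hotfixes missing from a host.
# $1 = installed hotfix IDs (one per line, e.g. from "wmic qfe get HotFixID")
# $2 = your own required-baseline list, one ID per line.
# comm -13 prints lines that appear only in the second (required) list.
missing_hotfixes() {
    comm -13 <(sort "$1") <(sort "$2")
}
```

An empty result means the host passes the check; anything printed is a patch to chase down.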

Mike


Mike Wiseman
Computing and Networking Services
University of Toronto


Thanks for your thoughts, Russell. Nessus, Retina, etc. seem useful for
known vulns in network-facing systems, and this is important. However, I
am increasingly concerned about client-side vulns, and without
credentials to a system or an agent, how do you easily test for those?
(NAC/agent technologies are one possible solution.) On the cheap, a SPAN
port with a passive fingerprinter might give *some* value, but I'm more
interested in being able to do something similar to what the new
Secunia Software Inspector does. (I'm not affiliated with Secunia.)

http://secunia.com/software_inspector/

I don't like the idea of having common authentication credentials on an
array of systems for deeper host checks by a network assessment service
(risk of cracking and/or interception), but it would be really helpful
for something like the Secunia app to be easily scalable across a large
and rapidly changing .edu environment.

The Secunia app, by the way, is helpful in that it clearly enumerates a variety of
client-side apps such as Flash, QuickTime, RealPlayer, and the Java runtime,
along with the various MS and Office checks. Some of the apps
it found on systems I ran it on could not be easily
uninstalled (some versions of the Flash OCX, for instance, required
tweaking of NTFS permissions, even as Administrator, to be
removed). There are also the various instances of the Java runtime that do not
uninstall when you upgrade, leaving older versions lying around that
could potentially be leveraged for an attack (I saw a paper or something
on this topic once, using a hostile applet to exploit an older version
of the JRE).
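The hard part of doing this yourself is the inventory (registry keys, file versions); the version comparison itself is easy. A rough sketch of flagging an installed version as older than a known-patched one, using GNU sort -V (the version numbers here are just illustrative):

```shell
# Sketch: succeed (exit 0) if the installed version is strictly older
# than the required one. GNU sort -V does the version-aware ordering;
# discovering the installed version is left to the inventory step.
is_outdated() {
    installed=$1; required=$2
    [ "$installed" != "$required" ] &&
        [ "$(printf '%s\n%s\n' "$installed" "$required" | sort -V | head -n1)" = "$installed" ]
}
```

Run against an inventory of Flash/QuickTime/JRE versions, anything that tests outdated is a candidate for the kind of cleanup described above.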

With all of the various 0-days floating around, and the average .edu
end-user situation, I think more needs to be done to beef up client-side
security. I know there are vendor solutions out there for this, but I
always like to leverage lower-cost options when possible.
