Educause Security Discussion mailing list archives

Re: Executive IT Security Report


From: David Earley <earleyd () ISC UPENN EDU>
Date: Wed, 4 Feb 2015 20:39:46 +0000

Well said, to both Jim and Brad. I feel it’s easy to generate and report on numbers and statistics, but in the absence 
of context and impact, your stakeholders will be left wondering, “Great, but so what?” Humbly acknowledging your 
strengths, as well as your limitations, can go a long way toward building trust and confidence in your statements 
(and requests) in the future.

I’ll also second the notion of finding a key detail or scenario to home in on, even if it’s just a single statistic or 
event. Metrics provide the necessary foundation for your reports, but your audience is highly unlikely to remember most 
of them 10 minutes after your presentation. From someone who’s been in your shoes (and had to generate that kind of 
report 3 times a week(!)), finding those one or two key takeaways for each report is crucial to keeping your report 
fresh, and more importantly, *relevant* in the minds of those who read it.

Good luck!

Cheers,

David James Earley
Sr. Information Security Analyst
Information Systems & Computing
University of Pennsylvania
Office: 215.573.4070




From: The EDUCAUSE Security Constituent Group Listserv [mailto:SECURITY () LISTSERV EDUCAUSE EDU] On Behalf Of Jim 
Dillon
Sent: Wednesday, February 4, 2015 13:54
To: SECURITY () LISTSERV EDUCAUSE EDU
Subject: Re: [SECURITY] Executive IT Security Report

Dean,

I think Brad’s suggestions are good and I’ll add another angle.  One caveat is that I’d slightly alter his statement on 
“within your direct control or influence” to be “potentially within your direct control or influence.”  Stopping 
external probing of your defenses isn’t likely within your control, but changing network architecture to limit its 
effect could be, even if not something you currently have authority to pursue.  If it is a risk that can be mitigated 
with investment in new procedures, I think that is realistic and relevant.  Pointing out you aren’t currently able to 
address the risk is perfectly viable information for decision-making purposes.  Don’t be self-serving in pointing these 
things out; make sure they are relevant to stated or compulsory objectives!


Maybe this isn’t part of your “monthly” metrics, but a realistic examination of what the metrics mean to significant 
university objectives will certainly garner more attention.

For example, if the metric can demonstrate risk to an institutional “big data” project or funding or research campaign, 
then a little practical commentary (non-metric) will turn metrics into tangible reality.  Many of the folks you want to 
support are decision makers who decide where to invest their funds.  Your ability to tie the security activity to their 
success and demonstrate the impact of security failure on their primary objectives will go a long way towards being 
“meaningful.”  (e.g., metrics showing data integrity concerns over datasets demonstrate clear impact to “big data” 
objectives.  If you show that targeted attacks are occurring against key assets in the big data effort, that is 
meaningful.  If you present it generically, it may not capture the same interest.)

As Brad suggested, typical security FUD may not be meaningful to many of these types of leaders, managers, and 
governors; impact to their prized activities will be.  This is clearly not a metric and requires some analysis and 
first-hand exposition, so maybe you add this as a quarterly objective?

Similarly, real data on real events is often compelling, particularly when you can demonstrate the impact to key 
objectives.  The challenge is presenting it in a realistic and tangible way.  Sorry this doesn’t help a lot in creating 
a simple metrics-based report, but it is something that I think achieves your objective of being “meaningful.”  It does 
require more active engagement, not simply the production of numbers and statistics.

A simple one-line summary of the angle I’m supporting is, “Ensure that your metrics are meaningful to the institutional 
objectives and outcomes that matter most to your decision makers.”  In other words, be part of the outcome 
conversation, not just the “security function” conversation.

Best regards,

Jim Dillon



Jim Dillon, CISA, CISSP
Director of IT Audit Services, CU Internal Audit
University of Colorado
Primary Phone and Messages: 303-735-7028
Grant Street Phone: 303-837-2201
Audit Administration: 303-837-2195
Fax: 303-837-2190
jim.dillon () cu edu<mailto:jim.dillon () cu edu>

From: The EDUCAUSE Security Constituent Group Listserv [mailto:SECURITY () LISTSERV EDUCAUSE EDU] On Behalf Of Brad Judy
Sent: Wednesday, February 04, 2015 9:34 AM
To: SECURITY () LISTSERV EDUCAUSE EDU<mailto:SECURITY () LISTSERV EDUCAUSE EDU>
Subject: Re: [SECURITY] Executive IT Security Report

I don’t have a great dashboard or template for you, but I’ll kick in my two cents on this type of reporting.


1. Any metric must somehow be a meaningful representation of reducing some aspect of institutional risk.  If you 
can’t articulate how a metric makes a meaningful impact on institutional risk in a realistic way (no FUD, security 
theater, etc.), then don’t include it.

2. Any metric must be something within either your direct control or influence.

a. Never report on things that are mostly random numbers, like “number of attacks” – such things ebb and flow and 
don’t demonstrate your team/program’s effectiveness.  While some security reports talk about metrics like “time to 
detect an incident”, such a metric is a hybrid factor combining security monitoring and the sophistication of the 
attackers.  If all attacks were equal, it would be a measure of your security program, but they aren’t.

b. What is a metric that trends in a particular direction when your program/team is successful?  Do they have any 
projects/initiatives that have a good metric for progress towards a goal?

i. Percentage of systems that meet a baseline security standard (within a realistic scope – maybe central IT 
servers or all servers)

ii. Percentage of laptops with whole-disk encryption

iii. Percentage of systems/servers participating in X security effort (DLP, authenticated vulnerability scans, 
centralized logging, etc.)

iv. Number of PII records stored in the ERP system (if you have a goal of risk reduction via data reduction)

v. Percentage of high/critical vulnerabilities patched or mitigated within your standard patch window (may 
require fairly sophisticated config monitoring/management to measure accurately).

Keep the list of metrics short and ensure each has a 2-3 sentence description of how it reflects institutional risk 
reduction in a tangible and realistic way.  Each should also note a goal level to achieve or maintain.
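The percentage-style metrics above, each paired with a goal level, are straightforward to compute from an asset inventory. A minimal sketch (not from this thread; the data structure, field names, and goal values are hypothetical examples):

```python
# Hypothetical inventory: each record flags whether a system meets a given
# security objective. Real data would come from a CMDB or scanning tool.
systems = [
    {"name": "erp-01", "meets_baseline": True,  "central_logging": True},
    {"name": "web-02", "meets_baseline": True,  "central_logging": False},
    {"name": "db-03",  "meets_baseline": False, "central_logging": True},
    {"name": "app-04", "meets_baseline": True,  "central_logging": True},
]

def percentage(systems, field):
    """Percentage of systems where the given boolean field is True."""
    return 100.0 * sum(s[field] for s in systems) / len(systems)

# Each metric pairs a measurement with a goal level to achieve or maintain,
# as Brad suggests.
metrics = [
    ("Meet baseline security standard", percentage(systems, "meets_baseline"), 90.0),
    ("Participate in centralized logging", percentage(systems, "central_logging"), 95.0),
]

for label, value, goal in metrics:
    status = "on target" if value >= goal else "below goal"
    print(f"{label}: {value:.0f}% (goal {goal:.0f}%, {status})")
```

The point of the goal column is that each line answers “so what?” on its own: a reader sees not just the number but whether it is where it should be.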

Brad Judy


From: The EDUCAUSE Security Constituent Group Listserv [mailto:SECURITY () LISTSERV EDUCAUSE EDU] On Behalf Of Dean 
Halter
Sent: Wednesday, February 04, 2015 8:26 AM
To: SECURITY () LISTSERV EDUCAUSE EDU<mailto:SECURITY () LISTSERV EDUCAUSE EDU>
Subject: [SECURITY] Executive IT Security Report

We are being asked to provide our senior management with a meaningful monthly report to demonstrate how we are doing 
currently, and our improvement over time, with respect to IT security.  Have any of you identified a good set of 
metrics you use for this purpose?  If any of you have a report you use for this purpose that you would be willing to 
share, it would be greatly appreciated.

Thanks,
Dean
___________
Dean Halter, CISA, CISSP
IT Risk Management Officer, UDit
University of Dayton

"Security is a process, not a product."  Bruce Schneier

