Educause Security Discussion mailing list archives

Re: Effectiveness of CTF Scoreboards


From: Bryce Porter <0000008467c11b85-dmarc-request () LISTSERV EDUCAUSE EDU>
Date: Fri, 22 Mar 2019 10:14:43 -0400

The folks at MAGIC, Inc. might have some thoughts to share:
https://magicinc.org/

They put on a twice-yearly, multi-site CTF competition for educational and
municipal/community sites, with a scoreboard that shows team scores and
site scores.  I believe their next event (CTF007 -
https://magicinc.org/event/capture-the-flag-007) is on April 13th, and
there may be a site close enough for you to attend and observe.
They may also be willing to share their scoreboard technology with you.

We took part in the November CTF event last year, and I observed it to be
very well assembled and managed.  We will likely participate again next
year, but we are skipping the Spring event this year due to other
priorities for our program.

Bryce Porter
Chief Information Security Officer
Information Technology Services
UNC Greensboro


On Fri, Mar 22, 2019 at 9:33 AM Don Murdoch <dmurdoch () regent edu> wrote:

Greetings,



              TL;DR short question: Is there a study one can point to
showing that "CTF-style user interfaces have a positive, measurable impact
on knowledge acquisition and learning permanence"? On a related question,
is anyone aware of a CTF event coming up in the next 3-4 months within a
150-mile radius of the Hampton Roads, VA region, so that I could design a
study and actually measure effectiveness?



              Long Question w/ background:



              I’ve been searching for a study that measures the
effectiveness of a CTF-style scoring system, and specifically the effect of
the CTF UI tool itself on the adult learner. The phrase >> Effectiveness of
CTF Scoreboards site:*.edu << in Google finds some really nice papers that
explain “this is what we did, how we collected data, here is the amazing
infrastructure we built to support asynchronous decentralized competition,
lessons learned, etc.”, but I haven’t seen an answer to the question of
whether the CTF game tool (such as CTFd or the Facebook tool) had a
measurable positive impact on learning – and more importantly, an
improvement in recall of facts and problem-solving process 30 days after
the event.



              I often work with a senior high school student in a local
NJROTC unit, where the AFCEA competition is all the rage, and I have
anecdotally observed that some things he gets, some things he finds hard,
and some things he should have gotten but did not. More importantly, the
coaching I’ve offered him has produced variable results in his ability to
apply a lesson in a CTF. This is a highly anecdotal and error-prone
observation, but it does prompt the question “does the CTF tool and
environment measurably improve performance”, and then “how can we measure
performance 30 days later”?



              I’d say there is certainly anecdotal evidence that people
“like CTFs”, that “CTFs help provide a score”, and that organizations like
SANS have used the NetWars platform to great effect. Having done one
myself, it was enormously satisfying to see that my team got 510 of the 511
points 42 minutes before the next team, that we had fewer people, and that
more of our people departed mid-game. That’s “cool”, and it’s a permanent
memory. However, that “feeling” was not measured for effectiveness in a
longitudinal manner.



              On a related question, is anyone aware of a CTF event coming
up in the next 3-4 months within a 150-mile radius of the Hampton Roads, VA
region, so that I could design a study and actually measure effectiveness?
My initial thought is that you would want to measure fact knowledge ahead
of time, measure emotional and cognitive impact right after the event (so
as not to disrupt the event itself), and then measure knowledge permanence
7 and 30 days later by asking a question using the same UI elements and
measuring the analysis and response time (following the philosophy
expressed in Make It Stick).
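
              As a rough sketch of what that comparison might look like
(purely hypothetical field names and data, just to illustrate the
pre/post/7-day/30-day retention calculation, not any particular tool):

    # Hypothetical sketch: compare pre-event, post-event, 7-day, and 30-day
    # recall scores per participant to estimate knowledge permanence.
    # Field names and scores are illustrative only.
    from statistics import mean

    results = [
        {"id": "p01", "pre": 40, "post": 85, "d7": 75, "d30": 70},
        {"id": "p02", "pre": 55, "post": 90, "d7": 80, "d30": 78},
        {"id": "p03", "pre": 35, "post": 70, "d7": 55, "d30": 50},
    ]

    def retention(r):
        """Fraction of the post-event gain still present at 30 days."""
        gain = r["post"] - r["pre"]
        if gain <= 0:
            return 0.0
        return (r["d30"] - r["pre"]) / gain

    print("Mean 30-day retention of gains:",
          round(mean(retention(r) for r in results), 2))

              The same structure could also hold response times per
question, which would support the analysis-and-response-time comparison
described above.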



Don Murdoch, GSE #99

Assistant Director, Institute for Cybersecurity

Direct: US 757 352 4588

Regent University <https://www.regent.edu/> | *Christian Leadership to
Change the World*

*New Book “Blue Team Handbook: SOC, SIEM, and Threat Hunting Use Cases: A
condensed field guide for the Security Operations team (Volume 2)” is now
Live on Amazon
<https://www.amazon.com/gp/product/1726273989/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1>*
