Dailydave mailing list archives

Re: w00t 08


From: Charles Miller <cmiller () securityevaluators com>
Date: Sun, 3 Aug 2008 19:46:36 -0500

Yeah, probably the biggest problem, as Adam pointed out, is that in
academia, "workshops" are for all the papers that couldn't get into
"conferences".  So WOOT this year ended up with second-string academic
papers and, on the commercial side, talks that weren't going to be given
at Black Hat.  I mentioned to Tal that it would be cool to have WOOT be
a best-of from the commercial and academic sides.  Papers could be
invited by the program committee, which, besides me, is pretty awesome.
So it would be a re-broadcast of the best talks/papers of the previous
year.  In this way, both sides get to learn from the best work of the
other side.  However, according to Tal, there is no incentive for the
academics to re-present previous work, and in fact they may not be
allowed to do so.  Another good idea ruined by academia ;)

Charlie


On Aug 3, 2008, at 1:51 PM, Adam Shostack wrote:

(Added Tal Garfinkel, who organizes WOOT.)

On Sun, Aug 03, 2008 at 11:57:00AM +0100, nnp wrote:
| On Sun, Aug 3, 2008 at 3:30 AM, root <root_ () fibertel com ar> wrote:
| > Dave Aitel wrote:
| >> These are not the papers you're looking for.
| >> http://www.usenix.org/event/woot08/tech/full_papers/
| >>
| >> Seriously, there's nothing there to scare a network offense
| >> professional. I don't think it's w00t's fault, either. I think the
| >> research communities are diverging into public and private, as this
| >> research gets more expensive to do.
| >>
| >> USENIX may not be the place for academic treatment of offensive
| >> security research. A friend of mine wonders if there's any future
| >> for academic treatment of the subject at all. He wonders wistfully,
| >> of course, since he likes academia.
| >>
| >> Anyways, either be scary or be silly. There's no middle ground here.
| >> It's a fundamental truth in this field: You're either in, or you're out.
| >>
| >> -dave
| >>
| >
| > Commercial security conferences don't have great academic value because
| > they are not peer reviewed (well, not reviewed by academic people), and
| > there are other, much more important academic journals, like IEEE etc.,
| > that in theory don't accept money in exchange for the publication of
| > an article.
|
| I'd like to get everyone else's opinions/experiences with articles from
| so-called 'peer reviewed' journals like IEEE and the rest. I've spent
| the past 8 weeks or so working on a project as a research monkey at my
| uni and spent the first few weeks poring over journals etc. When it
| actually came time for implementation, though, I discovered a huge array
| of problems that had not been mentioned in the articles (and were
| presumably ignored as acceptable sources of error). When I contacted
| the authors requesting to see their software so I could determine if
| they had solutions to the problems, I was either ignored or blown off
| with excuses like "we currently don't have the resources to make that
| available". In my opinion this brings all of their results into
| question when outsiders don't know exactly what sources of error they
| deemed acceptable. If some academics aren't bothering to release their
| software and their results are questionable, then what purpose do they
| serve other than to fill pages in journals?
|
| So my question basically boils down to: how much reviewing actually
| goes on? i.e. Do they run the software? Do they examine code or
| formulae? Or is it just a case of 'well, it looks right'?

Let me answer the question, and then get a little philosophical.  I'll
mention that I've been on the program committee for both WOOT
workshops, because I think what Tal is trying to do is both worthwhile
and very hard.

Reviews vary enormously by reviewer.  In an ideal world, a paper
contains enough information to reproduce results.  A reviewer may
choose to try to do that, or read what's there and critique it.
Either way, I think the papers are (on average) higher quality than
most presentations at hacker cons.   First, they're actual papers in
essay form, rather than slide decks.  Second, the goal of the review
process is to improve the paper.  Does it achieve that? Not always.
Reviews for a workshop like WOOT are faster than reviews for a large
conference like USENIX or Oakland.  This is, if you know the code,
reflected in the name of the venue.  Workshop papers are *expected* to
be lower quality than conference papers.  "That's what workshops are
for."

Papers in journals like Phrack and Uninformed are sometimes equal in
terms of quality, but have very different norms and expectations,
which can make reading them challenging for people outside the
community of practice.  Of course, the same thing applies to people
reading academic papers.  The word "practical" is my favorite pet
peeve.

I think it would be great to create a norm of releasing software and
datasets as part of publication.  I also think it would be great to
have norms of reading the work outside our own narrowly defined
schools of thought.  It's too easy to get a talk at a hacker con that
says exactly what an academic paper says, and vice versa.  I think
WOOT has the potential to help with that.

There's a huge potential for cross-fertilization, but
cross-fertilization requires that people first spend time in a limbo.
Hackers are sort of used to getting paid; academics need publications
that will lead to tenure and grants, and more prestigious conferences
count for more than a workshop.  So everyone's incentives work against a
long-term payoff with low probability.

When I did the Silver Bullet podcast with Gary McGraw, we talked about
how in the 90s, he and Ed Felten and some other folks pushed for
actual software flaws to be acceptable as topics for academic security
papers.  Before that, there was even more math and less applied work.
(Andrew and I talk about this orientation issue in the New School as
well).

Changing both communities is going to take years of work and
dedication.  Ideally, what comes out is stronger for both.  Ideally,
we'll see powerful math and theory applied and getting beyond "just
validate input."  We'll see applied research which is more than "oooh,
look, a buffer overflow."  (Admittedly, both of these are rude
stereotypes.)

If you want to see academics publishing their research, start blogging
with titles like "Academic paper foo is not reproducible."  Submit
short papers to the venues a paper was published in saying "I did X, Y
and Z, and it didn't work like they said.  I contacted them, and they
declined to share their data or software.  So their paper needs to be
fixed, and it's not clear how to do that."  It's hard for a program
committee to reject such a paper if it's done to a reasonable
standard.

This has gotten really long, sorry. To wrap up, I think that bringing
together communities is both expensive and often very worthwhile.
Please understand that what we're trying to do with WOOT will produce
those blank stares for a while, and then, perhaps people will say "I'm
confused.  Why are you doing it that way?"  Finally, I hope, it will
start producing collaborations that do really cool stuff on hard
problems.

Adam

_______________________________________________
Dailydave mailing list
Dailydave () lists immunitysec com
http://lists.immunitysec.com/mailman/listinfo/dailydave
