Interesting People mailing list archives

IP: From Dyson -- Re: Summit to Discuss Global System for Rating Internet Content


From: David Farber <farber () cis upenn edu>
Date: Fri, 10 Sep 1999 08:01:43 -0400



To: David Farber <farber () cis upenn edu>
From: edyson () edventure com (Esther Dyson)


My response (which you can post to IP if you want). I'm speaking here;
Teresa Peters is also here.

Esther

Response to Bertelsmann Foundation memo

This response is my personal reaction to the Bertelsmann Foundation
Memorandum on "Self-Regulation on the Internet."  I applaud the Foundation's
attempts to deal with a tough issue, and this response is intended
helpfully, not destructively.  Like it or not, many forces want to "regulate
the Net," whatever that means, and "self" regulation is probably better than
regulation by some "non-self" authority.  However, it's not clear who the
"self" is to be in this case……

Overall, the document leaves me feeling distinctly queasy.  So much of it
defers details for implementation later.  Such and such must be done: Who
will do it?  Illegal content: Sure, we're all against illegal content, but
who decides what is illegal? Too many questions are left open, to be
answered by some "legitimate authority" later on.

The basic problem is that the group is attempting to come up with a global
solution, top-down.  But the world is a collection of sometimes interacting
communities, not a single global entity to be administered from the top. In
some spheres there is a need for coordination and collaboration, but that
does not mean the Net must be governed globally.

Of course, the idea is that the system for Net content regulation would be
run by well-meaning, enlightened individuals who know what is best for
everyone. But what happened to the notion that people know what is best for
themselves and their children? What happened to regulation by citizens
themselves of the content they choose for themselves or their children,
rather than regulation by a "self" of industry entities beholden to their
governments?

The document proposes the creation of a full, broadly integrated set of
institutions that can "protect" us all from the problem of illegal content.
I fear that we will end up with a worldwide bureaucracy always forced to
take the "safe" route, calling for the removal of questionable content.
ISPs are properly relieved of responsibility for actions against their
customers; let the worldwide content-rating system take the heat. It will
take the heat, and dismiss it, because after all it is protecting the
public, and a few mistakes here and there are inevitable.

Now let me consider some details.

Illegal content


What is illegal content? Throughout the document, the writers (not named)
refer to "illegal content such as child pornography."  If there is any
content other than child pornography that they think should be illegal, the
authors should have the courage to specify what they mean. They make one
broader reference: "...racist and discriminatory web sites, child pornography
material exchanged in certain newsgroups and chatrooms and 'how to'-guides
on terrorist activity are too disturbing to ignore. Mechanisms have to be
developed to deal with illegal content, to protect children online as well
as free speech."  But that is all.

Later, there is some expectation that the "illegality" of content will be
determined in the home territory of the publishing Website, and that the
content will be taken down in accordance with that territory's laws - and
presumably by its law-enforcement officials.  But in the world of the
Internet, with mirror sites, anonymous e-mail and the like, this may not be
feasible - fortunately!


The proposed rating system

The proposed rating system, with its three layers, is nicely designed.  The
idea is to encourage sites to rate themselves, using some common vocabulary,
and then to encourage second parties to create rating "templates" that
combine the various metrics of that vocabulary to reflect their values.
Finally, a third set of raters should make specific whitelists of acceptable
sites - acceptable to children, mostly - beyond the more abstract criteria
of the second layer. In theory, that neatly eliminates the value issue from
self-rating.
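
To make the three layers concrete, here is a small, purely hypothetical
sketch (mine, not the memo's) of how a filter might combine them. The site
names, vocabulary terms and numeric levels are all invented for
illustration; the memo itself specifies no such code or values.

# Illustrative sketch only: one way the three rating layers could compose.

# Layer 1: sites describe themselves using a shared, value-neutral vocabulary.
SELF_RATINGS = {
    "news.example.com":  {"violence": 1, "nudity": 0, "language": 2},
    "games.example.net": {"violence": 3, "nudity": 0, "language": 1},
}

# Layer 2: a second party publishes a "template" that turns the raw
# vocabulary into a value judgment (here: the maximum levels one family allows).
FAMILY_TEMPLATE = {"violence": 2, "nudity": 0, "language": 2}

# Layer 3: a third party maintains an explicit whitelist of sites judged
# acceptable, beyond the abstract criteria of the second layer.
WHITELIST = {"kids.example.org"}

def allowed(site: str, block_unrated: bool = False) -> bool:
    """Decide whether a browser/filter would show the site."""
    if site in WHITELIST:
        return True
    rating = SELF_RATINGS.get(site)
    if rating is None:                      # unrated sites need not be excluded
        return not block_unrated
    return all(rating[k] <= FAMILY_TEMPLATE[k] for k in FAMILY_TEMPLATE)

print(allowed("news.example.com"))   # True  (within the template's limits)
print(allowed("games.example.net"))  # False (violence exceeds the limit)
print(allowed("kids.example.org"))   # True  (whitelisted)
print(allowed("unknown.example"))    # True  (unrated, not automatically excluded)

The point of the sketch is simply that the value judgments live in the
second and third layers, not in the vocabulary itself.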

However... First, in the more detailed rating section, the authors propose
that the vocabulary be created by an international group of experts of high
integrity: "In addition to experts on civil liberties and Internet policy,
the board should include social scientists who can advise about what kinds
of content are more and less harmful to children."  What are these social
scientists doing defining a value-free vocabulary? Surely they belong only
in the second layer...

Second, a global vocabulary is inherently limiting and too constrained.
It's a matter of emphasis, but the value is in the third layer, where people
make editorial choices. Otherwise, where's the appreciation of quality, of a
sense of humor, of respect for the truth?  Surely children need to be protected
from bland junk as well as from trashy or harmful junk.

Moreover, the focus on protecting children seems excessive. Perhaps it is
this focus that makes the idea of almost universal filtering politically
palatable. Surely people will have other motivations for filtering, but they
might not want to use a filtering system as blunt as this one.  Personally,
I'd like to see a rating system for truthfulness, for disclosure of
advertising relationships, for bias, for political leaning, for assumed
audience.  (Is the site for techies or for consumers?)

At least there's a provision that unrated sites would not automatically be
excluded by most filters.

Child pornography vs. children viewing pornography

The report seems to gloss over this distinction. Child pornography is a
legal term that connotes the use of children in making pornography, which is
(almost) universally illegal. It generally involves the abuse of actual
children, and such content on the Net is evidence of that abuse. That is
quite different from the viewing of pornography (on or off the Net) by
children - almost certainly harmful in excess (like almost anything in
excess), but a separate matter altogether.

Privacy issues

I also have some concerns over the report's attitude to privacy protection -
and implicitly, to anonymity. It is important to catch criminals, but we
need to maintain a balance among society's various needs.  There is a
suggestion that the Internet industry (broadly defined) should "tak[e] all
commercially reasonable steps to verify the identity of subscribers, while
protecting subscribers' privacy."  That does not seem to be necessary:
Abusers can be shut off without their identities being known; persistent
abusers will eventually become identifiable.

Constructive criticism

So what are alternative, positive approaches?  First of all, private groups
such as the Bertelsmann Foundation are doing the right thing by getting
involved in this debate.  They should raise people's awareness of the issues
and encourage them to think for themselves - and to pick content for
themselves and to offer content-rating services or choices to others.
Bertelsmann should encourage private groups and companies to develop and
promote rating services, not just for porn or violence, but for quality,
advertising disclosure, data-collection-and-use practices, and the like.
These services, like many services designed to "solve problems," are a huge
business opportunity.

Bertelsmann should also encourage widespread consumer-education campaigns,
led not just by foundations and governments, but also by companies (known as
"advertising").    Just as consumers look for price, nutritional
information, fabric content, care instructions, warranties and and other
information on products, so should they be encouraged to look for similar
meta-information on Websites.

In short, let's look at the role that informed, empowered citizens can play
in keeping the Net a place they want to live in.





At 06:38 pm 09/09/1999 -0400, David Farber wrote:

September 9, 1999

Summit to Discuss Global System for Rating Internet Content

Esther Dyson                   Always make new mistakes!
chairman, EDventure Holdings
interim chairman, Internet Corp. for Assigned Names & Numbers
edyson () edventure com
1 (212) 924-8800
1 (212) 924-0240 fax
104 Fifth Avenue (between 15th and 16th Streets; 20th floor)
New York, NY 10011 USA
http://www.edventure.com                    http://www.icann.org

High-Tech Forum in Europe:  24 to 26 October 1999, Budapest
PC Forum: March 12 to 15, 2000, Scottsdale (Phoenix), Arizona
Book:  "Release 2.0: A design for living in the digital age"

