Interesting People mailing list archives

re A Family's Horror -- and the Role of Google Images


From: Dave Farber <dave () farber net>
Date: Thu, 4 Feb 2010 18:18:13 -0500





Begin forwarded message:

From: Lauren Weinstein <lauren () vortex com>
Date: February 4, 2010 6:14:42 PM EST
To: Dave Farber <dave () farber net>
Subject: Re: [IP] re A Family's Horror -- and the Role of Google Images



Dave,

This is very interesting.

I continue to be mystified by the apparent inability of some readers
to get their heads around the concept I'm trying to explain.  Perhaps
it's a failing in my abilities at exposition, or maybe it's just that
preconceived notions in this area are so strong that they override
the written word.

Nowhere in ( http://lauren.vortex.com/archive/000677.html ) or the
links referenced therein, where I discuss concepts for search engine
dispute resolution, have I called for bans, censorship, DMCA
take-downs, or anything of the sort.

To the contrary, I oppose censorship, and I have long maintained that
the reality of the Internet assures that it is very nearly impossible
to effectively censor content once it has been publicly posted
(whether oppressive governments can employ sufficiently draconian
techniques to force populations into cowed compliance is a related
but different issue).

Search engines -- Google in particular of course -- have enormous
power to determine what information actually is seen and in what
contexts, since they tend to be the de facto portal for so many
people's discovery of Internet sites and data.  (In fact, have you
ever noticed the interesting phenomenon of people entering explicit
site domain names into a Google search field rather than typing them
directly into the browser's Location bar?  Fascinating.)

Search engine algorithms and classification routines are complex and
generally proprietary.  Google uses (so I've heard, anyway) something
on the order of 200 different inputs to help determine search
rankings.  Google is free -- as they should be -- to arbitrarily
change and tune those parameters, and those changes will represent
Google's views of the relative importance of the different inputs in
the overall ranking decisions.
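
To make that concrete, here is a toy sketch in Python (the signal
names and weights below are entirely invented; Google's actual inputs
are proprietary) of how a ranking score might combine many weighted
inputs, and why adding one more input is simply another term in the
sum:

    # Toy illustration only: signal names and weights are invented.
    # A real engine combines hundreds of proprietary inputs.
    RANKING_WEIGHTS = {
        "link_authority": 0.40,  # e.g., quality of inbound links
        "text_relevance": 0.35,  # query/document match
        "freshness":      0.15,  # recency of the content
        "site_quality":   0.10,  # spam heuristics, etc.
    }

    def rank_score(signals):
        """Combine per-document signals into a single ranking score.

        Tuning a weight, or adding a new input, changes relative
        ranking; nothing is removed from the index.
        """
        return sum(weight * signals.get(name, 0.0)
                   for name, weight in RANKING_WEIGHTS.items())

    # Example: score a document described by a few of the signals.
    print(rank_score({"link_authority": 0.9, "text_relevance": 0.7}))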

Fundamentally, my argument is that there are additional inputs that
arguably are worthy of consideration in this process.  Does a stance
against censorship require that innumerable, easily identifiable
photos of an 18-year-old girl's headless corpse be quickly displayed
even when search engine results settings are explicitly set to their
strictest mode?

Similarly, does the ability of false and slanderous materials to rise
to the top of search engine results mean that it's impossible to
devise a system where aggrieved parties could have some similarly
ranked and visible forum to at least contest the information on such
sites?  These are the sorts of issues I've discussed in considerable
depth in the past, so I won't detail them all again now.
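
As a purely illustrative sketch (the field names here are
hypothetical, not any real search engine's API), the core notion is
annotation rather than removal: a disputed result keeps its place,
but carries a pointer to the aggrieved party's response, displayed
with comparable prominence:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SearchResult:
        url: str
        title: str
        score: float
        # Hypothetical field: a link to a verified response from an
        # aggrieved party, shown alongside the original result.
        dispute_url: Optional[str] = None

    def render(result):
        """Render one result, annotating rather than removing it."""
        line = "%s (%s)" % (result.title, result.url)
        if result.dispute_url:
            # The disputed page keeps its ranking; the response
            # simply gains comparable visibility next to it.
            line += "\n    [Disputed; response: %s]" % result.dispute_url
        return line

    print(render(SearchResult("http://example.com/claim",
                              "A contested page", 0.9,
                              "http://example.com/reply")))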

Except to say this:

The usual retorts (not by Google itself, but by some outside
observers) to such suggestions include the idea that ethical
considerations have no role in search results.  Yet Google itself has
implicitly acknowledged the role of ethics in results, as in their special
handling of searches for the word "jew":

http://lauren.vortex.com/archive/000255.html

Google makes other judgments as well, including (quite appropriately)
blacklisting sites that it believes are contaminated with malware.

My assertion is simply that there are a range of other "ethical"
factors that similarly would be appropriate for consideration in
ranking and (for example) SafeSearch classification decisions.
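
For example (again a hypothetical sketch, not Google's actual
SafeSearch logic), such a factor could act as one more gate,
consulted only when a user has explicitly opted into strict
filtering:

    # Hypothetical strict-mode filter; the category names are invented.
    STRICT_MODE_EXCLUDED = {"explicit", "graphic_violence", "malware"}

    def passes_safesearch(categories, strict):
        """Decide whether a result may be shown under the user's settings.

        Strict mode hides results tagged with an excluded category;
        nothing is removed from the index, and non-strict searches
        are unaffected.
        """
        if not strict:
            return True
        return not (categories & STRICT_MODE_EXCLUDED)

    # A result classified as graphic violence is hidden only for
    # users who explicitly chose the strictest setting.
    assert passes_safesearch({"graphic_violence"}, strict=True) is False
    assert passes_safesearch({"graphic_violence"}, strict=False) is True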

Even if this view is accepted as having some merit, the question of
scale comes up immediately.  Would creating a system for dispute
resolutions or other annotation of results in the manners I've
described be practical given the scope of the search universe?

Again, I won't repeat my previous writings on this, but I believe this
problem is entirely solvable -- if the will and resources are put
forth to do so.

One final point.  It is unwise to assume that the status quo is
stable, even if ethical considerations are set aside.  In fact, we
have more evidence every day that governments are prepared to take
often drastic and overbearing steps to try to control the information
flow into their countries -- this group even includes some traditional
democracies of long standing.

My view is that calls for censorship and information bans, which again
I deplore, are best countered by voluntary efforts -- not efforts to
restrict or block information, but rather to help ensure that
information is organized in ways that include the kind of ethical
components that I've described above.

Virtually by definition today, the bulk of that responsibility falls on
search engines in general, and on Google in particular.

And make no mistake about it, I'm convinced that Google can do a
fantastic job of solving these problems, if they choose to do so.

--Lauren--
Lauren Weinstein
lauren () vortex com
Tel: +1 (818) 225-2800
http://www.pfir.org/lauren
Co-Founder, PFIR
  - People For Internet Responsibility - http://www.pfir.org
Co-Founder, NNSquad
  - Network Neutrality Squad - http://www.nnsquad.org
Founder, GCTIP - Global Coalition
  for Transparent Internet Performance - http://www.gctip.org
Founder, PRIVACY Forum - http://www.vortex.com
Member, ACM Committee on Computers and Public Policy
Lauren's Blog: http://lauren.vortex.com
Twitter: https://twitter.com/laurenweinstein


- - -

On 02/04 17:06, Dave Farber wrote:


Begin forwarded message:

From: "Mike Tetreault, CISSP, CSSLP" <z0t5jtc02 () sneakemail com>
Date: February 4, 2010 4:31:46 PM EST
To: dave () farber net
Subject: Re: [IP] re A Family's Horror -- and the Role of Google Images


Honestly, I'm confused by Lauren's comments. People have things they
consider "bad stuff". Different people have different things they
consider "bad stuff". If you want a company to decide what should be
"bad stuff", use a filtering proxy with automatically updated
blacklists. You are proposing that a company make a moral choice
(i.e., what's "good stuff", "okay stuff", and "bad stuff") and,
further, be prepared with guidelines for making further moral choices
in the future.
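
(As a toy sketch of that client-side approach, in Python, with
invented blacklist entries and the automatic-update mechanism
omitted:)

    # Minimal illustration of a proxy-side blacklist check.
    BLACKLIST = {"badstuff.example", "malware.example"}

    def allow_request(host):
        """Permit a request unless its host is on the local blacklist."""
        return host not in BLACKLIST

    assert allow_request("goodstuff.example") is True
    assert allow_request("badstuff.example") is False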

Can we deploy measures? Yes. They're called guidelines, policies, and procedures. The pictures should never have been disseminated by those
authorized to create and possess them. Can we hold someone
accountable? Yes. This happens through the courts, which is what the
accident victim's family is doing. If an individual is acting on
behalf of an organization, you hold both the individuals and the
organization liable. Personally, I think CHP dropped the ball by not
summarily dismissing the responsible parties, and deserves any
sanctions the courts choose if only because of that. I can think of
few more egregious breaches of the public trust than sending out
these images.

Now, actually removing this content is where the challenge lies. The
easiest way would be to use the DMCA to go after individuals who post
the images. Transfer copyright to the family (or an entity controlled
by them) and let them start sending out the takedown notices.

Mike







-------------------------------------------
Archives: https://www.listbox.com/member/archive/247/=now
RSS Feed: https://www.listbox.com/member/archive/rss/247/
Powered by Listbox: http://www.listbox.com
