Interesting People mailing list archives

Cato's Jim Harper replies to Solove-Hoofnagle privacy regulation proposal [priv]


From: David Farber <dave () farber net>
Date: Fri, 18 Mar 2005 01:53:43 -0500


------ Forwarded Message
From: Declan McCullagh <declan () well com>
Date: Fri, 18 Mar 2005 00:22:58 -0500
To: <politech () politechbot com>
Subject: [Politech] Cato's Jim Harper replies to Solove-Hoofnagle privacy
regulation proposal [priv]

Previous Politech message:
http://www.politechbot.com/2005/03/11/request-for-critique/

Jim makes some very reasonable points. Keep reading.

-Declan

-------- Original Message --------
Subject: RE: [Politech] Request for critique: a 16-point plan for
privacy regulation [priv]
Date: Mon, 14 Mar 2005 13:00:00 -0500
From: Jim Harper <jharper () cato org>
To: Declan McCullagh <declan () well com>
CC: <hoofnagle () epic org>, <dsolove () law gwu edu>

Declan, Chris, and Daniel:

This document has much to commend it.  Sections 12-14, in particular,
dealing with "Government Access to and Use of Personal Data" identify
serious problems that should be addressed.

As more and more people conduct more and more of their lives online,
Supreme Court rulings holding that there is no reasonable expectation of
privacy in data held by third parties grow further and further out of
sync with the true expectations of the people.

"Data mining" is an ambiguous term but, to the extent it means sifting
through databases trying to discover incipient wrongdoing, it is wrong,
probably ineffective, and a violation of the Fourth Amendment.
Investigators should use database information only subject to search
warrant or subpoena, as appropriate, to follow specific investigatory leads.

And, yes, the Privacy Act is a paper tiger. Congress should revise it,
especially in light of the end-run made possible by companies like
ChoicePoint that do the dossier-building the Privacy Act is meant to
prevent. Even a revised statute, however, does not reach the source of
the problem: a large government with extensive tax and entitlement
programs will demand tremendous amounts of personal information from
citizens. The costs to privacy of government programs have gone
unconsidered as they have been enacted and expanded.

Because of their institutional incentives and unique powers, it is
appropriate to circumscribe governments' use of data more closely.
Everyone agrees (except for government agents and their hangers-on in
the surveillance-industrial complex) that governments are the greatest
threat to privacy.

The sections that address private-sector data use are quite a bit less
intuitive. They do not seem to emerge from a general theory of privacy.
Rather, they focus on "business" and "companies" - as if wrongful
collection, use, or publication of facts by a company is worse than the
same behavior by an individual.

(I'll admit my pro-corporate bias: I have started and today operate two
corporations, one for-profit (but not profitable) and one non-profit. I
do not know how I would have started or kept either one alive under the
regulatory regime needed to carry out the aspirations in this paper,
relying as they do on information about people.)

Essentially, these rules attempt to reverse the general rule - a rule
with foundations deep in physics and justice - that information flows
freely unless constrained by its author. The solution is not to reverse
or stop the flow of data streams, but to address harms caused by
wrongful data collection, use, or publication.

ChoicePoint has harmed people by subjecting them to identity fraud and
the expense, time, and embarrassment of trying to rebuild their
financial lives. It has also harmed various companies that, in the main,
will suffer the bulk of the financial repercussions. ChoicePoint should
pay for its negligence.

The case of Remsburg v. Docusearch provides a good example of how courts
adapt common law negligence rules to address dangers and harms in modern
circumstances. In that case, a data broker sold information about a
young woman to a man who ultimately murdered her. The New Hampshire
Supreme Court found that the data broker owed the woman a duty to
protect her.

http://www.courts.state.nh.us/supreme/opinions/2003/remsb017.htm

ChoicePoint, likewise, owes a duty to all the people on whom it compiles
information, a duty to protect them (us) from harm. It appears that
ChoicePoint has failed in that duty, so ChoicePoint should pay. Lawsuits
alleging ChoicePoint's negligence have already been filed. Politicians
are grabbing headlines with hearings and legislation, but the real
action is quietly underway in the courts.

Recognizing a common law rule like this takes care of a lot of the
problems that would otherwise require long, detailed regulations. The
California law requiring consumer notice of data breaches has been given
too much credit in this case. ChoicePoint's compliance with the
California law broke the story, but the law is both over- and
under-inclusive: it requires notice in cases where notice doesn't
matter, and it doesn't require notice in some cases where it does.

Once data holders recognize their legal duty to protect, backed by the
responsibility to pay, they will eagerly notify consumers when doing so
will avoid harm - because this will save them money. They will also
notify banks when that is appropriate, credit bureaus when that is
appropriate, and credit card issuers when that is appropriate - all in
direct proportion to need.

The genius of simple, general rules like this is that they capture the
self-interest of companies like ChoicePoint and direct it toward
consumer welfare rather than trying to reverse information flows or
weigh the economy and society down with unnatural, innovation-stifling
regulation.

Finally, many proposals in the document have a one-way privacy bias
(which is appropriate in the government context where choice is not
available). This bias will appeal to many, but it is not good public
policy for a country that respects the will and genius of the people.

Perhaps it is unfortunate, but my careful observation over many years
finds that consumers often prioritize things other than privacy - goods
such as convenience, lower costs, better customer service, and so on.
Indeed, they are sometimes just plain indifferent.

I am loath to assume that the great mass of Americans are just wrong
and in need of caring for by self-appointed elites (even really smart
ones). We might like everyone to be more privacy conscious, but I do not
think it is wise to force privacy on people who would not otherwise
choose it.

The section on preemption of state law illustrates this bias. It would
allow states to institute more-comprehensive privacy protections, but
not less-comprehensive privacy protections. Given that individuals
choose less privacy protection all the time, I do not see why state
legislatures should be disabled from representing their people in a way
that may be perfectly rational. If there is to be legislation (and I
don't think it's needed), states should be fully able to innovate, not
just innovate in the federally preferred way.

For your consideration.

Jim



Jim Harper
Director of Information Policy Studies
The Cato Institute
and
Editor
Privacilla.org

_______________________________________________
Politech mailing list
Archived at http://www.politechbot.com/
Moderated by Declan McCullagh (http://www.mccullagh.org/)

------ End of Forwarded Message


-------------------------------------
You are subscribed as lists-ip () insecure org
To manage your subscription, go to
  http://v2.listbox.com/member/?listname=ip

Archives at: http://www.interesting-people.org/archives/interesting-people/

