

Re: [projectvrm] Can you really close Pandora's box


  • From: Joe Andrieu < >
  • To: Crosbie Fitch < >
  • Cc: ProjectVRM list < >
  • Subject: Re: [projectvrm] Can you really close Pandora's box
  • Date: Sun, 05 Jun 2011 19:43:03 -0700

On 6/5/2011 4:32 PM, Crosbie Fitch wrote:
" type="cite">
From: Michael O'Connor Clarke
Am I completely missing something obvious?
    
Let me know what it is if you are.

The problem is how to tell someone your favourite colour is blue AND to be
able to control whether:
A) they can record that fact
B) they can refer to that fact in future business with you
C) they can communicate that fact to anyone else
  
This is essentially right. The problem is that to use any information, it must be revealed. Websites can't personalize your web experience unless they actually decrypt the data they use for such personalization. Pandora can't include your friend's musical tastes as part of a recommendation algorithm for you unless they decrypt your "friends list" and your friends' listening history and ratings.

You can and should use crypto for data in transit (HTTPS by default). This is one of the requirements of the Massachusetts data protection regulation, 201 CMR 17.00. However, it can be untenable to store data encrypted when you need the unencrypted data for large-scale data mining and analysis. Things like a search index simply wouldn't be possible over encrypted data, at least not in any way that maintains any meaningful sense of encryption.
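To make that concrete, here is a toy sketch of why a search index can't be built over ciphertext. The "cipher" below is a keyed XOR stream derived from hashlib, purely for illustration (it is not real cryptography, and all names and data are invented): substring search works on the plaintext records, fails on the encrypted ones, and building an inverted index forces you to decrypt first.

```python
import hashlib

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy keyed stream cipher (NOT real crypto): XOR with a hash-derived
    # keystream. XOR is symmetric, so the same function also decrypts.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

docs = [b"my favourite colour is blue", b"my favourite colour is red"]
key = b"hypothetical-key"
ciphertexts = [toy_encrypt(key, d) for d in docs]

# Substring search works on plaintext...
assert sum(b"blue" in d for d in docs) == 1
# ...but not on ciphertext: the search term never appears literally.
assert all(b"blue" not in c for c in ciphertexts)

# Building an inverted index requires decrypting each record first.
index: dict[bytes, set[int]] = {}
for i, c in enumerate(ciphertexts):
    for term in toy_encrypt(key, c).split():  # decrypt, then tokenize
        index.setdefault(term, set()).add(i)
assert index[b"blue"] == {0}
```

The same constraint applies to the personalization case above: a recommender has to decrypt the friends list and listening history before it can feed them into any algorithm.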

In fact, to Mary's point, 201 CMR 17 is a great example of trying to define a data security standard. I haven't read too many critiques of it, but it does map out a set of "Schlage lock" requirements for service providers, at least for the PII of Massachusetts residents.
" type="cite">
Of course, no such control is possible, or rather the control you do have is
whether or not you reveal your connection to the trading identity you have
created for this purpose - and if you don't, then it may not greatly matter
what anyone does with the information revealed by this identity in its
process of doing business.

Most people appear to assume that one uses one's human identity and has
magic/I mean sufficiently advanced technology able to control what others do
with the information communicated to them. I expect such technology will
arrive at about the same time as undefeatable DRM.
  
I don't believe that's correct, at least insofar as you are referring to the VRM community. I, for one, don't believe we can directly control the use or abuse of data. Rather, I believe that law-abiding companies will honor agreements and laws regarding the use of personal data. We can't prevent companies from cheating us. We can make it costly if they do.

That's the nature of society. You can't prevent most crimes, even if physically present. Speeding. Tax evasion. Murder. Instead, society has a legal system that penalizes certain behavior. Similarly, confidential information is disclosed all the time under agreements--contractual, statutory, and regulatory--that explicitly bind the receiving party to specific duties of care and penalties if they breach those duties. This includes doctor/patient and lawyer/client confidentiality, and even simple NDAs between businesses in a sales conversation.

A Trust Framework would be another such set of agreements. Voluntary and opt-in. I have no illusions that we can literally control what others do with the information they have about individuals, as if they were puppets to our puppeteer strings. I do, however, believe we can bind organizations to act reasonably, if we can just figure out what is reasonable.

-j

Joe Andrieu

+1 (805) 705-8651





Archive powered by MHonArc 2.6.19.