

Re: [projectvrm] De identification of data


  • From: John Havens
  • To: John Wunderlich
  • Cc: Adrian Gropper
  • Subject: Re: [projectvrm] De identification of data
  • Date: Tue, 15 Jul 2014 13:04:15 -0400

John, you bring up excellent points and the black box case study is fascinating. 

It reminds me of the U.S. research showing how dramatically police negligence and violence decreased when officers wore uniform cameras that record their own behavior as well as that of the people they're questioning. The data they're recording affects others, but it also deters potential misconduct on their part.

That said, what gets tricky for me is your statement, "not all information about the person has commercial value."  I agree with this statement in one sense, although:
  • Data collected in 'black box' situations during surgery would seemingly be very valuable to health, pharma, and other medical industries. 
  • Wearables and Quantified Self data may correlate with data captured during surgery.  It's early days for wearables that measure stress via sweat output in a smartwatch, for instance.  I'm thinking it would be fascinating to see what wearable data would show during a surgery.  Certainly lawyers must be champing at the bit for that to happen, to point out potential surgical negligence, etc.
  • "Commercial" as a term, to your point on nuance, needs to be defined.  Data that wouldn't be purchased today may become highly valuable as QS and other data sets become standard channels for purchase.

Thanks again for your thoughts.  Really helpful.

JCH






On Mon, Jul 14, 2014 at 10:56 PM, John Wunderlich wrote:
Adrian;

I'm afraid I can't agree with your categorization of de-identifying data. A lot of the work that I do with de-identification is in the health sector, where de-identified data is how individual privacy can be protected while enabling medical research (see here and here). Depending on the jurisdiction one is in, the process of de-identification may be a 'use' of the personal information for which some kind of consent or legal authority may be required. Setting that aside for this discussion: once de-identified, the data is no longer 'personal' information, so providing that information to a researcher can hardly be called a disclosure.
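The de-identification process described above can be sketched in a few lines: drop direct identifiers, generalize quasi-identifiers, and check the k-anonymity of the released records. This is an illustrative sketch only — the field names and the 10-year / first-character generalizations are hypothetical, and real health-data releases follow formal standards such as HIPAA Safe Harbor or expert determination:

```python
# Minimal de-identification sketch (illustrative only; all field names
# and generalization rules here are hypothetical, not a real standard).
from collections import Counter

DIRECT_IDENTIFIERS = {"name", "email", "health_card_no"}
QUASI_IDENTIFIERS = ("birth_year", "postal_prefix")

def deidentify(record):
    """Drop direct identifiers and coarsen quasi-identifiers."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    out["birth_year"] = (out["birth_year"] // 10) * 10   # 10-year band
    out["postal_prefix"] = out["postal_prefix"][:1]      # first character only
    return out

def k_anonymity(records):
    """Smallest group of records sharing the same quasi-identifier values."""
    groups = Counter(tuple(r[q] for q in QUASI_IDENTIFIERS) for r in records)
    return min(groups.values())

patients = [
    {"name": "A", "email": "a@x", "health_card_no": "1",
     "birth_year": 1971, "postal_prefix": "M5V", "diagnosis": "flu"},
    {"name": "B", "email": "b@x", "health_card_no": "2",
     "birth_year": 1975, "postal_prefix": "M4C", "diagnosis": "asthma"},
    {"name": "C", "email": "c@x", "health_card_no": "3",
     "birth_year": 1979, "postal_prefix": "M6K", "diagnosis": "flu"},
]

released = [deidentify(p) for p in patients]
print(k_anonymity(released))  # all three share band 1970 / prefix "M" -> 3
```

The point of the k-anonymity check is exactly the risk-management framing above: the question is not whether re-identification is theoretically impossible, but how large the smallest indistinguishable group is.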

Since I live in a country with a single-payer medical system, I don't think I want to wade into a discussion of the expenditures in the U.S. healthcare system. We'd best leave that to another forum ;-)
What I will say is that I sit on a Research Ethics Board that deals with a lot of clinical trials for cancer drugs, where many of the trial sponsors are drug companies. One assumes that there is a lot of money at stake for the companies, and the drugs are often very expensive, but all the trials are based on informed consent. I believe that this is normal in the U.S. as well.

And finally, I'd point you to this story here in Toronto about a 'black box' in the operating theatre. It is an experiment in recording what happens during surgery, to enable quality control and review in much the same way that black boxes are used in aviation. This seems clearly sensible from a process-improvement point of view, and the information collected could be defined as personal information about the patient. This one will be interesting from a privacy perspective if it becomes standard practice.

The point of the above is that not all information about the person has commercial value. It's almost always of interest or concern to the person it is about, and in that sense has value. It may be useful to others for research or other reasons, but not in a commercial or transactional sense. We need to be more nuanced, it seems to me, in where we apply the principles of VRM. These are valid for commercial relationships and exchange, but shouldn't be a universal rule for all information exchanges on the Internet.



John Wunderlich
Privacist @PrivacyCDN


On 14 July 2014 18:25, Adrian Gropper wrote:
>
> De-identifying data is just a face-saving way of selling that which you don't own. Personal data has value and that value might or might not be shared with the service provider that helped create the personal data. Even so, the sharing still needs consent and, well, sharing.
>
> For the most part, the sale of de-identified data benefits only the service provider and de-identification is merely a tactic to avoid asking the person for consent. This is most evident in US healthcare where prices are opaque and quality measures non-existent. Healthcare supports a growing market for de-identified data that may well exceed $20 B at the same time that US consumers are being overcharged by $1 T compared to 10 other developed economies. Does anyone really think these two numbers are unrelated?
>
> Adrian
>
>
>
> On Monday, July 14, 2014, John Wunderlich wrote:
>>
>> If you approach de-identification as a binary set, where it is a failure if even one person out of a million might theoretically be re-identified, the argument might make sense. In a real world risk management scenario, it fails to be a reasonable or sustainable argument.
>>
>> Latest entry is here:
>>
>> http://www.innovationfiles.org/meet-the-new-de-identification-deniers/
>>
>> The debate is fairly heated. Full disclosure, I have worked for Ann Cavoukian and with Khaled El Emam.
>>
>>
>> John Wunderlich
>> Privacist @PrivacyCDN
>>
>>
>> On 9 July 2014 20:28, John Havens wrote:
>>>
>>> Found this very intriguing and wanted to get the group's opinion:
>>>
>>> http://boingboing.net/2014/07/09/big-data-should-not-be-a-faith.html?utm_content=buffere752f&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer
>>>
>>> Thanks!
>>>
>>> John Havens
>>
>>
>
>
> --
> Adrian Gropper MD




--
"More than at any time in human history, we have access to mountains of data about ourselves.  Hacking H(app)iness is the first book to show us how to leverage this information as a path to happiness, rather than a source of misery."

-Adam Grant, bestselling author of Give and Take





Archive powered by MHonArc 2.6.19.