

Aw: Re: [projectvrm] De identification of data


  • From: "Graham Reginald Hill" < >
  • To: "John Wunderlich" < >
  • Cc: " " < >
  • Subject: Aw: Re: [projectvrm] De identification of data
  • Date: Tue, 15 Jul 2014 09:09:46 +0200
  • Importance: normal
  • Sensitivity: Normal

Hi John
 
I agree with you entirely. 
 
The surgical black box as a way to continuously improve medical practice is an interesting application of lean thinking. It wouldn't be the first time that ideas originally developed by Deming, Juran and others, and later perfected within the Toyota Production System, have been applied in medicine. There have been many cases where, for example, the expertise honed by countless hours of practice by Formula One pit crews has been applied to improving medical procedures (see http://www.independent.co.uk/life-style/health-and-families/health-news/doctors-call-in-fi-team-for-help-in-accelerating-transfer-of-sick-children-622030.html for one example). 
 
The question about the value of data is an interesting but complicated one. From a rather simple axiological perspective I would argue that data has no intrinsic value in and of itself, but that it may have instrumental value when it is used to create an outcome. (And let us not forget that value is only tangentially related to economic utility.) This is relatively trivial when the data in question is not about an identifiable person (as with properly anonymised data, to which most data protection regulations do not apply), but it becomes much more complicated when a person is identifiable from the data.
 
Should a person be able to prevent others using data about them for the others' purposes? Potentially, but not necessarily in all cases. I see no reasonable grounds to deny Amazon the right to analyse my book-buying behaviour at its website and to make future purchase recommendations to me, although I do not wish my data to be shared with third parties without my permission. Should a person be able to claim some of the value created when data about them is used to create an outcome? Probably not, but potentially in some cases. I see no reasonable grounds to claim a share of any price reductions that Amazon is able to negotiate from publishers, although I would hope that they are reflected in lower prices.
 
These are simple cases that don't even start to think through the challenges introduced by a consequentialist utilitarian perspective. Perhaps the ProjectVRM forum should mug up on axiology in order to answer these difficult questions.
 
Best regards from Cologne, Graham
-- 
Dr. Graham Hill

UK +44 7564 122 633
DE +49 170 487 6192
http://twitter.com/GrahamHill
http://www.linkedin.com/in/grahamhill
http://www.customerthink.com/graham_hill

Partner
Optima Partners
http://www.optimapartners.co.uk

Senior Associate
Nyras Capital
http://www.nyras.co.uk
 
 
Sent: Tuesday, 15 July 2014 at 03:56
From: "John Wunderlich" < >
To: "Adrian Gropper" < >
Cc: "John Havens" < >, " " < >
Subject: Re: [projectvrm] De identification of data
Adrian;

I'm afraid I can't agree with your categorization of de-identifying data. A lot of the work that I do with de-identification is in the health sector, where de-identified data is how individual privacy can be protected while enabling medical research (see here and here). Depending on the jurisdiction one is in, the process of de-identification may itself be a 'use' of the personal information, for which some kind of consent or legal authority may be required. Setting that aside for this discussion, once de-identified, the data is no longer 'personal' information, so providing that information to a researcher can hardly be called a disclosure.
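As a minimal sketch of what that process can look like in practice (the field names, records, and generalization rules here are invented for illustration and are not drawn from any real dataset or from the projects mentioned above):

```python
import hashlib

# Hypothetical patient records; every field name and value is illustrative.
records = [
    {"name": "A. Patient", "health_no": "123-456-789",
     "birth_year": 1961, "postal": "M5V 2T6", "diagnosis": "C50"},
    {"name": "B. Patient", "health_no": "987-654-321",
     "birth_year": 1964, "postal": "M5V 1J2", "diagnosis": "C50"},
]

def de_identify(record, salt="research-project-1"):
    """Remove direct identifiers and generalize quasi-identifiers.

    - name is dropped outright
    - health_no is replaced by a salted one-way hash (a pseudonym, so the
      researcher can link a patient's records without knowing the number)
    - birth_year is generalized to a five-year band
    - postal code is truncated to its first segment
    """
    token = hashlib.sha256((salt + record["health_no"]).encode()).hexdigest()[:12]
    band_start = record["birth_year"] // 5 * 5
    return {
        "pseudonym": token,
        "birth_band": f"{band_start}-{band_start + 4}",
        "region": record["postal"].split()[0],
        "diagnosis": record["diagnosis"],
    }

released = [de_identify(r) for r in records]
```

The salted hash is one common design choice: it preserves linkability across releases for the same research project while keeping the raw identifier out of the released data, provided the salt itself is kept secret.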

Since I live in a country with a single-payer medical system, I don't think I want to wade into a discussion of the expenditures in the U.S. healthcare system. We'd best leave that to another forum ;-)
What I will say is that I sit on a Research Ethics Board that deals with a lot of clinical trials for cancer drugs, where many of the trial sponsors are drug companies. One assumes that there is a lot of money at stake for the companies, and the drugs are often very expensive, but all the trials are based on informed consent. I believe that this is normal in the U.S. as well.

And finally, I'd point you to this story from Toronto about a 'black box' in the operating theatre. It is an experiment in recording what happens during surgery, to enable quality controls and reviews in much the same way that black boxes are used in aviation. This seems clearly sensible from a process-improvement point of view, and the information collected could be defined as personal information about the patient. This one will be interesting from a privacy perspective if it becomes standard practice.
 
The point of the above is that not all information about a person has commercial value. It's almost always of interest or concern to the person it is about, and in that sense has value. It may be useful to others for research or other reasons, but not in a commercial or transactional sense. We need to be more nuanced, it seems to me, about where we apply the principles of VRM. They are valid for commercial relationships and exchange, but shouldn't be a universal rule for all information exchanges on the Internet.
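One concrete way to see de-identification as risk management rather than a binary, as argued elsewhere in this thread, is k-anonymity: the size of the smallest group of released records that share the same quasi-identifier values, which bounds the worst-case chance of singling anyone out at 1/k. A toy sketch (the records, field names, and the choice of quasi-identifiers are invented for illustration, and real risk-measurement methodologies are considerably more sophisticated):

```python
from collections import Counter

# Illustrative released records; birth_band and region act as quasi-identifiers.
released = [
    {"birth_band": "1960-1964", "region": "M5V", "diagnosis": "C50"},
    {"birth_band": "1960-1964", "region": "M5V", "diagnosis": "C50"},
    {"birth_band": "1960-1964", "region": "M5V", "diagnosis": "J45"},
    {"birth_band": "1955-1959", "region": "M4C", "diagnosis": "C50"},
]

QUASI_IDENTIFIERS = ("birth_band", "region")

def k_anonymity(rows, quasi=QUASI_IDENTIFIERS):
    """Return k: the size of the smallest equivalence class over the
    quasi-identifiers. A risk-based release policy compares 1/k against
    an acceptable threshold instead of demanding the risk be zero."""
    classes = Counter(tuple(r[q] for q in quasi) for r in rows)
    return min(classes.values())

k = k_anonymity(released)
max_risk = 1 / k  # worst-case probability of singling out one record
```

Here the lone record in the 1955-1959/M4C group drives k down to 1, flagging a release that would need further generalization or suppression before it met any reasonable risk threshold.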



John Wunderlich
Privacist @PrivacyCDN


On 14 July 2014 18:25, Adrian Gropper < > wrote:
>
> De-identifying data is just a face-saving way of selling that which you don't own. Personal data has value and that value might or might not be shared with the service provider that helped create the personal data. Even so, the sharing still needs consent and, well, sharing.
>
> For the most part, the sale of de-identified data benefits only the service provider, and de-identification is merely a tactic to avoid asking the person for consent. This is most evident in US healthcare, where prices are opaque and quality measures non-existent. Healthcare supports a growing market for de-identified data that may well exceed $20B, at the same time that US consumers are being overcharged by $1T compared to 10 other developed economies. Does anyone really think these two numbers are unrelated?
>
> Adrian
>
>
>
> On Monday, July 14, 2014, John Wunderlich < > wrote:
>>
>> If you approach de-identification as binary, where it is a failure if even one person in a million might theoretically be re-identified, the argument might make sense. In a real-world risk-management scenario, it fails to be a reasonable or sustainable argument.
>>
>> Latest entry is here:
>>
>> http://www.innovationfiles.org/meet-the-new-de-identification-deniers/
>>
>> The debate is fairly heated. Full disclosure: I have worked for Ann Cavoukian and with Khaled El Emam.
>>
>>
>> John Wunderlich
>> Privacist @PrivacyCDN
>>
>>
>> On 9 July 2014 20:28, John Havens < > wrote:
>>>
>>> Found this very intriguing and wanted to get the group's opinion:
>>>
>>> http://boingboing.net/2014/07/09/big-data-should-not-be-a-faith.html
>>>
>>> Thanks!
>>>
>>> John Havens
>>
>>
>
>
> --
> Adrian Gropper MD
 



Archive powered by MHonArc 2.6.19.