

Re: [projectvrm] Transparency: Watching and Watchers


Chronological Thread 
  • From: Mei Lin Fung < >
  • To: Adrian Gropper < >
  • Cc: David Brin < >, Colin Wallis < >, Tom Crowl < >, John Wunderlich < >, Doc Searls < >, Edwin Lee < >, ProjectVRM list < >
  • Subject: Re: [projectvrm] Transparency: Watching and Watchers
  • Date: Mon, 2 Jan 2017 08:45:22 +1100

I'm in that boat too - and from the comments it sounds like others are. Yet the dance between privacy and transparency is more subtle and nuanced than "all or nothing" for either concept.

Transparency is the facet that makes networked improvement communities work (Douglas Engelbart originated the term). A prime example is the open source community; another, operating at scale, is described in this paper.

Transparency made the scientific revolution possible, as people described their hypotheses and the evidence for and against each one.

When a person is invited to be involved in scientific research there are protocols. Can we consider this discussion of privacy and transparency as one requiring protocols - just like TCP/IP - but going further into the protocols for interacting with people, and keeping people at the center of what we want to do in the future?

The Internet affects people's health in two very different ways. One is well recognized: communication between people, machines, and other people is cheaper, faster, and more reliable by orders of magnitude than in the past, and continued improvement is expected into the foreseeable future. The other is that feedback on every scientific and social hypothesis related to health is more comprehensive, more granular, more frequent, closer to the source, and more holistically measurable than ever before - and, like the first, is expected to keep improving.

Yet traditional research has been slow to take advantage of these new possibilities, which will open broad new avenues for research.

Mei Lin


On Thu, Dec 29, 2016 at 10:09 AM, Adrian Gropper wrote:
It seems that Brin and I are in the same boat. Is it really just us two?

On Wed, Dec 28, 2016 at 5:57 PM, David Brin wrote:
The only conceivable way that we will have privacy - any at all - in a few decades, is if we get near universal transparency first.  

If that causes cognitive dissonance... if that doesn't seem to make sense... then the answer is simple.  You have not read or considered or discussed the full range of implications.  You prefer short term preachings over looking at what actually works. You do not belong in this topic, nor do you take it seriously.

With cameras getting smaller, faster, cheaper, better, more numerous and more mobile at exponential rates, and with new human biometrics being discovered monthly... pray tell me how you expect to solve these problems, other than by empowering sovereign citizens to use such tech themselves, deterring the nosy?

db


On Wednesday, December 28, 2016 2:22 PM, Colin Wallis wrote:


To all those good points: the reason we, as a community, are developing the tools and technologies mentioned (and more besides) is that transparency isn't exactly a menu item..
'Can I have transparency with that please? Ooh and yes, I'll also have some privacy on the side, thanks'..  
:-)..


On Wed, Dec 28, 2016 at 7:48 PM, Tom Crowl wrote:
I keep thinking about hunter-gatherers. Very little privacy. Whole lot of transparency. Fair level of reciprocal accountability (it was never too hard to whack an idiot chief with a rock when no one was looking).

Not suggesting we're going to return to those levels of privacy... but maybe less than we've been used to in recent history.
Or will we return to those levels of transparency... Do I really need to see Trump doing #2?... but likely more transparency than some would like.

The real question is how to whack the chief with a rock when necessary?


On Wed, Dec 28, 2016 at 11:15 AM, John Wunderlich wrote:
Folks;

Privacy and transparency, when put together in an argument or phrase like this, are, it seems to me, complementary vectors for addressing informational power imbalances.

Privacy is what enables the powerless, or the regular person/user/customer/patient/citizen, to go about their daily lives free to make decisions and act in the world unencumbered by surveillance - whether commercial or political - from the powerful. We should note that the inevitably close connections between large organisations and the state make the line between commercial and political blurry at best. The absence of privacy has a chilling effect and reduces the amount of agency that people have in the world (real or digital, commercial or social). I would assert that privacy is, therefore, essential to a free and democratic society.

Transparency, on the other hand, is what enables the powerless (person/user/customer/patient/citizen) to hold the powerful to account. Unless the powerful can be held to account, they are free to exercise their power arbitrarily and according to their own, potentially self-serving, definitions of what is appropriate or desirable. Like privacy, transparency is essential to a free and democratic society.

Parenthetically, I would counterpose this view of privacy and transparency as separate but related concepts to Brin's concept of them as reciprocal. My reading of Brin's work is that a loss of privacy (from the observed's point of view) is an increase in transparency (from the observer's point of view). So the solution to mass surveillance of individuals by the state is mass surveillance by individuals of the state in operation. This would arguably increase transparency, but might ironically increase the chilling effect of surveillance of the populace - if what is being surveilled becomes more common knowledge. The 'transparent society' is an ahistorical view that looks solely at the disclosure of information without context. But it's the context that renders the meaning of the flow of information. That context is usually characterised - falsely - as two individuals who appear to have an equal stake and equal power in their relationship. I'm talking about Alice and Bob, of course.

If you think about the broken relationship between Alice and Bob, which I blogged about on the JLINC blog here, privacy and transparency are efforts with opposite vectors. Alice wants, or should have, more agency to operate freely. To do that she needs the power and the technology to determine what data she gives to Bob, and for what purpose (User-Managed Access, User Submitted Terms and Do Not Track are various ways of addressing this). Alice also needs the ability to hold Bob to account - something that contracts of adhesion (AKA web privacy statements or Terms of Use) are frequently specifically designed to avoid. The Consent Receipt is, among other things, a tool for accountability. It also provides a data format to enable Bob and Alice to manage, or co-manage, consent, which also addresses the informational power imbalance. Broader possibilities to address the power imbalance include Don Marti's C.H.E.D.D.A.R. and the data provenance system enabled by the JLINC protocol. Some or all of these should be part of a privacy engineer's toolkit.
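To make the "data format for accountability" idea concrete, here is a minimal sketch of what such a receipt might look like. The field names are illustrative assumptions loosely inspired by the Kantara Consent Receipt work, not the normative specification:

```python
import json
import time
import uuid

def make_consent_receipt(pii_principal, controller, purposes):
    """Build a minimal, illustrative consent receipt.

    Both parties keep a copy, so either can later point to the
    exact terms that were agreed. Field names are illustrative.
    """
    return {
        "version": "demo-0.1",                  # version of this sketch format
        "consentReceiptId": str(uuid.uuid4()),  # unique id, for accountability
        "consentTimestamp": int(time.time()),   # when consent was given
        "piiPrincipal": pii_principal,          # Alice: the data subject
        "piiController": controller,            # Bob: who collects the data
        "purposes": purposes,                   # what the data may be used for
    }

# Alice grants a hypothetical pharmacy narrowly scoped consent.
receipt = make_consent_receipt(
    pii_principal="alice@example.com",
    controller={"name": "Bob's Pharmacy", "contact": "privacy@pharmacy.example"},
    purposes=[{"purpose": "prescription fulfilment", "dataTypes": ["rx-history"]}],
)
print(json.dumps(receipt, indent=2))
```

The point is not the particular fields but that consent becomes a shared, inspectable record rather than a one-sided claim buried in a Terms of Use page.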

A number of things are for sure:

1. There are no silver bullets.
2. There's more than one way to get from here to there.
3. A good solution today is better than a perfect solution in the future.
4. What might be a good solution today may not be effective tomorrow.

So if you're working to increase Alice's informational autonomy and/or to reduce Bob's control over Alice in the relationship, then I'm pretty sure we're in the same camp and I applaud your efforts. Let's come back in a hundred years to see if the abuse of personal data for profit has become as antithetical to everything we do as the abuse of child labour for profit.

PS. The declarations and opinions above are my own. I get that some people have different definitions and assumptions, so please let's not get into a definitional war. 


John Wunderlich, BA, MBA

IAPP Fellow of Information Privacy
CISA, CIPM, CIPP/C, PbD Ambassador
@PrivacyCDN & Privacist

On 28 December 2016 at 09:52, Adrian Gropper wrote:
I'm not sure how many "camps" there are, but anonymity (or the inability to correlate transactions) is certainly high up on the list of potential camps. Digital cash in the form of a 'user pocket' or https://en.m.wikipedia.org/wiki/Zcash could be considered a camp separate from transparency. It might be equal or opposite. It's an interesting question.

If "we" were to approach our various efforts in terms of privacy engineering we would likely be more effective. As it is, my work on transparency, Tom's on anonymity, and the other named projects are just camps connected by the spirit of Doc.

I don't mean to denigrate anyone's work but we can do better.

Adrian

Many approaches are needed. I focus on advancing the idea of a 'user pocket' for certain online payments... (yes, the cash card design for a user-owned micropayment network)... which, because of the nature of the pocket needed, creates a user account for small sums outside of one's banking system accounts. This pocket matters for the security of certain one-click online payments... but the ramifications of such a user-owned network... as well as the user-owned bank(s) in which the cash card's 'pool' could be held... are more significant than they may at first appear.

Frankly, I don't know that that idea belongs in either camp. I just think it's a necessary tool and institution. Is there a group advancing, or thinking about, needed and missing public institutions in our corporatized, financialized brave new world?



On Wed, Dec 28, 2016 at 4:51 AM, Doc Searls wrote:
Love David Brin, his work and his points at that link. Also agree that transparency matters utterly.

But we also have a big “we,” and we can’t all focus on one thing at a time. So I just want to give full respect here to all the work going on in other areas. They aren’t just distractions for the people working on them, and their work matters.

Doc



Great find, Tom. Transparency is where it's at. Do Not Track, Consent Receipts, and My Terms are a distraction. We need to focus on transparency first.

Adrian

















--

Adrian Gropper MD

PROTECT YOUR FUTURE - RESTORE Health Privacy!
HELP us fight for the right to control personal health data.

DONATE: http://patientprivacyrights.org/donate-2/




Archive powered by MHonArc 2.6.19.