

Re: [projectvrm] A VRM/PDS dream come true :-)


  • From: John Wunderlich < >
  • To: Peter Cranstone < >
  • Cc: Kevin Cox < >, Joyce Searls < >, Alan Mitchell < >, Matt Hogan < >, "Daniel Kaplan" < >, ProjectVRM list < >
  • Subject: Re: [projectvrm] A VRM/PDS dream come true :-)
  • Date: Sun, 7 Jul 2013 15:12:21 -0400

If I look at the cost model implicit in the diagram, it assumes a couple of things:

Low data storage costs
Low data processing costs
Low raw material (i.e. user data) costs

All of these costs are asymptotically approaching zero. It's not coincidental that "More Search/User Data" is at the centre of the diagram. What you are suggesting, it seems to me, is that the core change is not 'More' but 'Better'. I couldn't agree more (pun intended). I don't know, but I suspect that search algorithms get more and more complex the more data is thrown at them, because the signal-to-noise ratio keeps moving in the direction of more noise. VRM offers a way to turn up the gain on the signal side and reduce the noise, while enabling a better narrative for the user.
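To make the signal/noise point concrete, here is a toy sketch (the event names and counts are invented for illustration): as a dataset grows through indiscriminate collection, the fraction of it that reflects actual user intent shrinks.

```python
def signal_fraction(signal_events, noise_events):
    """Crude signal-to-noise measure: share of events that reflect intent."""
    return len(signal_events) / (len(signal_events) + len(noise_events))

# A user's expressed intent yields a handful of relevant events...
signal = ["query: hiking boots"] * 10

# ...while bulk collection keeps piling on noise.
for extra in (10, 100, 1000):
    noise = ["tracker event"] * extra
    print(extra, round(signal_fraction(signal, noise), 3))
```

At 10, 100, and 1000 noise events the signal fraction falls from 0.5 to 0.091 to 0.01 - "more data" buys complexity, not relevance, which is the case for reducing the set.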

But trying to talk a data geek (be they marketer or medical researcher) into pre-emptively reducing their data set goes against the grain, even when you have data on your side. So, with respect to your Trusted Web Service Manager, it will be a good value proposition if it is integrated into the existing model and increases the relevance of results (the connection between a strong search product and increased users in the diagram) by REDUCING the search/user data set. And it will be Privacy by Design if it results in a more limited or consent-driven data collection model. If the Trusted Web Service Manager is an 'externality', it becomes an added cost.

JW

On 2013-07-07, at 1:11 PM, Peter Cranstone < > wrote:

RE: If VRM is to work then innovation (AKA profitability) needs to be disconnected from monetizing personal data as an object and connected instead to a process of transparently monetizing expressed intent. 

Agreed. But how do you put a value on 'intent' when people want everything to be free?

When the subject of money and profits rears its ugly head, we get to see who can create real value that a consumer/vendor is willing to pay for. Right now the value of privacy is tied to free services. So the corollary would be that the value of my privacy is now tied to a 'paid service' - simple yin and yang.

So that tells me that the notion of a 'Trusted Web Service Manager', which aligns the value of intent with vendors who wish to bid on it, has serious merit. Inherent within that is the notion that both sides are paying for something.

Here's Google's 'Network Effect' and user metrics from 2010. What someone needs to build is a realistic VRM Network Effect model; my bet is that it will all center around the Trusted Web Service Manager, something that will be just about impossible for Google to compete with because it breaks their model below.

[attachment: diagram of Google's 'Network Effect', PNG]

Here's Google's user metrics from 2010:

[attachment: chart of Google's user metrics from 2010, PNG]

Kevin;

See my in-line responses below.

As a general comment I would argue that Internet commerce accelerates the tendency towards sectoral and industrial consolidation. This in turn leads to asymmetric power relations between the customers in their millions and the vendors in their dozens. Thus the necessity for regulatory intervention to ensure that the rules of the game establish more equal power relationships between service users  and service providers. This is how I see VRM, as a way of realizing a more equal power relationship.

{begin deliberately provocative suggestion} 

The use of evolutionary comparisons suggests that there is competition between various commercial entities (presumably the ones that are 'evolving') to pass down their successful genes, which would be the business processes and/or business models that generate competitive advantage.

If the above comparison to evolution (biological genes = commercial processes) is the case, maybe what we need to look at is the tax code. Nothing impacts business process profitability more than taxes. As a thought experiment, imagine what the impact on data collection would be if there were a tax on the amount of data collected from an individual, regardless of consent. Not proposing, just sayin' that such a revenue stream would (a) give the state regulator both the income and the information it would need to enforce data protection regulations and (b) motivate enterprises to be frugal in the collection of personally identifiable information.
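As a back-of-the-envelope sketch of that thought experiment (the tax rate and record counts here are entirely made up), even a tiny per-record levy makes hoarding expensive:

```python
def data_tax(records, rate_per_record=0.001):
    """Hypothetical levy: a flat tax per personal record held,
    owed regardless of whether consent was obtained."""
    return records * rate_per_record

# An enterprise holding 500 attributes on each of 10 million individuals
# would owe $5M a year at a tenth of a cent per record:
print(data_tax(500 * 10_000_000))  # 5000000.0
```

The incentive flips: every field collected carries a carrying cost, so frugality becomes the profit-maximizing choice.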
 
{end of deliberately provocative suggestion}

The systems under which organizations currently operate have 'evolved' to enable profit maximization and are focussed on delivering shareholder value on a quarter by quarter basis using back end payments for front end services. This means that attempts to enable user control over their own data, or to put limits on the uses to which organizations put collected data, are castigated as inhibitions on innovation. If VRM is to work then innovation (AKA profitability) needs to be disconnected from monetizing personal data as an object and connected instead to a process of transparently monetizing expressed intent. 

JW

On 2013-07-05, at 12:53 PM, Kevin Cox < > wrote:

John,

Regulatory controls tell the players the rules of the game.  However, we know that there is always profit to be made in breaking the rules and getting away with it.
Regulators generally recognize this, but also have to move forward on the presumption that people and organizations will follow the rules of the game. 

 In designing social systems that are able to evolve we must build in social mechanisms that make it highly likely that breaking the rules will be discovered and punished by exclusion. 
Not sure that we can 'design' social systems, as the law of unintended consequences too often creeps in.


Examples:

Working rules and social enforcement exist around spam. The rules are set in regulation in many countries, and the participants, including ISPs, accept and enforce or support the rules. In addition, the vast majority of users work within the rules.

An example of non-working rules would be anti-piracy rules. Breaking the rules is more the norm than following them, and there is little effective punishment.


The difference between the two, or at least one difference, is in whose interests the rules were made. There is a common interest between users and suppliers in the case of email (spammers are unwelcome interlopers), whereas there is a divergence of interests between content providers and content consumers in the case of privacy.
 


VRM works if the vendors can be confident in the identity of the customer, and if the customers can be assured of the identity of the vendors.  Vendors can refuse to deal with people they cannot trust - while customers will refuse to deal with vendors they cannot trust.  Trust is broken when the rules of commerce are broken.

VRM works and trust can be built, I would argue, when the vendors and customers have a common interest. There is a common interest when there is a commercial relationship between the two where one wants to buy what the other has to sell. Your argument about customers and vendors fails when it comes to 'free' services, because the customers are, in fact, the advertisers and data aggregators that pay for the services provided. The fact that there is a trusted relationship between service provider and advertiser is the problem. So for these free services, your argument suggests to me that there needs to be a three-way transitive trust relationship: Advertiser/Aggregator <-> Service Provider <-> Service Consumer.


If you look at successful living systems then this principle is everywhere.  Darwin had it right when he talked about survival of the fittest.  Those who interpreted him by thinking that the fittest were the fittest in a competitive sense were wrong.  The fittest are those who best fitted into the environment and cooperated with other entities in the environment so that they were able to survive and pass on their genes.  
The best solution to passing on genes isn't always cooperation. Both symbiosis and parasitism work in evolutionary time. VRM is an argument for symbiosis (mutual benefit), but the current model is parasitic. 


Humans have made the evolutionary step of being able to create environments in which to live and evolve. The environments we create that best survive are those where the forces of cooperation and the interests of the group outweigh the forces of competition and self-interest. To see how this works, read "SuperCooperators: Altruism, Evolution, and Why We Need Each Other to Succeed" by Nowak and Highfield.

In the world of commerce, knowing who the other party is, so that you can refuse to deal with them, is why VRM enables self-regulation through exclusion. Giving individual entities the ability to stop dealing with rule breakers enforces compliance. Trying to enforce compliance by force does not work in the interconnected world of commerce; excluding rule breakers from profitable participation does.

Kevin

On Sat, Jul 6, 2013 at 1:09 AM, John Wunderlich < > wrote:
Kevin;
 
The devil is in the details. The assumption that I made, but did not state, was that if the data subject scatters their information across multiple silos but continues to want the convenience of a unified [free] service model to present their identity, then the same APIs and infrastructure that provide the convenience to the data subject provide the opportunity for attackers to aggregate that data for their own purposes. This is the problem that you addressed in your response.

Thanks,
JW


PS. What with constraints on my time I'm not always able to keep up with this list, so this may already have been covered. I kinda grok the technical side of what needs to happen for VRM to heave itself from concept to reality, but there is another side that I'm still unclear on. Once personally identifiable information has been released from a silo to a data processor, it requires both technical and non-technical controls to ensure that the data subject's privacy wishes are observed. It occurs to me that the various forms of the GPL cover this in a different venue (i.e. maintaining open code). Has there been discussion of, or are there examples of, a GNU Privacy License to attach to software and/or data stores that collect, use, and disclose personally identifiable information - to ensure, for example, that data will only be processed for purposes that have received user consent? Again, apologies if this has been dealt with.




John,

Keeping data about ourselves in silos is the standard method of protecting privacy.  It is a good principle and works.

To do this we need to make it difficult for others to match and move data about us across silos.

In the scheme we are building, silos will give us id numbers BUT the number will only be used and known by the silo. We will not even know the number. This builds privacy into the system, because it becomes difficult to match data about us across silos.
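One way such silo-local numbers could be derived (a sketch of the idea, not necessarily how Kevin's scheme actually works) is a keyed hash: each silo holds its own secret, so the same person maps to unlinkable identifiers in different silos, and nobody outside the silo, not even the user, needs to know the number.

```python
import hashlib
import hmac

def silo_pseudonym(silo_secret: bytes, user_handle: str) -> str:
    """Silo-local pseudonym: deterministic within one silo, but
    unlinkable across silos that hold different secrets."""
    digest = hmac.new(silo_secret, user_handle.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

id_in_a = silo_pseudonym(b"secret-of-silo-A", "alice@example.org")
id_in_b = silo_pseudonym(b"secret-of-silo-B", "alice@example.org")
print(id_in_a != id_in_b)  # True: same person, unmatchable identifiers
```

Matching records across silos would then require compromising each silo's secret, which is exactly the linkage barrier described above.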

The system will make it easy for information about us to be passed between organisations if it is passed "through us", and will make it difficult, and in many cases illegal, if it is passed without our knowledge. Again, this builds privacy into the system.

To make this happen we need secure reliable ways to allow ourselves to be identified.

That is what we are doing and it will make it easy to implement VRM systems.

Kevin





On Fri, Jul 5, 2013 at 12:15 AM, John Wunderlich < > wrote:
Kevin;

Scattering your data across the web is also fundamentally flawed, it seems to me. It is a variant of 'security through obscurity' and relies upon the difficulty of connecting to, and linking, multiple sources. There should be a Moore's Law for re-identification and identifiability, because I suspect that it does become twice as easy to re-identify and link someone's personal information every eighteen months or so.
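That speculative 'Moore's Law' is just exponential doubling; a quick sketch of what an 18-month doubling period would imply (the doubling period is the conjecture above, not a measured fact):

```python
def reidentification_ease(months, doubling_period=18):
    """Relative ease of linking scattered personal data after `months`,
    assuming it doubles every 18 months (pure conjecture)."""
    return 2 ** (months / doubling_period)

# After 3, 6, and 9 years, linking would be 4x, 16x, and 64x easier:
for years in (3, 6, 9):
    print(years, reidentification_ease(years * 12))
```

If anything like this holds, the obscurity that scattering relies on erodes quickly, which is the point being made against it.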



This shows that the idea of keeping all your own personal data in your own vault is fundamentally flawed - if you want to keep things private. If you want privacy you want others to store information about you in their own data stores. That way our information is scattered around the internet and to get at it someone has to break into all those stores. Of course we want to be able to access it when we need to, but let others keep it and let them take responsibility for looking after it and keeping it from others. This is the system we have at the moment, with one difference. Today we do not have access to our own information. Tomorrow we will, and VRM will become a reality.

Kevin


On Thu, Jul 4, 2013 at 12:11 AM, Joyce Searls < > wrote:
Time to review the Onion video from 2011 that Doc used in several speeches at the time :)

http://www.youtube.com/watch?v=kVX7K4gAKas

J

On Jul 3, 2013, at 9:01 AM, Alan Mitchell < > wrote:

> I love the 'key partners' bit!
>
> A
>
>
> On Wed, Jul 3, 2013 at 1:37 PM, Matt Hogan < > wrote:
> We actually put out something similar a few weeks ago. I figured I'd share it with the group given the nature of the thread.
>
> http://getprsm.com/
>
>
> 2013/7/3 Daniel Kaplan < >
> We dreamt it, they made it happen. 'Nuff said:
> http://prism.andrevv.com/
>
> Cheers,
>
> Daniel
>
> --
>
>
> ----------------------------------------------------------------------------------------
> FING - association pour la Fondation Internet Nouvelle Génération
> The Next-Generation Internet Foundation
> Daniel Kaplan - < > - +33 6 8962 9968
> http://www.fing.org  /  http://www.internetactu.net
> Paris : nouvelle adresse - 8 passage brulon - 75012
> Marseille : CMCI, 2 rue Henri Barbusse - 13001
> ----------------------------------------------------------------------------------------
>
>
>
>
>
> --
> CEO/Founder
> DataCoup, Inc.
> Datacoup.com
> @matthewphogan
> @datacoup
> 415-533-7492
>
>
>
>
>
> --
> Alan Mitchell
> Strategy Director, Ctrl-Shift
> T6 3rd Floor West Wing
> Somerset House
> London, WC2R 1LA
>
> Mobile: +44(0)7711 899 784
> Skype: alansmitchell
>
> www.ctrl-shift.co.uk
> Twitter: 321Ctrlshift
>
>
>
>
>










