John,
Let's tweak this a little – it's not about reducing the input data set, it's about adding more precision to it. We're drowning in data; refined data carrying 'intent' will float to the service (pun intended).
RE:
If a reduced data set is collected, but is done so with consent and information about intent from the user, then the search results may well be more relevant to that user.
If a reduced data set is collected AND contains MORE precision, then it can't be any worse than it is right now. In fact it may be better. And we come back full circle to the core problem – how do we convey intent when we're ready? I was just re-reading Katherine's
email… and I quote…
Until the service builds a reputation of trust and respect, I think it is best to be able to say that NOTHING will be shared with anyone unless or until you opt-in. And you can do this in an affiliate shopping network
without weakening performance.
Right there we're all out of alignment. Katherine doesn't want to share her intent until she's ready to buy. Search has NO provision for this – it assumes that you want results and that you're automatically opting in when you hit the Enter key on your request.
My instinct tells me a lot of your assumptions are correct. I also believe what Katherine is saying. That's why I come back to the 'how' all the time. Until I can choose to opt in, my privacy and intent are joined at the hip. There's simply no way to separate
them. Google et al. are not going to bother with VRM if it destroys an existing revenue model or the design that underpins their foundation.
This is why I suggested adding HTTPR:// – it sends a message to the vendor/service that I'm ready to share my intent and my private data. All the consumer does is click on the 'R' button, and the page is refreshed with a new URL – HTTPR. The consumer now
gets a visual confirmation that a relationship has been created or rejoined. Consumers can understand this – developers can understand it – and if that request now allows access, via XDI or some other protocol, to a reduced yet refined data set, then the value
goes up, which is something that the vendor understands.
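As an illustrative sketch only (HTTPR:// is a proposal here, not a standard; the header names and intent format below are invented for illustration), a client could treat an httpr:// URL as ordinary HTTPS plus headers that signal the user's opt-in:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical header names -- nothing here is standardized.
OPT_IN_HEADER = "X-HTTPR-Opt-In"
INTENT_HEADER = "X-HTTPR-Intent"

def prepare_httpr_request(url, intent=None):
    """Rewrite an httpr:// URL to https:// and build headers that
    signal the user has opted in to sharing intent with this vendor."""
    parts = urlsplit(url)
    if parts.scheme != "httpr":
        # Ordinary request: no relationship signal, no intent shared.
        return url, {}
    https_url = urlunsplit(("https",) + tuple(parts)[1:])
    headers = {OPT_IN_HEADER: "true"}
    if intent:
        headers[INTENT_HEADER] = intent  # e.g. "buy:running-shoes"
    return https_url, headers

url, headers = prepare_httpr_request("httpr://vendor.example/search?q=shoes",
                                     intent="buy:running-shoes")
print(url)      # https://vendor.example/search?q=shoes
print(headers)
```

The point of the sketch is the visual/explicit opt-in: the request only carries intent when the user has chosen the httpr:// form.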
Peter
_________________________
Peter J. Cranstone
CEO, 3PMobile
Boulder, CO USA
Improving the Mobile Web Experience
Cell: 720.663.1752
From: John Wunderlich
Date: Sunday, July 7, 2013 8:06 PM
To: "Peter J. Cranstone"
Cc: Kevin Cox, Joyce Searls, Alan Mitchell, Matt Hogan, Daniel Kaplan, ProjectVRM list
Subject: Re: [projectvrm] A VRM/PDS dream come true :-)
Search is not free. It's expensive to develop and maintain, and as more and more data is fed into the hopper it doesn't get easier to turn the crank and produce meaningful results. That's why reducing the input data set is not an oxymoron when it comes
to user intent and interest. If a reduced data set is collected, but is done so with consent and information about intent from the user, then the search results may well be more relevant to that user.
ABSOLUTELY AGREE with the building-and-shipping option vs. the talking option; reality will tend to render discussions moot, and I look forward to being proved wrong on many of my assumptions.
On 2013-07-07, at 4:23 PM, Peter Cranstone wrote:
RE:
So, with respect to your Trusted Web Service Manager, it will be a good value proposition if it is integrated into the existing model and increases the relevance of results (the connection between a strong search product
and increased users in the diagram) by REDUCING the search/user data set. And it will be Privacy by Design if it results in a more limited or consent driven data collection model.
I would offer that the above is an oxymoron. Search is free, but uses your data to pay for the costs. So if you reduce or degrade the data set, the search companies' profits go down. Shareholders
don't like that.
RE:
If the Trusted Web Service Manager is an 'externality' it becomes an added cost.
Yep. And that's the whole point – your intent now has value against a far less ambiguous data model; ergo, value is created and can be extracted.
VRM by design cannot be integrated into something that has ambiguous data sets – the only solution is to separate it. That's why I suggested HTTPR:// – it's a new layer that can be OPEN to drive
adoption, yet when integrated into a service removes ambiguous data and replaces it with more valuable intent data. It really isn't that hard to build this; all the components are already there – we added XDI support in 48 hours. We could do the same with
HTTPR://.
But what I'm seeing constantly is 'let's keep talking about it', along with 'let's have another meeting about it'. At some point there's nothing left to talk about – let's just start building and
shipping a solution. The alternative is that VRM becomes a footnote.
Peter
From: John Wunderlich
Date: Sunday, July 7, 2013 1:12 PM
To: "Peter J. Cranstone"
Cc: Kevin Cox, Joyce Searls, Alan Mitchell, Matt Hogan, Daniel Kaplan, ProjectVRM list
Subject: Re: [projectvrm] A VRM/PDS dream come true :-)
If I look at the cost model implicit in the diagram, it assumes a couple of things:
Low data storage costs
Low data processing costs
Low raw material (i.e. user data) costs
All of these costs are asymptotically approaching zero. It's not coincidental that "More Search/User Data" is at the centre of the diagram. What you are suggesting, it seems to me, is that the core change is not 'More', but instead 'Better'. I couldn't
agree more (pun intended). I don't know, but I suspect, that search algorithms get more and more complex the more data is thrown at them, because the signal-to-noise ratio keeps moving in the direction of more noise. VRM offers a way to turn up the gain on
the signal side and reduce the noise, while enabling a better narrative to the user.
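The signal-to-noise point can be made concrete with a toy example (all data below is invented): a handful of explicit, consented intent signals can out-rank a larger pile of noisy inferred signals.

```python
# Toy illustration: relevance scoring with bulk inferred signals
# vs. a small, user-declared intent data set. All values invented.

results = ["running shoes", "shoe repair", "horseshoes", "trail runners"]

# Inferred signals from bulk tracking: lots of them, mostly noise.
noisy_signals = {"shoe repair": 3, "horseshoes": 5,
                 "running shoes": 2, "trail runners": 1}

# A few explicit, consented intent statements.
declared_intent = {"running shoes": 10, "trail runners": 8}

def rank(results, signals):
    """Order results by descending signal strength."""
    return sorted(results, key=lambda r: -signals.get(r, 0))

print(rank(results, noisy_signals))    # noise dominates the ordering
print(rank(results, declared_intent))  # intent surfaces the right items
```

A smaller data set, but one carrying declared intent, puts the relevant results first; the bulk-tracking version ranks the noise.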
But trying to talk a data geek (be they marketer or medical researcher) into reducing their data set pre-emptively goes against the grain, even when you have data on your side. So, with respect to your Trusted Web Service Manager, it will be a good value
proposition if it is integrated into the existing model and increases the relevance of results (the connection between a strong search product and increased users in the diagram) by REDUCING the search/user data set. And it will be Privacy by Design if it
results in a more limited or consent driven data collection model. If the Trusted Web Service Manager is an 'externality' it becomes an added cost.
JW
On 2013-07-07, at 1:11 PM, Peter Cranstone wrote:
RE: If VRM is to work then innovation (AKA profitability) needs to be disconnected from monetizing personal data as an object and connected instead to a process of transparently
monetizing expressed intent.
Agreed. But how do you put a value on 'intent' when people want everything to be free?
When the subject of money and profits rears its ugly head, we get to see who can create real value that a consumer/vendor is willing to pay for. Right now the value of privacy is tied to free services. So the corollary would be that the value of my privacy
is now tied to a 'paid service' – simple yin and yang.
So that tells me that the notion of a 'Trusted Web Service Manager', which aligns the value of intent with vendors who wish to bid on it, has serious merit. Because implicit within that is the notion that both sides are paying for something.
Here's Google's 'Network Effect' and user metrics from 2010 – what someone needs to build is a realistic VRM Network Effect model. My bet is that it will all center around the Trusted Web Service Manager – something which will be just about impossible
for Google to compete with, as it breaks their model below.
[image: Google's Network Effect diagram, 2010]
Here's Google's user metric from 2010
[image: Google's user metrics chart, 2010]
Peter
From: John Wunderlich
Date: Sunday, July 7, 2013 11:01 AM
To: Kevin Cox
Cc: Joyce Searls, Alan Mitchell, Matt Hogan, Daniel Kaplan, ProjectVRM list
Subject: Re: [projectvrm] A VRM/PDS dream come true :-)
Kevin;
See my in-line responses below.
As a general comment I would argue that Internet commerce accelerates the tendency towards sectoral and industrial consolidation. This in turn leads to asymmetric power relations between the customers in their millions and the vendors in their dozens.
Thus the necessity for regulatory intervention to ensure that the rules of the game establish more equal power relationships between service users and service providers. This is how I see VRM, as a way of realizing a more equal power relationship.
{begin deliberately provocative suggestion}
The use of evolutionary comparisons suggests that there is competition between various commercial entities (presumably the ones that are 'evolving') to pass down their successful genes, which would be business process and/or business models that generate
competitive advantage.
If the above comparison of evolution (biological genes = commercial processes) is the case, maybe what we need to look at is the tax code. Nothing impacts business process profitability more than taxes. As a thought experiment, imagine what the impact
on data collection would be if there were a tax on the amount of data collected from an individual, regardless of consent. Not proposing, just sayin' – such a revenue stream would (a) give the state regulator both the income and the information it would
need to enforce data protection regulations and (b) motivate enterprises to be frugal in the collection of personally identifiable information.
{end of deliberately provocative suggestion}
The systems under which organizations currently operate have 'evolved' to enable profit maximization and are focussed on delivering shareholder value on a quarter by quarter basis using back end payments for front end services. This means that attempts
to enable user control over their own data, or to put limits on the uses to which organizations put collected data, are castigated as inhibitions on innovation. If VRM is to work then innovation (AKA profitability) needs to be disconnected from monetizing
personal data as an object and connected instead to a process of transparently monetizing expressed intent.
JW
On 2013-07-05, at 12:53 PM, Kevin Cox wrote:
John,
Regulatory controls tell the players the rules of the game. However, we know that there is always profit to be made in breaking the rules and getting away with it.
Regulators generally recognize this, but also have to move forward on the presumption that people and organizations will follow the rules of the game.
In designing social systems that are able to evolve we must build in social mechanisms that make it highly likely that breaking the rules will be discovered and punished by exclusion.
Not sure that we can 'design' social systems, as the law of unintended consequences too often creeps in.
Examples:
Working rules and social enforcement exist around spam. The rules are set in regulations in many countries, and the participants, including ISPs, accept and enforce or support the rules. In addition, the vast majority of users work within the rules.
An example of non-working rules would be anti-piracy rules. Breaking the rules is more the standard than following them, and there is little effective punishment.
The difference between the two, or at least one difference, is in whose interests the rules were made. There is a common interest between users and suppliers in the case of email (spammers are unwelcome interlopers), whereas there is a divergence of
interests between content providers and content consumers in the case of piracy.
VRM works if the vendors can be confident in the identity of the customer, and if the customers can be assured of the identity of the vendors. Vendors can refuse to deal with people they cannot trust - while customers will refuse to deal with
vendors they cannot trust. Trust is broken when the rules of commerce are broken.
VRM works and trust can be built, I would argue, when the vendors and customers have a common interest. There is a common interest when there is a commercial relationship between the two, where one wants to buy what the other has to sell. Your argument
about customers and vendors fails when it comes to 'free' services, because the customers are, in fact, the advertisers and data aggregators that pay for the services provided. The fact that there is a trusted relationship between service provider and advertiser
is the problem. So for these free services, your argument suggests to me that there needs to be a three-way transitive trust relationship: Advertiser/Aggregator <-> Service Provider <-> Service Consumer.
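The three-way transitive trust relationship could be modelled as a simple chain check (party names and trust pairs are invented for illustration): a transaction proceeds only if every adjacent pair in the chain trusts each other.

```python
# Hypothetical pairwise trust relations (symmetric, invented data).
trust_pairs = {
    frozenset({"Advertiser", "ServiceProvider"}),
    frozenset({"ServiceProvider", "Consumer"}),
}

def chain_trusted(chain):
    """True if every adjacent pair in the chain trusts each other."""
    return all(frozenset({a, b}) in trust_pairs
               for a, b in zip(chain, chain[1:]))

# Advertiser <-> Service Provider <-> Service Consumer
print(chain_trusted(["Advertiser", "ServiceProvider", "Consumer"]))  # True
print(chain_trusted(["Advertiser", "Consumer"]))  # False: no direct trust
```

The missing link today is the last one: the consumer has no standing trust relationship of their own, only the advertiser-provider one built over their data.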
If you look at successful living systems then this principle is everywhere. Darwin had it right when he talked about survival of the fittest. Those who interpreted him by thinking that the fittest were the fittest in a competitive sense were
wrong. The fittest are those who best fitted into the environment and cooperated with other entities in the environment so that they were able to survive and pass on their genes.
The best solution to passing on genes isn't always cooperation. Both symbiosis and parasitism work in evolutionary time. VRM is an argument for symbiosis (mutual benefit), but the current model is parasitic.
Humans have made the evolutionary step of being able to create environments in which to live and evolve. The environments we create that best survive are those where the forces of cooperation and the interests of the group outweigh the forces
of competition and self-interest. To see how this works, read "SuperCooperators: Altruism, Evolution, and Why We Need Each Other to Succeed" by Nowak and Highfield.
In the world of commerce, knowing who the other party is, so that you can refuse to deal with them, is why VRM enables self-regulation through exclusion. Giving individual entities the ability to stop dealing with rule
breakers enforces compliance. Trying to enforce compliance by force does not work in the interconnected world of commerce; excluding rule breakers from profitable participation does.
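The exclusion mechanism described above can be sketched as code (a toy model; the registry, threshold, and party names are all assumptions, not any existing system):

```python
class ExclusionRegistry:
    """Toy model of self-regulation by exclusion: parties report rule
    breaches, and anyone at or above a threshold is refused service."""

    def __init__(self, threshold=1):
        self.breaches = {}        # party name -> breach count
        self.threshold = threshold

    def report_breach(self, party):
        self.breaches[party] = self.breaches.get(party, 0) + 1

    def will_deal_with(self, party):
        # Exclusion, not force: compliant parties stay in the market.
        return self.breaches.get(party, 0) < self.threshold

registry = ExclusionRegistry(threshold=2)
registry.report_breach("SpamCo")
registry.report_breach("SpamCo")
print(registry.will_deal_with("SpamCo"))        # False: excluded
print(registry.will_deal_with("HonestVendor"))  # True: still in the market
```

No central enforcer is needed in this model; each participant simply consults the shared record before transacting.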
Kevin