

Re: [projectvrm] #trust Measuring Trust


  • From: Kevin Cox < >
  • To: Devon M T Loffreto < >
  • Cc: ProjectVRM list < >
  • Subject: Re: [projectvrm] #trust Measuring Trust
  • Date: Sat, 23 Mar 2013 09:25:24 +1100

Devon,

To invent and develop measures (or any system), the evolutionary approach works best: start with a minimal system that does something useful, try it out, and then, based on feedback from use, change it or throw it away.

Any operational network will have measures of trust built into it. Google's PageRank is a measure of trust; the creditworthiness score used by credit providers is another.

Here is a possible measure of relationships, given for purposes of illustration. For each relationship with another entity there is a one/zero measure as seen by either party, and either party can report that they do not trust the relationship and give the reason. There must be ways of restoring a trust relationship, and there must be ways of removing the relationship entirely.

A measure of relationship trust for any entity is the ratio of entities who trust to those who do not, multiplied by the number who trust. That is: trusted² / (non-trusted + 1).
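That formula can be sketched directly. The function name is mine, and reading the "+1" as a guard against division by zero when no party has reported distrust is my assumption:

```python
def trust_score(trusted: int, non_trusted: int) -> float:
    # The measure described above: the number of entities who trust,
    # multiplied by the ratio of those who trust to those who do not.
    # Assumption: the "+1" avoids division by zero when no entity
    # has reported distrust.
    return trusted ** 2 / (non_trusted + 1)

# An entity trusted by 9 parties and distrusted by 2:
print(trust_score(9, 2))  # 9*9 / (2+1) = 27.0
```

Note that the measure rewards scale: an entity with 9 trusting and 2 distrusting parties scores higher than one with 3 trusting and 0 distrusting, even though the latter has a cleaner record.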

I expect that someone somewhere has made a list of different measures and where they are applied. If not, it would seem an excellent project for an information technology researcher.

Kevin



On Sat, Mar 23, 2013 at 6:20 AM, Devon M T Loffreto < > wrote:
Inline:

On Fri, Mar 22, 2013 at 2:44 PM, Kevin Cox < > wrote:
Devon,

We need measures to help us make judgements on the degree of risk on each new interaction with another party.
Okay. So can we define "measures" yet?
And...
"judgements on the degree of risk" + "on each new interaction with another party"... how are we providing context to our measurement there? Interaction with another party requires an encapsulated view for integrity to exist in the exchange. 
For instance, if I am operating as a parent advocating on behalf of my child in an online interaction with a health care service provider... versus operating as a potential customer online with 3-4 different auto insurance vendors... versus operating online as a vendor selling developmental toys for families of special-needs children... versus building local community relationships online as a contractor doing masonry work... all within a 20-minute period... how do we measure trust, taking into account the strategic value of positioning oneself to provide leverage in any given interaction according to the personal goals that I, as an Individual, set up for these exchanges? I know there cannot be one trust model that fits each of those exchanges, and the fact that I play those various roles should not affect my trust value in any specific transaction. So how is trust measured to provide an accurate view of the degree of risk in a future interaction with another party, given past activities and responses that exist in a context?
 
 The word trust is probably the wrong word. Perhaps a better word would be risk or perhaps we should invent a new word.  
I agree.
 
For the time being let us qualify the meaning by calling it a Measure of Trust, or  trust for short, but understand that it is not the same as human trust. We can then invent measures of this restricted form of trust and find if the measures reflect the risk of our collective interactions with each other.

So I think my above scenario fits here. If we call it a measure of trust... and we apply it to our "collective interactions"... how does the measurement have any integrity against the very real-world consideration of strategic communication? It feels like we are trying to dumb the world down so that it is measurable in terms of trust, which is no longer Human but exists as an infrastructure where online interactions can be examined for risk.

Kevin




On Sat, Mar 23, 2013 at 3:28 AM, Devon M T Loffreto < > wrote:
Hmm... interesting... we have different views on how those words are translated. I can certainly see your point of view... but I do not necessarily agree with how those words render the outcomes you suggest. Obviously, given the context of our involvement here on this list, neither of us is seeking the form of accountability or trust that renders Big Brother surveillance empowerment at the expense of Individual liberty empowerment. Ultimately it is about how balance is achieved and best managed.

So, the supposition that accountability is about giving power to the strong to control the behavior of the weak... and trust is about giving people the power to effectively moderate behavior... this would be the area where implementation drives meaning.

I do not think that accountability has to mean what you assert, or put power in a context of weak versus strong. I am not averse to learning that trust can be a mechanism for producing what you hope.

Either way, the proof will be in the implementation. Skepticism is warranted, and even necessary, in either direction I suppose.

Devon


On Fri, Mar 22, 2013 at 11:19 AM, Joe Andrieu < > wrote:
Devon, your own words belie your argument. If trust can't be programmed in, how can accountability be?

I'm a big fan of accountability when it matters, but without trust, we have no freedoms. A system of perfect accountability is also one of complete control, and zero freedom. Big Brother is the icon of surveillance-based accountability. And we have way too much of that in our world. Applying that same thinking to vendors doesn't seem to help as much as it propagates the culture of patronizing control as a response to fear and mistrust.

We haven't yet established appropriate behavior. We need a consensus on that before we start locking things down. Strict enforcement of the unreasonable is what the callous dictator gets away with. We need to define what's reasonable and then build appropriate consequences to reinforce reasonable behavior.

Some of that will be about accountability, but I certainly hope it can also be an exercise in people and companies learning to behave themselves and not just a control framework for the powerful to make the weak do as they wish.


On Mar 22, 2013, at 7:44 AM, Devon M T Loffreto < > wrote:

Kevin,

That is an interesting model and theory... and I do not use those words with any disrespect intended.
"Trust" and "being trusted" or even further being "trustworthy" are not hard structural concerns. They involve "feelings" that are relative to Individual Humans.

The idea behind the use of "trust" online is interesting... and worthy of being instigated, challenged and demonstrated.
But there is a counter-concept of "zero trust"... Eve Maler gave a great presentation on this concept at an IIW a couple of sessions back.

Accountability is a more meaningful concept in hard structural terms than "trust" is... to me at least... in my experience.
I would prefer to operate in a "zero trust" infrastructure with accountability hard-structured into participating roles. It does not matter whether you are a vendor or a customer when your relationship to the infrastructure is equalized, because accountability can be processed equally. But in order to get there, we need to challenge the assumed model of the status quo, where we create liability models in which *owners* and *employees* are given "liability control" within an exchange system in desperate need of more accurate accountability models than these administrative structures allow... control that is inherently different from that afforded customer roles.

"Trust" to me seems like a hack on the real problem that is not being confronted directly in its solution, and thus does not get us to an operating model that is accurate enough in the production of accountability... which is the real goal.

The market is irrational in so many ways... how can "trust" in an irrational system produce accountability with any significant degree of accuracy that is not dependent on top-down enforceability?

I am interested in keeping trust Human. I believe in trust. What I am not yet convinced of is the notion that it can be mapped to a technical infrastructure. I am hesitant to empower the use of words that create Big Brother-esque doublespeak therein.

Devon Loffreto 


On Thu, Mar 21, 2013 at 6:25 PM, Kevin Cox < > wrote:
John,

On Fri, Mar 22, 2013 at 8:47 AM, John S James < > wrote:
The key word is "trusted." That's a central problem and issue on the Web.



We need to have some way of measuring trust.

When a relationship is established we work on the assumption that the other side can be trusted.  We specify the rules of the relationship.  If those rules are broken by one party or the other then the trust relationship between the parties is broken.  

If, when we go into a relationship, we agree that any breakdown will be publicly reported, along with the parties and the reason for the breakdown, then we can establish a measure of trust for entities by the ratio of trusted relationships to broken relationships.

Something along these lines can evolve if we have equal customer/vendor relationships.

We can achieve it incrementally. When we implement an electronic vendor/customer relationship, we can also specify what happens when the relationship breaks. If we do enough of these and provide ways of reporting the breaks, then a trust measure will emerge.
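A minimal sketch of such a scheme, under my own assumptions (all names are hypothetical, and the score is the trusted-to-broken ratio suggested earlier in this message, with a +1 so an entity with no reported breaks still gets a finite score):

```python
from dataclasses import dataclass, field

@dataclass
class RelationshipLedger:
    # Hypothetical sketch: a relationship is trusted once established,
    # and any breakdown is publicly reported with the parties involved
    # and the reason for the break.
    trusted: set = field(default_factory=set)
    broken: dict = field(default_factory=dict)  # pair -> reported reason

    def establish(self, a: str, b: str) -> None:
        self.trusted.add(frozenset((a, b)))

    def report_break(self, a: str, b: str, reason: str) -> None:
        pair = frozenset((a, b))
        self.trusted.discard(pair)
        self.broken[pair] = reason

    def measure(self, entity: str) -> float:
        # Ratio of an entity's intact to broken relationships
        # (+1 in the denominator avoids division by zero).
        t = sum(1 for pair in self.trusted if entity in pair)
        b = sum(1 for pair in self.broken if entity in pair)
        return t / (b + 1)

ledger = RelationshipLedger()
ledger.establish("vendor", "alice")
ledger.establish("vendor", "bob")
ledger.report_break("vendor", "bob", "undisclosed data sharing")
print(ledger.measure("vendor"))  # 1 trusted / (1 broken + 1) = 0.5
```

The key design point is the one the message above makes: the break-reporting rule is agreed at establishment time, so the measure emerges from the relationships themselves rather than from any central authority.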

Kevin
Archive powered by MHonArc 2.6.19.