We may be driving this toward a multi-layer trust system. At one layer, we actually talk about the implementation details behind “associates” and “secure email address” and open the security architecture to review, pen testing, and white-box scanning. At another layer, we provide the general public a way to verify that the review and testing were performed, drilling down into the details if desired, but in most cases looking for the assurance of a trust seal or reputation ranking.

That said, I’m not entirely sure trust seals are workable in the current implementation. Over at Tresorit, their privacy and cookie policy links to Zendesk’s privacy and cookie policy, which displays the TRUSTe seal. While Tresorit don’t actually claim TRUSTe qualifications, the wording “our cookie page” and the click path could easily mislead site members into thinking Tresorit qualified for TRUSTe status. But that’s a bit of a reach. What really bugs me about TRUSTe and a couple of others is the number of sites I’ve run into that completely fail the OWASP password management and authentication guidelines and still qualify for the seal. Some combination of a central trust seal and a crowdsourced, reputation-based ranking of that seal on a per-site basis might improve the situation.

If this were the case, then your conditions might be expanded to three:

1- A person creates a key-pair and saves the private key in a personal server,
2- The person associates the public key with a secure email address and posts the cert via DNS,
3- The provisioning operates within an ecosystem which provides ongoing assurance of the security architecture.

I’d happily conclude that discussion with an enthusiastic “yes!” Then there would be a rigorous process conducted among specialists to provide the assurances alluded to in the high-level description.

-- T.Rob

From: [mailto:…] On Behalf Of Adrian Gropper

{ T.Rob's answer draws heavily from the thread Patient ID Redux on http://lists.pde.cc/lists/arc/personal-clouds/2013-03/msg00241.html }

I appreciate T.Rob's reviving of this thread and his thoughtful reply. His point about how a CA can add value reminds us of the ongoing tension between centralized and p2p architectures. I'm hoping to do better than conclude this thread with "it depends". VRM is a very diverse endeavor and, even within the subset of VRM as applied to healthcare, there ought to be some principles that can inform our conversation. Although I understand that security professionals prefer to deal with specific cases and clear assumptions, to what extent can we come up with a Digital ID Bill of Rights or otherwise develop a consensus around:
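[Editor's sketch: the second condition in the list above, "associates the public key with a secure email address and posts the cert via DNS", has a concrete precedent in SMIMEA records (RFC 8162), which bind an S/MIME certificate to an email address via DNS. The encoding below is this sketch's assumption of one reasonable mapping, not something specified in the thread; in particular the usage/selector/matching values (3, 0, 1) are illustrative choices.]

```python
import hashlib

def smimea_owner_name(email: str) -> str:
    # RFC 8162 style: hash the local-part with SHA2-256, truncate to
    # 28 octets, hex-encode, and prepend to the _smimecert subdomain
    # of the mail domain.
    local, _, domain = email.partition("@")
    digest = hashlib.sha256(local.encode("utf-8")).digest()[:28]
    return f"{digest.hex()}._smimecert.{domain}"

def smimea_rdata(cert_der: bytes) -> str:
    # Certificate usage 3 (DANE-EE), selector 0 (full certificate),
    # matching type 1 (SHA-256 digest of the certificate).
    return f"3 0 1 {hashlib.sha256(cert_der).hexdigest()}"

# The record a person would publish in their own zone:
print(smimea_owner_name("alice@example.com"))
```

Note that this only addresses provisioning; as T.Rob's earlier reply in this thread argues, the value of such a record still depends on whether the resolution path (DNS vs. DNSSEC) can be trusted.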
Adrian

On Fri, Apr 12, 2013 at 11:15 PM, T.Rob <…> wrote:

Sorry to revive an old thread; I’m deeply queued at the moment.

Question: what exactly is a “secure email address”? In an informal setting such as a list server or some blog posts, I’m as guilty as anyone of using the word “secure” as if it meant anything. On a professional engagement I insist my customers define actual requirements in terms of the security service to be provided (authenticity, integrity, privacy, etc.), the threat to be mitigated (for example, protection against data exfiltration by theft of a laptop versus by remote compromise of a server in a datacenter requires VERY different controls), the audit and forensic capabilities, and so forth.

Second question: what is meant by “associates”? That the key is bound to an X.509 cert where the Distinguished Name is the email address? That the owner authenticates in person to register the key with their email address? Don’t answer these, by the way. I don’t want to derail this thread down that rabbit hole but am just pointing out the framework within which that discussion would proceed.

The answer to the question of whether this 2-step process is adequate depends on the definitions of “associates” and of “secure”. At a very high level it’s possible to answer “yes”, but there’s a plethora of underlying assumptions required to arrive at that answer. Assuming ALL of them are fulfilled, it’s possible; but whether it’s practical requires driving out all the assumptions, identifying threats, controls, and mitigations, then looking for a point where the cost/benefit/risk balance tips to the positive.

One example… is that DNS or DNSSEC? Because there are weaponized tools that can hack enough servers in the DNS network that one must treat it as broken and hostile in the design of a security architecture. One might stipulate that access always originates from within a closed network where DNS servers are known to be patched, but that severely limits participation.
One might provide tools to check the patch status of local DNS servers, but that approach is both brittle and incomplete. One might specify DNSSEC, but there’s very little of that available. Finally, one might stipulate that the system must work despite broken DNS, and that implies requirements for additional mitigating controls somewhere else – such as in the “associates” phase. See where I’m going with this?

The benefits of scale allow the CA to invest heavily in technical and human process controls over the entire key lifecycle. This includes the cost of rigorous process controls and technology that provide for secure management of the key servers and revocation servers. Normally we think of the CA as a signer, but it is also a registry and revocation provider, and these are as important to the system as provisioning. The questions below must assume that *someone* is providing the human process rigor, registry, and revocation services. If it’s a CA we know it isn’t great, but if it’s not a CA we should assume it’s worse and again look at whether additional mitigating controls are required.

Short answer – the question as posed cannot be answered with any degree of confidence. If you started with the question of “what constitutes digital identity” and worked backward, you’d almost certainly end up with the elements mentioned in the question: a GUID, a (hopefully cryptographically strong) binding of the GUID to a real-world entity, and a registry component. However, it doesn’t follow that these will be the only elements of the solution.

-- T.Rob

From: [mailto:…] On Behalf Of Adrian Gropper

Devon,

Would you consider the following 2-step process an example of sovereignty as you see it:

1- A person creates a key-pair and saves the private key in a personal server,
2- The person associates the public key with a secure email address and posts the cert via DNS?

Adrian
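[Editor's sketch: T.Rob's reply above stresses that a CA is not just a signer but also a registry and revocation provider, and that Adrian's 2-step process leaves open who answers "is this key currently valid for this address?". A toy model of that registry/revocation interface is below; the class and method names are illustrative assumptions, not a proposal from the thread.]

```python
from dataclasses import dataclass, field

@dataclass
class KeyRegistry:
    # email address -> fingerprint of the public key currently bound to it
    bindings: dict = field(default_factory=dict)
    # fingerprints that have been explicitly revoked
    revoked: set = field(default_factory=set)

    def register(self, email: str, fingerprint: str) -> None:
        # The "associates" step: bind a key to an address.  A real registry
        # would verify control of both the key and the address first.
        self.bindings[email] = fingerprint

    def revoke(self, fingerprint: str) -> None:
        # Revocation must work without the private key, e.g. after the
        # personal server holding it has been compromised.
        self.revoked.add(fingerprint)

    def is_valid(self, email: str, fingerprint: str) -> bool:
        # A relying party checks both the binding and the revocation status.
        return (self.bindings.get(email) == fingerprint
                and fingerprint not in self.revoked)

reg = KeyRegistry()
reg.register("alice@example.com", "ab12")
print(reg.is_valid("alice@example.com", "ab12"))  # True
reg.revoke("ab12")
print(reg.is_valid("alice@example.com", "ab12"))  # False
```

Even this toy version makes T.Rob's point visible: the `revoke` and `is_valid` operations need an operator with process rigor and availability guarantees, whether that operator is a CA or something else.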
I am interested in those questions, Adrian. I like what Drummond is saying here... specific to the "overall context of ownership and personal boundaries... THEN... relationship context..." But I do not think it is an artificial dichotomy. And I think this is very material to all of our perspectives here in this conversation.

If we map personal sovereignty as a Human birth event, I believe that the dichotomy looks like an XY axis from the point of view of the sovereign Individual. Within the gift of a sovereign Human birth, personal security is a function of our root relationships, which guarantee control of our personal freedoms as Individuals. In literal terms, the very American notion of personal sovereignty is to be guaranteed by a root "Guardian" relationship until an age of accountability, wherein it is the root asset of an Individual's life to control the context of their choices. The whole process is a gift passed from one generation to the next... one that is "endowed by our Creator"... upon each of us equally... in Law.

The problem is that we are currently mapping this process wrong, usurping personal sovereignty and replacing it with a form of administered sovereignty that has each of us recast as social liabilities rather than personal assets by default. It's literally un-American from the perspective of our founding intent... and ironically, it doesn't even work in Britain anymore.

Personally sovereign data structures change the flow of accountability. Society is not formed and maintained in the King's/Federal center; it is originated and empowered at its democratic edges. DNA, currency, root identity, market structure, governing Rights, etc... each of these is provided integrity in the same way. If we break that formula, there is no other. It's the ghost in the machine... 1+1 must = 3... with predictable regularity we have "faith" in.

Devon

On Sun, Mar 31, 2013 at 1:28 PM, Drummond Reed <…> wrote:

On Sun, Mar 31, 2013 at 8:34 AM, Adrian Gropper <…> wrote:
Adrian, that seems like an artificial dichotomy. IMHO the whole idea of personal clouds and personal channels is to establish an overall context of ownership and personal boundaries, and THEN establish that personal data sharing takes place in the context of each relationship (or group of relationships) the individual has.

=Drummond