I thought this went out last night, but it didn't go. Forgive the tardiness relative to the subsequent posts in the thread...
> On Jan 6, 2015, at 4:54 AM, Crosbie Fitch wrote:
>
>> From: Brian Behlendorf
>> Crosbie, I think people here respond poorly to your emails (when they
>> bother to respond at all) for two reasons:
>
> Thanks for bothering to respond - much appreciated. :)
It's not a bother, at least for me. :-)
>> 1) It doesn't come across like you're here for a conversation, or to build
>> something, or help someone else build something.
>
> I'm here, since 2007, in a last ditch attempt to see if there's any point in
> further conversation.
>
> Cluetrain strongly resonated with me.
Cool. Good to hear. (Though I knew that.)
> VRM is a fundamentally sound proposition, and when implemented (if ever)
> will be revolutionary.
That too.
> I'm beginning to realise I don't have time for an apparent majority of
> participants
Who are they? (Feel free to tell me offline, if you'd rather not here.)
> to eventually learn from their mistakes, and my gentle nudges
> to avert the necessity of such learning ain't cutting it.
Not sure your nudges are gentle, but that's a judgement call.
>> You're here to give us a
>> lecture, like a stern professor who is disappointed when the class dares
>> to think about what's possible
>
> No, when the class embark upon the misadventure of pursuing the impossible.
>
> VRM need only facilitate the ability of the individual to communicate and
> trade.
Good summary. We've said from the start that VRM is about independence and engagement. I believe the independence side is in alignment with what you've been professing.
> Unfortunately, optimism has overflowed into a naïve presumption of being
> able to create new powers (unnecessary ones, even if they weren't
> impossible) and a fruitless pursuit thereof.
Can you be specific about what new powers are being proposed?
> And yes, as you observe, the misadventure is 'DRM for personal data'.
>
>> Or just to win an argument rather than share and enlighten.
>
> It is difficult to argue with those with unshakeable faith in the
> impossible, that I can be told Fred's favourite colour and be subsequently
> constrained by Fred (or some system) as to who I can or cannot disclose it
> to, moreover that even those to whom I am authorised to disclose it, are
> similarly constrained.
>
> Computers are great things, but optimism, sloppy thinking, and poor
> education are combining to create a cargo cult of the credulous that
> believe computers can do anything (limited only by one's imagination - as
> opposed to the laws of nature).
>
>
>> Likewise,
>> the rise of "the right to be forgotten" brings along a lot of thorny
>> questions and edge cases which scream for first principles - and the first
>> principle that if I possess information in my personal hands/devices
>> there's nothing you should be able to do to delete it, is very persuasive.
>
> If the understanding of what a right was hadn't been forgotten people would
> readily recognise that there is no such thing as a 'right to be forgotten'.
I agree, and I suspect many others here (we are ~ 600 in all, most of whom lurk) would also agree.
> This is simply a reframing of censorship - insinuated as a right of the
> individual in order to make it popular (everyone likes new powers), but with
> an ulterior motive of establishing Orwell's 'memory hole' censorship.
>
> The legislature cannot create rights.
It can acknowledge them. e.g. the Bill of Rights. Some in that Bill refer to "the people" and others to people generally, meaning individuals. All are arguable, and were produced after much argument; but my point here is in agreement with yours. Better to acknowledge what clearly stands as a natural right rather than legislate toward new ones not clearly rooted in nature. ("The right to be left alone" is among those, I think we would both say.)
> It instead annuls the corresponding natural right from the majority - to memorise and keep notes. The right may be annulled in law, but this still does not affect the individual's natural rights, their memory or liberty to keep notes.
>
> All that ends up happening is that search engines are degraded to return
> only the results the state will tolerate (for purposes of national security
> and the interests of the powerful - corporations).
I don't doubt that "the right to be forgotten" degrades some search results, but I believe the law was not passed at the behest of those interests. Many people feel overexposed and intruded upon, due to exposure of personal data on the Net. I believe the law tries to address that. But I'm not defending that law here; just saying that it's not only about the interests of the powerful. Laws often do protect the powerless. Or at least try.
>> Some better solutions, like strong crypto properly implemented, will solve a problem with mathematical rigor and provable security, so strongly you can bet your life on it.
>
> Unfortunately, in practice this is only available to adept mathematicians.
So far. One company whose employees are on this list (T.Rob is one of them and I'm on its board) is the invention of an adept mathematician and is building an SDK that makes possible the building of strong end-to-end crypto in apps. Far as I know, neither the SDK nor the imagined apps look toward control of what happens outside the bounds of encrypted channels.
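To make concrete what "provable security you can bet your life on" means, here is a minimal sketch of the one scheme with a mathematical proof of perfect secrecy, the one-time pad (Shannon, 1949), in plain Python with only the standard library. It also illustrates the limit Crosbie keeps pointing at: the math protects the channel, not what the recipient does afterward. (This is an illustration of the principle, not a description of the SDK mentioned above.)

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each byte with a truly random, never-reused key of equal
    length -- the one-time pad, which has a proof of perfect secrecy."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return otp_encrypt(ciphertext, key)

message = b"Fred's favourite colour is blue"
key = secrets.token_bytes(len(message))  # fresh random key, used once

ciphertext = otp_encrypt(message, key)
recovered = otp_decrypt(ciphertext, key)
assert recovered == message
# The math guarantees secrecy in transit. Once decrypted, nothing in
# the math constrains what the recipient does with the plaintext.
```

Note that even this provably secure channel says nothing about control of the data beyond its bounds, which is the distinction being argued in this thread.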
>> But many solutions unavoidably will be mixtures of software, policy,
>> and business, where violations are not impossible but are discoverable
>> and correctable, and where the costs/risks will be tolerable for the
>> value created. These are likely worth doing, if the alternative is nothing.
>
> VRM facilities can be produced entirely with software alone.
Agreed. Though I wouldn't take hardware off the table. (DRM'd hardware I would.)
> A sign that things have gone astray is a claim that something else is
> needed, e.g. a law.
Has there been a suggestion here of that?
FWIW, I've always avoided policy discussion, and policy itself. This puts me at odds, to some degree, with my academic colleagues. Broad characterizations:
Businessperson: "The answer is money. What is the question?"
Academic: "The answer is policy. What is the question?"
I am interested, however, in creating ways of expressing personal preferences or terms regarding collection and use of personal data — to which both parties involved can easily agree, preferably in standard, normalized, automated ways. Do you have a problem with that? For example, are you suggesting that a term, or a preference, that says, "don't track me out of your site," agreed to by the other party, violates a natural law?
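For concreteness, here is a hypothetical sketch of what such a standard, normalized, automated expression of a preference might look like. Every field name and value here is invented for illustration; no existing standard is implied. The point is that the other party either agrees or declines, and nothing is technically enforced:

```python
# Hypothetical preference record -- all names invented for illustration.
preference = {
    "party": "individual",
    "term": "dont-track-me-outside-your-site",
    "scope": "first-party-only",
}

def respond(site_accepted_terms: set, preference: dict) -> str:
    """The site either agrees to the expressed term or declines.
    This only records intent on both sides; it constrains nothing."""
    if preference["term"] in site_accepted_terms:
        return "agreed"
    return "declined"

print(respond({"dont-track-me-outside-your-site"}, preference))
```

The design choice worth noticing: agreement is an exchange of assertions, not a mechanism of control, which is exactly the distinction between expressing terms and attempting DRM for personal data.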
> Thus DRM couldn't work, and it still couldn't work even when the DMCA law
> was made to say DRM did work, and that people who realised it didn't work
> could be prosecuted.
>
> VRM should attempt to make it easier for people to communicate, to trade, to
> collect and share data as a consequence of their trades, not to make it
> harder.
Monstrous sums of data are being collected today without the awareness of the parties it's being collected from, outside the context of trade. To you, is it okay or not okay for an individual to say, "I'd like you to agree not to do that"? Or hell, just to express a preference that the other party not do that, and hope that the other party has the good manners to agree? (This is all Do Not Track does, by the way. It's about manners, not constraint.)
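To underline how modest Do Not Track is: the entire mechanism is a single HTTP request header, `DNT: 1`. A sketch of both sides (the server names here are illustrative, but the header itself is the real DNT signal):

```python
# Do Not Track is just one HTTP request header: "DNT: 1".
# It expresses a preference; nothing in the protocol enforces it.
request_headers = {
    "User-Agent": "ExampleBrowser/1.0",  # illustrative value
    "DNT": "1",                          # "please do not track me"
}

def server_sees_dnt(headers: dict) -> bool:
    """A well-mannered server can check for the signal and choose to
    honour it -- or ignore it, with no technical penalty either way."""
    return headers.get("DNT") == "1"

assert server_sees_dnt(request_headers)
```

Manners, not constraint, in one header.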
> Unfortunately, it looks like the latter is the more popular direction.
I don't see it, but maybe I'm missing something.
Doc