It's the same as the business model for education. People have to be educated to shun surveillance capitalism for the sake of democracy and our children's future. People have to be given the opportunity to connect to the internet without censorship or rent-seeking infrastructure providers. When will people start to be embarrassed to be using Facebook?

Adrian

On Sat, Feb 16, 2019 at 3:58 PM Elizabeth Maria wrote:

(In transit, but I have to jump in.)

I think some laws/regulations are aimed at this - it's at the heart of data portability and access rights, as well as data protection by design and default requirements. The problem is how long it takes those laws and their impact to be known/felt.

Also, very few lawyers have an incentive to argue for more individual agency, since they're not representing individuals or common-interest groups; the most potent legal resources are going to the commercial interests that make lawyers the most money. It's beyond frustrating to me. We need more lawyers looking out for the public interest. But even here in DC, the vast majority of knowledgeable lawyers I know (who get it) are working for policy shops/think tanks that take money from big tech. There is so much capture.

It's also circular. I love Katherine's point about rewarding those who are doing the right thing/doing good. When that becomes a good business model, it will attract more legal resources.

Sent from my iPhone

On Feb 16, 2019, at 3:33 PM, Adrian Gropper wrote:

+1 Katherine.

What stands out to me is the mention of information asymmetry in Savage's conclusion. Fixing this asymmetry will require technological agency, and regulations that support or even require technological agency. I have not read the full Savage paper, but I have yet to see any legal or regulatory scholar address what seems obvious to me and to those of us involved in the self-sovereign movement.
Without technological agency, information symmetry is just something that others (public or private) grant us when it suits them. Where is the learned description of the technological commons for individual humans?

It's our job to describe and equip that. In fact, I need to be speaking about it at the Ostrom Workshop next Fall (as will Brett), and it'll be great if we have a lot to talk about.

Doc

Adrian

On Sat, Feb 16, 2019 at 3:21 PM katherine wrote:

When business income goes up, government revenues go up. That's why no one cares about the costs of the means to that end.

Some businesses' incomes go up while they maintain high employee retention, invest in personal development, build bottom-up communication channels, etc. - they improve their human capital. Others with growing incomes have high rates of employee suicide, divorce, and drug and alcohol addiction - their human capital isn't improving; it is burned.

The costs of burning human capital are often borne by society and, ultimately, the government. But governments don't measure these costs or see any difference between the income from a company that burns human capital and the income from one that grows it.

Companies that exploit personal data to make money claim they are doing the same thing as any other business that adds value to commodities. The government doesn't see the difference between turning a cocoa bean into chocolate and turning data into a "target." But there is a big difference. We're just starting to learn the costs, to society and ultimately to the government, of treating humanity like a commodity.

If accounting standards measured human capital growth and depreciation the way financial capital is measured, governments could tax companies to recover the costs of burning human capital. That would disincentivize companies that exploit personal data to make money, along with other risks to human capital.
It would also give the companies that do the right thing a level playing field, so market forces may work again.

Katherine Warman Kern

Christopher Savage is an alpha telecom attorney in DC whose name has at times been floated as a possible candidate for FCC Chairman. I've known Chris since the middle of the last decade, and have admired his creative, tough, good-humored, sensible and engaging approaches to pretty much everything.

A paper he just published through Stanford Law School, Managing the Ambient Trust Commons: The Economics of Online Consumer Information Privacy, footnotes some of my writings and ProjectVRM. It more extensively cites the work of Elinor Ostrom, who was awarded a Nobel Prize in Economics for her groundbreaking work on the commons. Were there no Elinor Ostrom, there might have been no Creative Commons, Customer Commons, or Ostrom Workshop at Indiana University, where at least two of us here (Brett Frischmann and myself) are also involved.

Here's the abstract of Chris' paper:

Privacy interests arise from relationships of trust: people share information with those they trust and conceal things from those they don't. Trust grows when it is respected and diminishes if it is betrayed. Firms in the online ecosystem need consumers to trust them, so the consumers keep coming online, being surveilled, viewing ads, and buying things. But those same entities make money by exploiting consumer trust—using the information they gain to develop individualized profiles that facilitate advertising that gets people to buy things they may not really want or need, at individualized rather than generally available prices. Trust and, thus, privacy, is therefore best viewed as a common-pool resource for the online ecosystem to manage, not as a commodity exchanged in a market between consumers and sellers.
The common-pool resource model explains why online entities have incomprehensible privacy policies, why they accept regulation by the Federal Trade Commission, and why they recognize the seriousness of data breaches even as they reject any obligation to compensate consumers when a breach occurs. This model also clarifies the nature of the ongoing economic and political conflict between consumers and online entities about pervasive surveillance and the use of targeted ads. Market-based models, by contrast, do not fit these realities and, as a result, there is no reason to think that "market forces" will optimally equilibrate consumer and seller interests. Some modest regulatory correctives are therefore advisable.

And here's the Conclusion:

Market forces do not protect consumer privacy interests in the online economic ecosystem. Instead, this ecosystem is best conceived as a commons, in which consumer trust (from which privacy interests arise) is the managed common-pool resource—with online entities, aided by the FTC, acting as the commons managers.

The choice of model matters. In the market model of privacy, whatever terms of engagement emerge between consumers and online entities regarding privacy and surveillance come with at least a weak presumption of optimality—that is, that they reflect a fair balancing of consumer and seller interests. In the commons model, however, there is no reason to think that the interests of consumers (whose level of trust is the common-pool resource) are being optimally balanced against those of sellers. Again, the economic objective of the commons managers is not to protect privacy; it is to surveil consumers, and use the data thus gleaned to make it easier to sell things—many of which, of course, consumers want, but others of which they would do better to do without.
In this situation, there are some genuine and ongoing conflicts between consumers and the online ecosystem in which consumers increasingly spend time and money, conflicts that we cannot expect market forces to fairly equilibrate or optimize. In light of all this, we should consider some modest public education and regulatory efforts, outlined above. These proposals would begin both to address the information asymmetry problems and to empower consumers to enjoy online content, and transact business online, without having to sacrifice undue amounts of their privacy or their money.

The paper is 67 pages and 33,475 words long, so basically it's a book.

I look forward to reading and discussing it. If there is interest here, I can also invite Chris in to participate, or at least see if he's game.

Doc
--Adrian Gropper MD
PROTECT YOUR FUTURE - RESTORE Health Privacy!
HELP us fight for the right to control personal health data.