==Full Title of Reference==
The Economics of Information Security


==Full Citation==


Ross Anderson and Tyler Moore, ''The Economics of Information Security,'' 314 Sci. 610 (2006). [http://people.seas.harvard.edu/~tmoore/science-econ.pdf ''Web''] [http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.89.3331&rep=rep1&type=pdf ''AltWeb'']

[http://cyber.law.harvard.edu/cybersecurity/?title=Special:Bibliography&view=detailed&startkey=Anderson_Moore:2006&f=wikibiblio.bib ''BibTeX'']

==Categorization==
* Issues: [[Economics of Cybersecurity]]; [[Incentives]]; [[Information Sharing/Disclosure]]; [[Insurance]]; [[Metrics]]; [[Privacy]]; [[Risk Management and Investment]]
* Approaches: [[Regulation/Liability]]; [[Technology]]


==Key Words==
 
[[Keyword_Index_and_Glossary_of_Core_Ideas#Computer_Network_Attack | Computer Network Attack]],
[[Keyword_Index_and_Glossary_of_Core_Ideas#Cyber_Security_as_an_Externality | Cyber Security as an Externality]],
[[Keyword_Index_and_Glossary_of_Core_Ideas#Information_Asymmetries | Information Asymmetries]],
[[Keyword_Index_and_Glossary_of_Core_Ideas#Interdependencies | Interdependencies]]


==Synopsis==


The economics of information security has recently become a thriving and fast-moving discipline. As distributed systems are assembled from machines belonging to principals with divergent interests, we find that incentives are becoming as important as technical design in achieving dependability. The new field provides valuable insights not just into "security" topics (such as bugs, spam, phishing, and law enforcement strategy) but into more general areas such as the design of peer-to-peer systems, the optimal balance of effort by programmers and testers, why privacy gets eroded, and the politics of digital rights management.

Over the past six years, people have realized that security failure is caused at least as often by bad incentives as by bad design. Systems are particularly prone to failure when the person guarding them is not the person who suffers when they fail. The growing use of security mechanisms to enable one system user to exert power over another user, rather than simply to exclude people who should not be users at all, introduces many strategic and policy issues. The tools and concepts of game theory and microeconomic theory are becoming just as important as the mathematics of cryptography to the security engineer.

The paper explores where security practice and market incentives overlap, and where they pull apart. An example:
* Platform vendors commonly ignore security in the beginning, as they are building their market position; later, once they have captured a lucrative market, they add excessive security in order to lock their customers in tightly.

The paper maps recurring patterns in system security and reliability onto models from game theory and microeconomics in order to explain why systems fail. Consider this scenario (a short sketch of the three cases follows the example):
* Consider a medieval city. If the main threat is a siege, and each family is responsible for maintaining and guarding one stretch of the wall, then the city’s security will depend on the efforts of the laziest and most cowardly family. If, however, disputes are settled by single combat between champions, then its security depends on the strength and courage of its most valiant knight. But if wars are a matter of attrition, then it is the sum of all the citizens’ efforts that matters. System reliability is no different; it can depend on the sum of individual efforts, the minimum effort anyone makes, or the maximum effort anyone makes. Program correctness can depend on minimum effort (the most careless programmer introducing a vulnerability), whereas software validation and vulnerability testing might depend on the sum of everyone’s efforts.
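
These three cases correspond to simple composition rules over individual effort levels. A minimal illustrative sketch in Python (the effort values are hypothetical, chosen only for demonstration; the paper itself presents no code):

<pre>
# Three ways a system's security can depend on individual efforts,
# following the medieval-city analogy above. Effort values are
# hypothetical, for illustration only.
efforts = [0.9, 0.7, 0.2, 0.8]  # one protection level per defender

weakest_link = min(efforts)  # siege: the laziest family sets the level (0.2)
best_shot    = max(efforts)  # single combat: the most valiant knight (0.9)
total_effort = sum(efforts)  # attrition: every citizen's effort adds up (2.6)

print(weakest_link, best_shot, total_effort)
</pre>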

What are the consequences?
* In the minimum-effort case, the agent with the lowest benefit-cost ratio dominates. As more agents are added, systems become increasingly reliable in the total-effort case but increasingly unreliable in the weakest-link case (the simulation sketch below illustrates this scaling). What are the implications? One is that software companies should hire more software testers and fewer (but more competent) programmers.
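
The scaling claim can be seen in a small simulation: draw each agent's effort at random and watch how the two composition rules diverge as agents are added. A hedged sketch (the uniform-effort assumption is ours, not the paper's):

<pre>
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def average_reliability(n_agents, trials=10000):
    """Mean weakest-link and total-effort reliability for n agents,
    each agent's effort drawn uniformly from [0, 1]."""
    weakest = total = 0.0
    for _ in range(trials):
        efforts = [random.random() for _ in range(n_agents)]
        weakest += min(efforts)   # weakest-link rule
        total   += sum(efforts)   # total-effort rule
    return weakest / trials, total / trials

# As n grows, min(efforts) tends toward 0 (the weakest link erodes),
# while sum(efforts) grows roughly as n/2 (total effort accumulates).
for n in (1, 2, 5, 10, 20):
    w, t = average_reliability(n)
    print(f"n={n:2d}  weakest-link={w:.3f}  total-effort={t:.2f}")
</pre>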


==Additional Notes and Highlights==
 
Expertise Required: Economics - Low
