
[dvd-discuss] Text from CBDTPA = Intel TCPA Paper




(I have snipped sections 2 and 3, leaving in only enough of
Anderson's main thesis to see what he's driving at, and
drawing attention more directly to his analysis regarding
Intel's TCPA and the Hollings Bill.  -- Seth)


> http://www.cl.cam.ac.uk/ftp/users/rja14/toulouse.pdf

Security in Open versus Closed Systems
The Dance of Boltzmann, Coase and Moore 

Ross Anderson 

Cambridge University, England

mailto: rja14@cl.cam.ac.uk 


Abstract. Some members of the open-source and free software
community argue that their code is more secure, because
vulnerabilities are  easier for users to find and fix.
Meanwhile the proprietary vendor community maintains that
access to source code rather makes things easier  for the
attackers. In this paper, I argue that this is the wrong way
to  approach the interaction between security and the
openness of design. I  show first that under quite
reasonable assumptions the security assurance problem scales
in such a way that making it either easier, or harder,  to
find attacks, will help attackers and defenders equally.
This model  may help us focus on and understand those cases
where some asymmetry  is introduced. 

However, there are more pressing security problems for the
open source  community. The interaction between security and
openness is entangled  with attempts to use security
mechanisms for commercial advantage --  to entrench
monopolies, to control copyright, and above all to control 
interoperability. As an example, I will discuss TCPA, a
recent initiative  by Intel and others to build DRM
technology into the PC platform. Although advertised as
providing increased information security for users,  it
appears to have more to do with providing commercial
advantage for  vendors, and may pose an existential threat
to open systems. 


1 Introduction 

It is frequently argued in the open source and free software
community that  making source code available to all is good
for security. A large community of  users and experts can
pore over the code and find vulnerabilities: `to many eyes, 
all bugs are shallow' [1]. In the crypto community in
particular, it has been  standard practice since the
nineteenth century that the opponent knows the  design of
your system, so the only way you can keep him out is by
denying him  knowledge of a temporary variable, the key [2].
On the other hand, opponents  of open source software argue
that `if the software is in the public domain, then 
potential hackers have also had the opportunity to study the
software closely to  determine its vulnerabilities' [3]. So
whom does openness help more, attack or  defence? 

This appears to be the kind of question that is in principle
amenable to  a clear answer, on the basis of accepted models
of software reliability growth,  empirical data about
reported vulnerabilities, or (preferably) both. 

The question is much more general than whether software
source code should  be available to users. A wide range of
systems and components can be either  easier, or more
difficult, to test, inspect and repair depending on the
available tools and access. Object code is partly
accessible for inspection, through debugging and disassembly
tools, but may be difficult to patch. Hardware devices, even
those designed to be tamper-proof, can often be reverse
engineered with  surprisingly little effort -- although the
capital resources needed to fabricate a  compatible clone
might be scarce. I also do not wish to take sides in the
`open  source' versus `free software' debate. So in what
follows I will consider `open systems' versus `closed
systems', which differ in the degree of difficulty in
finding a  security vulnerability. 

< MAJOR SNIP -- SEE PAPER >

4 Real World Problems 

Information security is about money and power; it's about
who gets to read,  write, or run which file. The economics
of information goods and services markets  is dominated by
the establishment and defence of monopolies, the
manipulation  of switching costs, and the control of
interoperability [14]. It should surprise no-one that the
most rapidly growing application area of information
security  mechanisms is in the support of such business
plays. For example, some mobile  phone vendors use
challenge-response authentication to check that the phone
battery is a genuine part rather than a clone -- in which
case, the phone will refuse  to recharge it, and may even
drain it as quickly as possible. The Sony Playstation  2
uses similar authentication to ensure that memory cartridges
were made by Sony rather than by a low-price competitor --
and the authentication chips also  contain the CSS
encryption algorithm for DVD, so that reverse engineers can
be  accused of circumventing a copyright protection
mechanism and hounded under  the Digital Millennium
Copyright Act. 
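
The battery check is straightforward to sketch. The Python
fragment below shows a generic HMAC-based challenge-response
exchange of the kind described; the shared secret and message
formats are illustrative assumptions, since the vendors'
actual protocols are proprietary.

    import hmac, hashlib, secrets

    SHARED_SECRET = b"vendor-provisioned part key"   # hypothetical

    def part_respond(secret, challenge):
        # The battery (or memory card) proves it knows the secret.
        return hmac.new(secret, challenge, hashlib.sha256).digest()

    def host_verify(secret, challenge, response):
        expected = hmac.new(secret, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    challenge = secrets.token_bytes(16)              # fresh each time
    response = part_respond(SHARED_SECRET, challenge)
    assert host_verify(SHARED_SECRET, challenge, response)
    # A clone without the secret cannot produce a valid response,
    # so the phone refuses to recharge it.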

4.1 TCPA 

In this brave new world, it is surprising that so few have
paid attention to the  Trusted Computing Platform Alliance
(TCPA), an initiative led by Intel whose  stated goal is to
embed digital rights management technology in the PC [15]. 

TCPA provides for a monitoring component to be mounted in
future PCs.  The standards document is agnostic about
whether this is in the CPU (the  original Intel idea), the
O/S (the Microsoft idea) or a smartcard chip or dongle 
soldered to the motherboard. For simplicity, I'll assume the
last of these, and  call the chip `Fritz' for brevity, in
honour of Senator Hollings, who is working  tirelessly in
Congress to make TCPA a mandatory part of all consumer
electronics. 

When you boot up your PC, Fritz takes charge. He checks that
the boot ROM  is as expected, executes it, measures the
state of the machine; then checks the  first part of the
operating system, loads and executes it, checks the state of
the  machine; and so on. The trust boundary, of hardware and
software considered  to be known and verified, is steadily
expanded. A table is maintained of the  hardware (audio
card, video card etc) and the software (O/S, drivers, etc);
if  there are significant changes, the machine must be
re-certified.
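
The measurement chain can be sketched roughly as follows in
Python; the hash-extend step mimics the style of trusted-boot
designs, but the register layout, component list and table
format here are assumptions, not the TCPA specification.

    import hashlib

    def extend(register, component):
        # Fold each component's hash into the running measurement.
        return hashlib.sha256(
            register + hashlib.sha256(component).digest()).digest()

    # Illustrative boot chain: ROM, then OS stages, then drivers.
    stages = [b"boot ROM", b"OS loader", b"kernel",
              b"audio driver", b"video driver"]

    register = bytes(32)                 # known starting value
    for stage in stages:
        register = extend(register, stage)

    # 'register' now summarises the booted configuration. If it
    # is not in the table of certified values, the machine must
    # be re-certified before it counts as trusted.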

The result is a PC booted into a known state with an
approved combination  of hardware and software [16]. Once
the machine is in this state, Fritz can prove  it to third
parties: for example, he will do an authentication protocol
with Disney  to prove that his machine is a suitable
recipient of `Snow White'. The Disney  server then sends
encrypted data, with a key that Fritz will use to unseal it.
Fritz  makes the key available only so long as the
environment remains `trustworthy'. 
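
In outline, the key-release step looks something like the
Python sketch below. For brevity it uses an HMAC where a real
attestation protocol would use public-key signatures and
certificates, and the `trusted measurement' value is an
assumption.

    import hashlib, hmac

    TRUSTED = hashlib.sha256(b"approved configuration").digest()

    def quote(attest_key, measurement, nonce):
        # Fritz authenticates the current measurement and the
        # server's fresh nonce (signature replaced by an HMAC here).
        return hmac.new(attest_key, measurement + nonce,
                        hashlib.sha256).digest()

    def release_key(attest_key, measurement, nonce, q, content_key):
        expected = hmac.new(attest_key, measurement + nonce,
                            hashlib.sha256).digest()
        if hmac.compare_digest(expected, q) and measurement == TRUSTED:
            return content_key       # platform deemed trustworthy
        return None                  # untrusted state: no key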

The rules governing which application can use which data can
be quite expressive -- and imposed remotely. Music and video
vendors can sell content only  to those PCs they trust not
to rip it. In government applications, I expect the  idea is
to implement a version of Bell-LaPadula, so that government
information  can `leak' only through a small number of
trusted subjects; if you are not so  trusted, you will be
unable to post a file containing classified information to
a  journalist, as his Fritz will not give him the necessary
key. 
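
The Bell-LaPadula rules mentioned here reduce to two checks,
sketched below in Python with an illustrative set of levels:
`no read up' and `no write down', the latter being what stops
classified data flowing to the journalist's machine.

    LEVELS = {"unclassified": 0, "confidential": 1,
              "secret": 2, "top secret": 3}

    def can_read(subject, obj):
        # No read up: clearance must dominate the object's level.
        return LEVELS[subject] >= LEVELS[obj]

    def can_write(subject, obj):
        # No write down: a subject may not write below its level,
        # so secrets cannot leak to less trusted destinations.
        return LEVELS[subject] <= LEVELS[obj]

    assert can_read("secret", "confidential")
    assert not can_write("secret", "unclassified")   # leak blocked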

4.2 The digital commons and the threat to open systems 

There are potentially serious issues for consumer choice and
for the digital commons [17]. If TCPA-compliant PCs become
the majority, and if TCPA-enabled applications prevail in
the marketplace, then I expect there will be another
segmentation of the platform market similar to that between
Linux and Windows  now. If you boot up your PC in an
untrusted state, it will be much like booting  it with Linux
now: the sound and graphics will not be so good, and there
will  not be nearly so many applications. Worse, while at
present it can be slightly  tiresome for a Linux user to
read Microsoft format documents, under TCPA Microsoft can
prevent this completely: rather than messing around with
file formats  so that existing competitor products have to
be upgraded before they can read  them, Microsoft (or any
other application owner) can cause data to be encrypted 
using TCPA keys whose export they control completely. 

When pressed on TCPA, the explanation from Intel managers is
that they  want the PC to be the central information
appliance in the home, acting as a  hub for the CD player,
the TV, the fridge and everything else. If DRM becomes  an
integral part of entertainment, and entertainment's the main
domestic app, then the PC must do DRM, or the set-top box
might take over from it. (Other  vendors have different
stories.) 

TCPA raises many issues. What are the governance
arrangements of the  Alliance? How will its products and
policies interact with the new copyright regulations due in
EU countries under the recent Copyright Directive? Britain,
for  example, seems reluctant to impose any duties on the
vendors and beneficiaries  of DRM technology, while eager to
smooth their path by removing as many of the existing fair
use exemptions as the EU will permit. The sort of practical
question we are likely to be faced with this year is whether
the blind will still be able  to use their screen scrapers
to read e-books. And what about the transparency of
processing of personal data enshrined in the EU data
protection directive? 

What about the simple sovereignty issue, of who will write
copyright regulations  in Europe in future -- will it be the
European Commission assisted by national  governments, as at
present, or an application developer in Portland or
Redmond?  And will TCPA be used by Microsoft as a means of
killing off competitors such as GNU/Linux and Apache, which,
being maintained by networks of volunteers, cannot simply be
bought out and closed down? 

4.3 Competition policy issues 

Perhaps the most serious issues on the macro scale, though,
have to do with  competition policy and the development of
markets for information goods and  services. If the owner of
an application has total control in future over who can  use
the data users generate with it, and change the rules
remotely, then this  can have many serious effects.
Compatibility between applications can then be controlled
remotely, with a subtlety and strength of mechanism that has
never been available before. So it
creates the power to magic all sorts of  new monopolies into
existence, and to abolish the right to reverse engineer for 
compatibility. The implications of this are broad; for
discussion, see for example  Samuelson and Scotchmer [18]. 

The conventional antitrust approach has been to smile on
standards developed in open fora, while being suspicious of
proprietary standards. This had already shown itself to be
inadequate in the 1930s, when IBM and Remington Rand
maintained a choke-hold on the punched card market by patent
licensing  agreements. Since then a significant literature
has developed [19, 20]. 

The Intel modus operandi appears to take this to new
heights. Gawer and  Cusumano describe how Intel honed a
`platform leadership' strategy, in which  they led a number
of industry efforts -- the PCI bus, USB and so on [21]. 
The  positive view of this strategy was that they grew the
overall market for  PCs; the dark side was that they
prevented any competitor achieving a dominant  position in a
technology that might have threatened Intel's dominance of
the PC  hardware. Thus, Intel could not afford for IBM's
microchannel bus to prevail,  not just as a competing nexus
of the PC platform but also because IBM had  no interest in
providing the bandwidth needed for the PC to compete with
high-end systems. Presumably also Intel did not want one of
the security vendors  to establish leadership in DRM
technology once it seemed plausible that DRM  could become
the key component for home computing: the home platform
should  be a ``PC'' rather than a ``device compliant with
Acme's DRM standard''. 

Intel's modus operandi is to set up a consortium to share
the development of  the technology, have the founder members
of the consortium put some IP into  the pot, publish a
standard, get some momentum behind it, then license it to
the  industry on the condition that licensees in turn
cross-license any interfering IP of their own, at zero
cost, to all consortium members. The effect in strategic
terms  is somewhat similar to the old Roman practice of
demolishing all dwellings and  cutting down all trees within
two bowshots of a road, or half a mile of a castle. 

No competing structure may be allowed near Intel's platform;
it must all be  levelled into a commons. But a nice,
orderly, well-regulated commons: interfaces should be `open
but not free'. 

Maybe one can see the modus operandi used with TCPA (and PCI
bus, and USB) as the polished end result of evolution -- of
a package of business methods  that enable a platform leader
to skirt antitrust law. It also diffuses responsibility,  so
that when TCPA eventually does get rolling there is no
single player who takes  all the media heat -- as Intel
earlier did with the Pentium serial number.

TCPA is not vapourware. The first specification was
published in 2000,
IBM  sells laptops that are claimed to be TCPA compliant,
and some of the features in Windows XP and the X-Box are
TCPA features. For example, if you change  your PC
configuration more than a little, you have to reregister all
your software  with Redmond. 

What will be its large­scale economic effects? 

Suppose you are developing a new speech recognition product.
If you TCPA-enable it, then on suitable platforms you can
cause its output to be TCPA-protected, and you can remotely
decide what applications will be able to read  these files,
and under what conditions. Thus if your application becomes
popular, you can control the complementary products and
either spawn off a series of monopolies for add-ons, or
rent out access to the interfaces, as you wish. 
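
Concretely, that control might look like the Python sketch
below: the vendor attaches a policy to each output file
listing which application identities may obtain the unsealing
key, and can update that policy remotely. The identifiers and
policy fields are hypothetical.

    import hashlib

    # Remotely updatable policy: hashes of the binaries that
    # Fritz may release the file key to (values illustrative).
    policy = {
        "allowed_readers": {
            hashlib.sha256(b"vendor's own viewer v2").hexdigest(),
        },
    }

    def may_unseal(reader_binary, policy):
        reader_id = hashlib.sha256(reader_binary).hexdigest()
        return reader_id in policy["allowed_readers"]

    # A competitor's reader is simply not on the list, so the
    # data stays sealed -- a monopoly on add-ons, enforced in
    # hardware.
    assert not may_unseal(b"competitor's reader", policy)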

The gains to be made by a company that successfully
establishes a strong product are greatly enhanced. I expect
that venture capitalists will insist that their  investments
be TCPA-protected. However, as the few winners win bigger,
the  proportion of winners overall may diminish, and as
interfaces to existing data  become rented rather than free,
the scope for low­budget innovators to create  new
information goods and services rapidly will be severely
circumscribed. 

In other words, TCPA appears likely to change the ecology of
information  goods and services markets so as to favour
incumbents, penalise challengers,  and slow down the pace of
innovation and entrepreneurship. It is also likely to 
squeeze open systems, and may give rise to serious trade
disputes between the  USA and the EU. 


5 Conclusion 

The debate about open versus closed systems started out in
the nineteenth  century when Auguste Kerckhoffs pointed out
the wisdom of assuming that  the enemy knew one's cipher
system, so that security could only reside in the  key. It
has developed into a debate about whether access to the
source code of  a software product is of more help to the
defence, because they can find and fix  bugs more easily, or
to attackers, because they can develop exploits with less 
effort. 

This paper answers that question. In a perfect world, and
for systems large  and complex enough for statistical
methods to apply, the attack and the defence  are helped
equally. Whether systems are open or closed makes no
difference in  the long run. The interesting questions lie
in the circumstances in which this  symmetry can be broken
in practice. 

This leads naturally to a second point. The interaction
between security and  the openness of systems is much more
complex than just a matter of reliability.  The heart of the
matter is functionality -- what should a secure system do?
What  does security mean to a platform vendor? 

Seen in these terms, the answer is obvious. While security
for the user might  mean the repulse of `evil hackers on the
Internet', whoever they might be, security for the vendor
means growing the market and crushing the competition. 

The real future tensions between the open and closed system
communities will  be defined by struggles for power and
control over standards. TCPA adds a new  twist to the
struggle. Even if data standards achieve `XML heaven' of
complete  openness and interoperability, the layer of
control will shift elsewhere. Instead  of being implicit in
deviously engineered file format incompatibilities, it will
be overtly protected by strong cryptography and backed up
by the legal sanctions of the DMCA.

Although TCPA is presented as a means of improving PC
security and helping users protect themselves, it is
anything but. The open systems community  had better start
thinking seriously about its implications, and policymakers 
should too. 

Acknowledgements: Thanks for useful comments to Richard
Clayton, Paul Leach, Hal Varian, Peter Wayner, Fabien
Petitcolas, Brian Behlendorf, Seth  Arnold, Jonathan Smith,
Tim Harris and Andrei Serjantov. Mike Roe commented that
this paper could be split into two, one on reliability
growth and the  second on TCPA. This may be fair comment;
however, I'd accepted an invitation  from the Open Source
Software Economics conference to give an opening talk  on
TCPA, and the previous week I discovered the result
described in the first  part of this paper, which is just
too appropriate not to include.