Second Berkman/St. Gallen Workshop on ICT Interoperability

From Berkman Fellow Urs Gasser...

[Earlier this week], I had the pleasure of co-moderating, with my colleagues and friends Prof. John Palfrey and Colin Maclay, the second Berkman/St. Gallen Workshop on ICT Interoperability and eInnovation. While the first workshop in January, held in Weissbad, Switzerland, gave us wonderful initial input, this time we had the opportunity to present our draft case studies and preliminary findings here in Cambridge. The invited group of 20 experts from various disciplines and industries provided detailed feedback on our drafts, covering important methodological questions as well as substantive issues in areas such as DRM interoperability, digital ID, and web services/mash-ups.

As at the January workshop, the discussion became heated while exploring the possible roles of governments regarding ICT interoperability. Government involvement may take many forms and can be roughly grouped into two categories: ex ante and ex post approaches. Ex post approaches include, for example, interventions based on general competition law (e.g. in cases where a dominant market player refuses to license a core technology) or adjustments of the IP regime (e.g. broadening existing reverse-engineering provisions). Ex ante strategies likewise span a broad range of possible interventions, among them mandating standards (the most intrusive option), requiring the disclosure of interoperability information, labeling/transparency requirements, and using public procurement power, but also fostering frameworks for cooperation among private actors.

There was broad consensus in the room that governmental interventions, especially intrusive ex ante interventions, should be a means of last resort. It was disputed, however, what the relevant scenarios (market failures) that would justify governmental intervention might look like. A complicating factor in the analysis is the rapidly changing technological environment, which makes it hard to predict whether market forces simply need more time to address a particular interoperability problem or whether the market has failed to do so.

In the last session of the workshop, we discussed a draft chart suggesting the steps and issues governments would have to take into consideration when making policy choices about ICT interoperability (under our understanding of public policy, the government could also conclude that it should not intervene and instead let the self-regulatory forces of the market take care of a particular issue). While details remain to be discussed, the majority of the participants seemed to agree that the following elements should be part of the chart:

1. precise description of perceived interoperability problem (as specific as possible);

2. clarifying government’s responsibility regarding the perceived problem;

3. in-depth analysis of the problem (based on empirical data where available);

4. assessing the need for intervention vis-à-vis dynamic market forces (incl. “timing” issue);

5. exploring the full range of approaches available as portrayed, for example, in our case studies and reports (both self-regulatory and regulation-based approaches, including discussion of drawbacks/costs);

6. definition of the policy goal to be achieved (also for benchmarking purposes), e.g. increasing competition, fostering innovation, ensuring security, etc.

Discussion (and research!) to be continued over the weeks and months to come.

For more on Urs's work on Interoperability, Digital Natives, and more, visit his blog.

Interoperability

In early June 2012, Urs Gasser and John Palfrey released Interoperability: The Promise and Perils of Highly Interconnected Systems. The book is inspired by their 2005 study and…