This interview is part of a collaborative effort between the summer 2018 BKC interns and the Communications team to showcase the tremendous work and backgrounds of our 2018-'19 BKC fellows.
I had the pleasure of interviewing Armando Guio-Espanol, a fellow for the Youth and Media team. Armando is a Colombian lawyer, with an LL.M. from Harvard and an M.P.P. from Oxford. Armando is largely concerned with policymaking in the Global South, specifically in understanding how Latin American youth can seize the employment opportunities that technology and the modern digital economy can provide.
Armando and I discussed the nuances of his research proposal as well as the broader themes of his work. It struck me that while Armando’s ideas incorporate broader institutional and systemic themes that are often discussed at the BKC, they are also reflective of a deeply interpersonal understanding of the region, peoples, and areas in which he hopes to inspire change.
While most of Armando’s fellowship proposal centers on gathering empirical evidence about the digital economy, one of the most fascinating parts of the interview was his vision to create a version of the Digital Asia Hub centered on Latin America. The Digital Asia Hub is an independent think tank focused on internet issues and policy that was, until recently, incubated within the Berkman Klein Center. This hypothetical ‘Latin American’ Digital Hub would focus fully on the themes and issues of the region, while also further exploring the policy nuances of each respective country.
The Berkman Klein Center is a place that's innovating a lot about regulation. They have a great approach to developing countries. I know that the Berkman Klein Center is for me the best Center in the world for studying this stuff.
Armando, thank you so much for joining me in this interview, and congratulations on your Fellowship.
I'm a Colombian lawyer. I have always been interested in new technology and innovation and how legal theory is going to affect the way we engage with these technologies. I have been working especially in the area of privacy and data protection. Those were my topics of interest. And I decided to go to Harvard to study my LLM in 2015.
Then I worked for a time providing legal advice on privacy matters. I got very interested in this stuff, especially now with artificial intelligence and robotics: how is this going to affect privacy law and privacy regulation around the world? And then I decided that I wanted to have a perspective from the public policy approach, and understand the public policy concerns and discussions that will happen around new technology. And that's why I decided to go to Oxford to study my MPP.
And then I said "I know the best place to continue with all this work will be the Berkman Klein Center!" It's a Center that I have always admired because of the fellows they have, the professionals they have, their staff. I had the opportunity to meet a lot of the people from the Center when I was an LLM student. I think this is a great place.
And I know that it's a place that's also innovating a lot about regulation. And they have a great approach to developing countries, which is not something that I have seen in many other centers. They're really concerned about the effects of new technologies on developing countries like Colombia and all of Latin America. They have this project Conectados al Sur, which is focused on the Global South. For me it's a great initiative and it's quite innovative because I have not seen something like that in other centers, and I know that the Berkman Klein Center is for me the best Center in the world for studying this stuff. I'm so happy to have been accepted.
I want to turn to your own individual goals at Berkman and your fellowship proposal. Can you tell me more about that?
I want policy that is based on good evidence. I'm concerned about the youngest populations in Latin American countries. We have a huge unemployment rate for these populations: young people finish high school, or some of them get college degrees, and then they have no work.
In Colombia we have 600,000 young people who neither study nor work. So they easily engage in illegal activities and illegal groups. And I think that technology has something to do there, especially because it can provide opportunity. But at the same time it can close opportunities.
And so my focus is to understand how are we helping these people to engage with the new technologies? Are we allowing them to interact with those technologies, or actually because we are so concerned about security, privacy, a lot of things, are we limiting the access of these people to technology?
For example, you may know that a lot of countries have limitations on the use of apps, on how you provide your data, and on access to some webpages. In the GDPR they were discussing having 16 as the age limit for people to use Facebook or WhatsApp. I think that's good if you want to protect these kids. But at the same time you also jeopardize their future and the way they engage with new technologies, and that has huge implications for digital literacy.
And that's why I'm so concerned that we have good evidence in order to protect these populations, especially the young population that wants to engage with technology. We are concerned because there are a lot of threats on the Internet. And my concern is: are we actually doing this based on good evidence to protect them? And is it worth it to protect them? Or is it because we have seen terrible things in the news?
So that's why I work on the impact that innovation, protection, security, privacy can have on access to technology and to improve digital literacy. Because if these younger generations are not prepared to engage with technology then it's a huge risk, especially for developing countries, that have a whole generation that is not engaged with technology in education, in high school, and it will be impossible for us to catch up with developed countries.
If younger generations are not prepared to engage with technology then it's a huge risk, especially for developing countries, that have a whole generation that is not engaged with technology in education, and it will be impossible for us to catch up with developed countries.
You said a lot of these young people in marginalized groups never have the opportunity to engage with technology. But I also think about how technology is not absent in marginalized people's lives. A lot of times there is technology in these people's lives, but it is used in different ways. Technology may be used to track them and to limit access to resources, and the benefits are more distributed to people who live in positions of power and privilege. How do you look at the situation and ask, "What about this works? What do they use that works in that community, and how can we aid that technology?"
We are not allowing this population to engage with technology as much as we can because we're afraid that they will in some way suffer when they experience technology at a young age. So my point is that if you limit access to technology, you can be promoting, without even knowing it, a social control policy. This class cannot engage with technology because they will find a lot of things; they will have access to a lot of the information that we don't want them to have.
Especially in developing countries, I'm concerned that we're going to have social control, and that some groups will remain completely marginalized, as they have been for years, because of historical circumstances but also because of their lack of access to technology.
We are not saying "OK, let's marginalize these people." No, we say "we want to protect their security and privacy because we don't know what they are going to do. We have seen a lot of terrible things going on on the Internet, let's try to protect them." But actually what you're doing is perpetuating that social control. And that's what I would like to understand, and look at the evidence and see actually are we protecting them? Perhaps we are protecting them! And that's something we want! But I want to have enough evidence to say that.
There are a lot of studies in the U.S. and Europe, and with the GDPR, 16 was recommended to countries as the age limit for Internet usage. It was so interesting because they said 16 should be the limit, in order to address our concerns about privacy issues. And yet you have countries in the European Union that decided to say "13 years old, 14 years old, 15 years old." So can you provide evidence that 16 is the best? Every single country is saying something completely different. So there's no evidence.
And I want to discuss that also in developing countries. I think at the Center we can work on that, we can work to help marginalized groups have more access to technologies, to understand the effect and the importance of digital literacy. Because one of my concerns is the fact that technology is being used by governments, by private entities, but not that much by citizens. Actually we're taking a lot of data from citizens. All the data mining activity. But we're not telling people, "hey, you can use this data, you can also collect data." We are not empowering people.
There are people who are concerned about the Global South. For example, if you are collecting data from the Global South on websites located in the U.S. and Europe, what are we going to get from all this? Because you're actually collecting data from developing countries, from the Global South, but we're not gaining anything from that. And that's something that is being discussed in trade agreements now: what are these marginalized groups getting from all the data we're getting from them in the U.S.? Are we using this for good public policy, to improve regulation, to improve life conditions? Or is it actually some means of social control, just to control the society? And that's why all this research is looking at the question of how to empower people, especially with regard to their data. How is this going to change our approach to privacy? And to data protection?
I can sense a tension in what you're talking about. On the one hand, youth can create their own jobs, they can become entrepreneurs, they can become millionaires on the Internet. But at the same time, corporations and the big data they are using are creating this new proletariat, a class of unpaid laborers and unpaid workers being exploited by corporations. Can you delve into that dichotomy a little bit?
Corporations' main concern is profit, and that's what they want to do. I think that's fine, and it's important for the economy and for society as a whole for corporations to profit from their activities, to improve lives. Of course, as with any other entity, they also need to be regulated at some point. And that's what we are discussing now.
There are a lot of externalities and things they do that cause some harm. Not intentionally, but just because of things they are doing. And that's why regulations should try to correct these bad externalities. That's the importance of regulation. And I think corporations are not asking for deregulation at some point, but asking for good regulations. And that's the challenge, to balance all these things. And to say "OK can we innovate, but also look at the social consequences of this kind of innovation?"
For example, there are a lot of people concerned about unemployment and artificial intelligence. How is that going to affect all of us? We should try to balance. Corporations, social groups, marginalized groups: how are we going to interact to improve? And that's the challenge, and we'll have a lot of years or decades of thinking about this before we finally get to something good, but we still need to think about it.
What we need to do is change the mindset of people who say "there's no privacy. This is the end of privacy. I don't care about privacy because I'm not an important individual, so I'm not concerned about that. I'm not afraid of anything. I'm an open book." What we should try to understand is that privacy goes beyond just surveillance. For example, profiling can be quite dangerous, especially the criteria that are used to profile people. How are financial institutions, or governments, going to profile people? All these profiling activities are quite dangerous, and we have to set some limits there. Bias as well: for example, we are discussing algorithms that are essential for artificial intelligence. Personal biases that go into those algorithms, so that they process data with those biases incorporated, can be quite dangerous.
So we have to be very careful about the ethical considerations, what we are doing, how are we trying to design technology. And that's why I think regulations are essential. And I think even corporations would say "yes, we need regulation," because otherwise things will be disastrous for them. So that's why we need to set some control.
I think that we have this romantic idea about innovation being promoted because there was no regulation. I think innovation was promoted in a lot of countries because there was proper regulation that allowed them to innovate. Not because there was no regulation at all. We have to be a little bit careful with all the romantic ideas about science and technology, that, "oh there were no controls, and scientists could experiment in any way they wanted." Actually it doesn't work like that. I think they all will agree that some control is still necessary, especially because of the social impact.
We have to be very careful about ethical considerations of what we are doing, how we are trying to design technology. And that's why I think regulations are essential.
I want to go back to something you said about privacy, because it was almost exactly something my own father would say: "oh it doesn't matter, I'm an open book, I don't care if people are looking." I am South Asian, and there's no word for "privacy" in many of the South Asian languages. This is an interesting distinction between the West and the Global South: communities here (in the US) are made up of individuals, but communities in South Asia, for example, are made up of other smaller communities.
So I kind of want to go back to what you were saying about how we have to educate the public to understand the issues and we have to understand privacy better. How do you grapple with that notion in your work? How is this indicative of broader challenges in policymaking in the Global South?
We have been discussing this idea: is privacy actually a universal human right? Something that every single culture and country is going to adopt and is truly concerned about? When we discuss international trade agreements, we're discussing whether privacy is a Western right or something that is a concern for the whole world. We get into a very complex scenario, but it's an area that we have to study.
Someone who has been studying this, for example, is Amartya Sen. He has been challenging the idea that rights actually come from the West; many Asian countries, like India for example, have their own approach to these rights. But in the end, what he says is that we all have an understanding of freedom, and a concern about freedom, as does any other civilization, country, and culture around the world. We have to understand that we cannot impose the same concerns about privacy that the Europeans have, that the North Americans have, that the Latin Americans have. We all have our own approach.
But in the end I think privacy is something that can be universal. We don't have the same historical perspective or approach to these things. But in the end I think there's something universal about our information being protected, and in some way being under control. One of my main concerns is that in Latin America we have our own approach. And that will be essential for an international discussion, to have some day an international agreement on privacy and digital rights that is comprehensive and includes a lot of countries. But I think that those discussions will start in every single jurisdiction, in every single country at the national level, and then we will take them to the international level.
What do you hope to be your main output at the end of this fellowship?
I would love to work on gathering evidence. As we have been discussing, we have a lot of the theory, but we still need a lot of evidence that will support the theories that we're trying to promote. I'd like to look for data sets, for information about how technology is used in developing countries. We have lots of stories but we actually have to gather some more evidence that will allow policymakers to understand the things that they are doing.
I think that it's important for what comes next, especially for policymakers. We need centers like the Berkman Klein Center, and like the Digital Asia Hub, which has been very important in Asia in a lot of discussions. And that's what I want for Latin America. With the support of the Berkman Klein Center, the Digital Asia Hub designed this hub where they are having discussions about new technologies in Asia, and they're helping policymakers, and organizing events for policymakers to interact with academia, with the private sector, with different institutions that are concerned about regulation of new technologies and the Internet in Asia. That has been quite innovative. And that's something I would like to see in Latin America.

To have collaboration among individuals in all of Latin America, because as in many other regions there's a lot of division among the countries, a lot of differences. There's a lot of nationalism. So the idea is to put an end to that and say, let's try to collaborate as a region. Because we all share these concerns about access to information, privacy rights, freedom on the Internet. I hope that in Colombia and every other country in South America we can have a group of people working together thinking about policy: how to support policymakers, which discussions are relevant now, how we can improve things in the region.
We've been talking about some very intelligent things, but I wondered if you wanted to share anything funny or random about yourself!
I'm Colombian, so I love my country! I love football! We were really excited about our national team and the World Cup. A little bit sad. They played so well. But if anyone wants to discuss football with me, I am a great fan of the European Football League, I really enjoy that. I've been married for two years now, to my wife Juliana, and we are excited because she's going to study her LLM at NYU. So it's a really nice time for us. We love to make friends all around the world. What else can I tell you? I love TV series, I'm addicted to Netflix. I love Game of Thrones. I loved the Office! It will be a really nice time at the Berkman Klein Center; for me it's like going back home. I have always considered Harvard like home. I studied there for a year, my family studied there, my sister studied there. So we're really connected with Harvard. Something that was really funny: I had my Harvard sweater when I was at Oxford and people were saying "why are you wearing that, you're at Oxford!" And I said "it will always be Harvard for me, you'll have to accept that!" Colleges create this sense of belonging, that you're part of the university for the rest of your life, and that's something that I always appreciate. And that's why I'm so excited about being there this year.
This interview has been edited and condensed for clarity.
Tanvi, originally from New Jersey, is a high-school senior at Phillips Academy, a boarding school north of Boston. During the summer of 2018, she worked with the Youth and Media team, where she continued previous work in the field of Global Health policy and its intersection with social media. She also focuses on how technology shapes student activism.