Get To Know 23-24 BKC Fellow: Diana Freed

Dr. Diana Freed is a joint 2023-2024 Fellow at the Berkman Klein Center and a fellow at the Center for Research on Computation and Society at the Harvard John A. Paulson School of Engineering and Applied Sciences. She researches the intersection of technology and society, focusing on security and privacy, human-computer interaction, digital literacy, behavioral health, and technology policy to improve online safety and well-being for vulnerable and marginalized populations. She received her Ph.D. in Information Science from Cornell University and is an incoming (2023-2024) Assistant Professor at Brown University in the Department of Computer Science and the Data Science Institute.

Tell me a bit about the project or the projects that you're hoping to do some work on during your fellowship time. 

I currently work on youth digital safety, and I'm also working on projects related to telehealth and digital health as they pertain to vulnerable and underserved populations. What I'm interested in is continuing to work in these research areas: looking at technology-facilitated abuse in the context of youth, older adults, and intimate partner violence, as well as how this crosses over into areas of telehealth [and] in-person health, and how people navigate these systems, both in the physical and digital worlds, as they receive services [and] get support and care.

You mentioned the ideas of digital health and telehealth. I know ‘telehealth’ has entered the public lexicon a lot more in recent years because of the, at times, forced shift into it during the pandemic. Could you tell me a little bit more about what digital health means to you? 

Digital health, to me, means the way that people use tools and technologies to manage their health and well-being, communicate with health professionals, and receive services. So I think that it allows for independent care as well as care as part of systems, depending upon what tools are being used. 

And [digital health includes] how those tools are being used. We see extensions of care, such as cameras in nursing homes, which are not necessarily digital health, however, if they’re allowing someone to communicate with a loved one in a care facility, do these sorts of tools then become part of a broader category of digital health tools?  

I tend to think of [digital health] from a very open-minded perspective, especially as this area expands outward into areas of immersive health, such as AR and VR, and as those tools are being added into sort of the toolbox of healthcare services. 

On a similar note, then: do you think the reliance on immersive health tools that the pandemic brought about was inevitable, in the sense that they would have come about even without a pandemic?

What [the pandemic] did in terms of my own research is it allowed for advocates and people delivering healthcare to move in this direction perhaps a little bit more quickly than they might have, and as a result of that, it made healthcare a little more accessible, especially [to] some of the populations that I work with. 

And so I think that [the pandemic] provided access and expedited care delivery for many people who don't have immediate access or who have limitations in terms of accessing care. And it's also enabled providers to think about how they manage care delivery. I think some of that remains to be figured out in terms of what remains remote care, what returns to in-person care, and what [healthcare] practice looks like for different populations. 

You’ve done a lot of on-the-ground, public advocacy work. Have you encountered any challenges or tensions in bringing this work together with your research? 

With a lot of the work in terms of technology abuse, you need to learn from what people are experiencing, how they understand the technology, what's going to help them figure it out. And that work is often client centered, so they're driving it. I can develop something that might be very useful, but if it's not right for a population, or people don't want to use it or it's not what they're looking for, then there's a mismatch. So I try to pay close attention to delivering tools and services that can be helpful to both the research community as well as people that I'm working with.  

I would say that the challenge is timelines. That kind of work takes a long time: you're seeking IRB [Institutional Review Board] approval to work with survivors or people dealing with difficult situations, and you're working around people's lives and schedules and things of that nature. So, it just takes a little bit longer than working with a data set, for example.

I look at my work very much in terms of ecosystems. I work with a lot of at-risk groups, you know, [and imagine a member of an at-risk group] as the center of an ecosystem and ask: who else is that person interacting with?  

I try to go into the place that that person is. Not literally, but in the sense of understanding the problem from their perspective, from their ecosystem.  

I really like that word, ecosystems. I've done a little bit of work on values in science, namely concerned with the championing of objectivity in science. I think that when people try to promote objectivity as the sole value in science, they often end up doing this thing where they envision objectivity as you have the scientist looking down upon the world. It seems like the ecosystems approach subverts that very intentionally, and asks, instead of this one-directional researcher’s gaze looking down at the target population, what would it look like to not look down, but to kind of look outward almost? Does that seem right?  

Yes, because for the work that I do, it's always trying to understand what's going on in the ecosystem. So if I'm looking at it from a survivor's perspective, part of what I'm learning about is what is of concern to the [abuse] survivor. If they're trying to receive support from an advocate, there might be an abuser or attacker trying to oppose that effort; they might feel like their location is being tracked. We might be trying to solve that, but it might not be safe to do so. And so you need to understand, as a researcher, how to work in ways that are focused on safety. I work closely to make sure that my work is not only aligning with the needs of the survivor, but doing everything safely, making sure all the checks and balances are there.

Most of my research is done from a multi-stakeholder perspective. If I'm working with children, I need to understand whether they're in social services, whether there are parents or people of concern in their lives, and what their schools are doing. I take into account a lot of different factors to try to figure out how the different stakeholders or communities they're interacting with might impact a tech safety issue.

Interviewer 

Erica Bigelow is a Philosophy Ph.D. student at the University of Washington in Seattle. During the summer of 2023, she interned with the Berkman Klein Center’s communications team and the Cyberlaw Clinic. Her research interests include feminist philosophy, disability, emotions, and social epistemology.