Assembly Project Fellowship Showcase

An inside look at the work of the 2021 Assembly Fellows on Disinformation, Privacy, and Ethical AI

Download original video and audio from this event

Read the transcript

🎉 This January, to celebrate the fifth year of the Assembly program, we invited five ongoing alumni project teams to return (virtually) to the Berkman Klein Center to participate in the Assembly Project Fellowship. These projects address the complex technology and policy problem spaces that Assembly has tackled since its inception in 2017: the future of digital security, the ethics and governance of AI, and most recently, disinformation.

✨ On Monday, May 17th at 4pm ET, at the program’s final showcase, the returning Assembly projects (Clean Insights, Cloak & Pixel, the Data Nutrition Project, AI Blindspot, and Disinfodex) shared provocations, tools, and other outputs they have developed over the past few years and during the 2021 Project Fellowship, in a session moderated by Professor Jonathan Zittrain. Read more about the five projects below.

🧩 Each project is driven by complex questions, represented here as riddles. 💡 Join us for the showcase to learn more and untangle the answers!

  • What takes what you need but nothing more, to give the knowledge you want, without gathering a hoard?
  • What hides you so you can be seen?
  • What’s something you don’t eat that can still be nutritious?
  • We all have it. It looks like a blur. If we don’t spot it, harms will occur.
  • Where do you go to find what you don't want to know?

Logos for the five Assembly 2021 Fellowship Projects: Clean Insights, equalAIs, the Data Nutrition Project, AI Blindspot, and Disinfodex

Clean Insights

What takes what you need but nothing more, to give the knowledge you want, without gathering a hoard?

In 2017, concerned about privacy and security failures, Assembly Fellows collaborated to create Clean Insights: a secure, private measurement platform that is focused on answering key questions, instead of enabling invasive surveillance. Clean Insights is now a project housed at Guardian Project, a grant-funded mobile software collective that develops privacy-enhanced software and services with a focus on human rights and humanitarian needs. 

  • Team: Nathan Freitas
  • Advisor: Professor Margo Seltzer, Berkman Klein Center Director and University of British Columbia 

Cloak & Pixel

What hides you so you can be seen?

In 2018, concerned by the unregulated application of facial recognition technologies, a group of Assembly Fellows collaborated to create EqualAIs, a digital mask for photos that fights pervasive surveillance and helps protect civil liberties. Evolving from that project, a subset of the team developed Cloak & Pixel in 2021, which aims to demonstrate that individuals can and should be able to express consent, or withhold it, beyond the mechanisms offered by the third parties that track their data.

  • Team: Gretchen Greene, Thomas Miano, and Daniel Pedraza
  • Advisor: Professor Jonathan Zittrain, Berkman Klein Center Director

Data Nutrition Project

What’s something you don’t eat that can still be nutritious?

In 2018, drawn together by a shared interest in data governance and in the harms caused by poor training data, Assembly Fellows created the Data Nutrition Project. DNP encourages the responsible development of artificial intelligence by creating tools and practices, including a Dataset “Nutrition Label” that explains what is inside a dataset before it is used to train a machine learning model.

  • Team: Kasia Chmielinski, Josh Joseph, Sarah Newman, Matt Taylor, Kemi Thomas, and Jessica Yurkofsky
  • Advisor: Professor Mary Gray, Microsoft Research and Indiana University

AI Blindspot

We all have it. It looks like a blur. If we don’t spot it, harms will occur.

After seeing the rapid pace at which machine learning systems were being deployed in both the public and private sectors, Assembly Fellows formed the AI Blindspot team in 2019. They aim to dissolve the barriers between those who build AI systems and those who don't — demystifying the ways in which AI systems might harm vulnerable communities and reducing the burden of understanding the impacts of automated decision-making systems.

  • Team: Ania Calderon, Hong Qu, Dan Taber, and Jeff Wen
  • Advisor: Kade Crockford, ACLU of Massachusetts

Disinfodex

Where do you go to find what you don't want to know?

Recognizing the need for more centralized information about disinformation campaigns taken down by major online platforms, a group of Assembly Fellows in 2020 built Disinfodex. Disinfodex is a database of publicly available information about disinformation campaigns, and currently includes disclosures issued by major online platforms and accompanying reports from independent open source investigators.

  • Team: Jenny Fan, Gülsin Harman, Rhona Tarrant, Ashley Tolbert, Neal Ungerleider, and Clement Wolf
  • Advisor: Professor James Mickens, Berkman Klein Center Director
Past Event
May 17, 2021, 4:00 PM - 5:30 PM ET
