Data for Good

Used responsibly, data and technology can create the foundation for more equitable access to opportunity, greater accountability, and a stronger democracy.

We’re working toward an era in which better data drives decision-making and AI systems guard against discrimination and bias.


Protecting information and election integrity

As election-hacking scandals, documented disinformation campaigns, and the exposure of deepfake videos have shown, we face regular digital threats to our social fabric. Cybersecurity for Democracy is a research organization dedicated to uncovering and combating those threats — whether they involve nation-state interference in U.S. voting systems, discriminatory policies for online advertising, or the viral spread of untruths about vaccines.

With confidence in our democratic processes, the integrity of our social systems, and public health at stake, researchers at Cybersecurity for Democracy are using data and technology to safeguard truth and instill trust.


AI for all

Data-driven systems can reinforce social inequity, a problem that is now widely recognized. As companies, organizations, and governments increasingly use advances in data science and AI to automate decisions, biased data can operationalize, entrench, and legitimize new forms of discrimination. These automated decision systems touch myriad aspects of life, from hiring to the criminal justice system to affordable housing. There is an urgent need to build a shared, nuanced understanding of these issues and to chart a path toward solutions that combine technical advances, shifts in business practices, and regulatory mechanisms.

The Center for Responsible AI (R/AI) exists to make AI work for everyone and to ensure that automated decision systems lead to fair and equitable outcomes. Researchers at R/AI catalyze policy and industry impact, foster interdisciplinary research and education, and build an equitable, socially sustainable ecosystem of AI innovation.


Damon McCoy

Assistant Professor of Computer Science and Engineering Damon McCoy focuses on measuring the security and privacy of technology systems and their intersections with society. His Ad Observatory project tracks political spending on Facebook, collecting and organizing data to help journalists and advocates shine a light on the murky world of political advertising. 

Laura Edelson

Ph.D. student Laura Edelson studies online political communication and develops methods to identify misinformation and increase transparency. She is part of the school’s Cybersecurity for Democracy initiative, a research-based effort to expose online threats to our social fabric and recommend countermeasures. Her work has powered reporting in The New York Times, The Wall Street Journal, and The Atlantic.