
Danaë Metaxa

PhD Candidate, Computer Science, Stanford University



Updates

11/2020 - I'll be teaching CS347 (Human-Computer Interaction Research) this coming Spring term, with Parastoo Abtahi.

4/2020 - Our project on finsta accounts won a Best Paper Honorable Mention at CHI 2020!

I’m on the academic job market this year!

I’m a sixth-year Ph.D. candidate in Computer Science at Stanford University in the Human-Computer Interaction Group, co-advised by James Landay (Computer Science) and Jeff Hancock (Communication). In my research, I develop and deploy methods for studying bias and representation in algorithms and algorithmic content, focusing on high-stakes social settings like politics and employment, and on the experiences of marginalized people.

Projects I’ve worked on include gender and race representation in search algorithms (in submission), stereotypes and inclusivity in web interfaces, the role of search media in elections, and social capital during disaster events.

In addition to academic publications, I’ve written for a general audience on topics like political bias in search results in The Guardian, and social media sites and democracy in Wired.

Before my PhD, I graduated from Brown University in 2015 with dual degrees in Computer Science and in Science, Technology, and Society.

P.S. My first name is pronounced like the verb “deny”.


Research Highlights

I have several ongoing threads of research, mainly focusing on bias and representation in algorithmic content, using a combination of computational and behavioral social science methods.

Gender and Race Representation in Image Search Results

Image search results for the query 'author', most depicting people of color.

An Image of Society: Gender and Racial Representation and Impact in Image Search Results for Occupations
Danaë Metaxa, Michelle Gan, Su Goh, James Landay, and Jeff Hancock. In submission (revise and resubmit) to CSCW 2021

Visual diversity has been the subject of studies in domains like psychology and advertising. But unlike advertising, which is crafted with persuasive intent, algorithmic content like search engine results is compiled automatically in response to user queries. Regardless of intent, the impact on users—say, a young person of color looking for information about their desired career and finding a sea of white faces—may still be substantial. Do image search results accurately reflect real-world gender and racial diversity? How does visual diversity influence users?

In this project, currently in submission, we conducted an audit examining the results of Google Image queries for fifty common occupations. We found that women and people of color were underrepresented relative to men and white people, and that the degree of this underrepresentation did not reflect workforce participation. We then conducted a randomized controlled study exposing participants to search results with varying degrees of gender and racial diversity, finding that participants perceived occupations to be more inclusive when search results showed more women or people of color, and that participants’ interest in joining an occupation was greater when more people of color were represented. However, increasing the proportion of women actually decreased participant interest in some cases (perhaps an effect of perceived occupational feminization). We also examined the influence of participants’ own identities on their experience of image search results, finding that marginalized identity mediated participants’ expectation of being valued (e.g., greater representation of women was received positively by women participants but in some cases had a negative impact on men). Designing technology for inclusivity and belonging requires satisfying a complex and sometimes contradictory set of constraints; there is no silver bullet solution to make algorithms “fair” for all.
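
For readers curious what the audit’s core measurement looks like in practice, here is a minimal sketch (not our actual analysis pipeline): it tallies the share of women in hand-annotated image results for each occupation and compares it against a workforce baseline. All data, labels, and numbers in it are hypothetical placeholders.

```python
# Illustrative only: tally perceived-gender representation in annotated image
# search results and compare it to a workforce baseline. All data below are
# hypothetical placeholders, not figures from the study.
from collections import Counter

# Hypothetical annotations: occupation -> one perceived-gender label per image
# in the top results.
annotations = {
    "author":   ["woman", "man", "woman", "man", "man", "woman"],
    "engineer": ["man", "man", "man", "woman", "man", "man"],
}

# Hypothetical workforce baselines: share of each occupation who are women.
workforce_share_women = {"author": 0.56, "engineer": 0.15}

for occupation, labels in annotations.items():
    counts = Counter(labels)
    share_in_results = counts["woman"] / len(labels)
    gap = share_in_results - workforce_share_women[occupation]
    print(f"{occupation}: {share_in_results:.0%} women in results vs. "
          f"{workforce_share_women[occupation]:.0%} in the workforce "
          f"(gap {gap:+.0%})")
```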

Inclusive Web Design

Two versions of an introductory course webpage

Gender-Inclusive Design: Belonging and Bias in Web Interfaces
Danaë Metaxa, Kelly Wang, James Landay, and Jeff Hancock. (ACM CHI 2018)

Psychology theory suggests that people’s ambient environments can cue stereotypes and influence their sense of belonging. Do digital spaces also impact self-perception and choices? To answer this question, we ran a randomized controlled experiment, designing two versions of a computer science course webpage that differed only in aesthetics, not content. College-aged participants were exposed either to a course page with a neutral theme (e.g., images of trees, standard sans serif fonts) or to one designed to evoke stereotypical ideas of computer science (e.g., Star Trek imagery, green text on a black background resembling a computer console). We found that, while men showed little preference for either website, women were negatively impacted by the stereotypical interface—they were less likely to feel they belonged in the course, less optimistic about their future performance, less interested in taking the course, and less interested in studying computer science at all. On the whole, women were 20% less likely to want to enroll in the course, a deterring effect about twice as large as the one observed for men. This work uses gender bias as a case study, supporting literature from psychology and translating it to a digital context: biases in online content can significantly impact users’ psychological sense of belonging, beliefs about themselves, and expected future behaviors.
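
As a rough sketch of the between-subjects design described above (not the code used in the study), the snippet below randomly assigns participants to the neutral or stereotypical page condition and compares a hypothetical self-reported enrollment-interest outcome across conditions and genders; every identifier and number in it is made up for illustration.

```python
# A minimal sketch of a between-subjects setup like this study's (not its
# actual code): participants are randomly assigned to one of two webpage
# conditions, and an outcome such as self-reported interest in enrolling is
# compared across conditions. All names and data are hypothetical.
import random
import statistics

CONDITIONS = ["neutral", "stereotypical"]

def assign_condition(participant_id):
    # Deterministic per-participant random assignment (for illustration only).
    return random.Random(participant_id).choice(CONDITIONS)

# Hypothetical collected data: (participant_id, gender, interest on a 1-7 scale)
responses = [
    ("p01", "woman", 6), ("p02", "woman", 3), ("p03", "man", 5),
    ("p04", "man", 5), ("p05", "woman", 7), ("p06", "woman", 4),
]

# Group outcomes by gender and assigned condition, then compare means.
for gender in ("woman", "man"):
    for condition in CONDITIONS:
        scores = [interest for pid, g, interest in responses
                  if g == gender and assign_condition(pid) == condition]
        if scores:
            print(f"{gender}, {condition}: mean interest {statistics.mean(scores):.1f}")
```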

Algorithm Audits: Past, Present, and Best Practices

Manuscript in preparation in collaboration with Joon Sung Park, Ronald E. Robertson, Karrie Karahalios, Christo Wilson, and Christian Sandvig.

Conducting a rigorous and effective algorithm audit like those I deploy in my work entails legal and ethical challenges as well as technical ones. I am currently leading a collaboration among algorithm audit researchers at Stanford, the University of Illinois at Urbana-Champaign, Northeastern University, and the University of Michigan to produce a journal article explaining the intellectual and scientific contributions of this important and versatile method, along with technical, legal, and ethical guidelines and best practices for conducting successful audit studies.
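
To make the method concrete, here is a bare-bones, hypothetical skeleton of the repeated-query data collection underlying many audits; it is not code from the manuscript, and the `fetch_results` function is a placeholder for whatever terms-of-service-compliant collection approach a given study uses.

```python
# Illustrative skeleton of an audit's repeated-query data collection, not code
# from the manuscript. `fetch_results` is a hypothetical placeholder.
import json
import time
from datetime import datetime, timezone

QUERIES = ["author", "engineer", "nurse"]

def fetch_results(query):
    # Placeholder: a real audit would return the platform's results for
    # `query`, collected via an approved, terms-of-service-compliant method.
    return []

def run_audit_wave(queries, outfile, delay_s=5.0):
    # Record each query's results with a timestamp, throttling between requests.
    with open(outfile, "a") as f:
        for query in queries:
            record = {
                "query": query,
                "collected_at": datetime.now(timezone.utc).isoformat(),
                "results": fetch_results(query),
            }
            f.write(json.dumps(record) + "\n")
            time.sleep(delay_s)

if __name__ == "__main__":
    run_audit_wave(QUERIES, "audit_wave.jsonl")
```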


Other Recent Publications

For a complete list of my academic publications, see my Google Scholar page.

Random, Messy, Funny, Raw: Finstas as Intimate Reconfigurations of Social Media
Best Paper Honorable Mention
Sijia Xiao, Danaë Metaxa, Joon Sung Park, Karrie Karahalios, and Niloufar Salehi (ACM CHI 2020)

Search Media and Elections: A Longitudinal Investigation of Political Search Results in the 2018 U.S. Elections
Danaë Metaxa, Joon Sung Park, James Landay, and Jeff Hancock (ACM CSCW 2019)

Glasnost! Nine ways Facebook can make itself a better forum for free speech and democracy
Timothy Garton Ash, Robert Gorwa, and Danaë Metaxa (Reuters Institute for the Study of Journalism, Oxford)