Gender Bias

The project's main goal is to empower young girls and bridge the gender gap in computer science and cybersecurity. It combines philosophy, storytelling, and immersive technologies to inspire girls, foster a growth mindset and critical thinking, provide role models in STEM fields, engage girls in coding and robotics activities, and strengthen STEM educators' ability to address gender bias in educational choices.

Busting Gender Bias in Computer Science and Cyber Security

Karen Maye

Girl toys vs boy toys: The experiment - BBC Stories

Take the Harvard Implicit Association Test (IAT) 

Since its online introduction in 1998, the Implicit Association Test (IAT) has enabled individuals to uncover hidden biases that may be beyond their conscious awareness—biases that researchers might miss through self-reported data.

The IAT works by having participants categorise onscreen words or images using specific keyboard keys. The response times to different combinations of stimuli are believed to reveal underlying mental associations, even those participants may not be consciously aware of.
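The response-time comparison described above can be sketched in a few lines. The figures and the scoring formula below are simplified illustrations for intuition only; the published IAT uses a more elaborate scoring procedure (the D-score algorithm of Greenwald, Nosek, and Banaji) with many trials per participant.

```python
import statistics

# Hypothetical response times (in seconds) from two sorting blocks.
# "Congruent" pairings are those a participant sorts quickly;
# "incongruent" pairings typically take longer.
congruent = [0.62, 0.58, 0.65, 0.60, 0.63]
incongruent = [0.81, 0.77, 0.85, 0.79, 0.83]

mean_con = statistics.mean(congruent)
mean_inc = statistics.mean(incongruent)

# Divide the latency difference by a pooled standard deviation,
# loosely following the D-score idea of a standardised difference.
pooled_sd = statistics.stdev(congruent + incongruent)
score = (mean_inc - mean_con) / pooled_sd

# A larger positive score suggests a stronger automatic association
# with the congruent pairing; a score near zero suggests little
# measurable difference between the two blocks.
print(round(score, 2))
```

The key idea is only the relative comparison: the test never asks participants what they believe, it measures how much slower they are when the pairing conflicts with their automatic associations.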

The IAT was developed by APS William James Fellow Anthony Greenwald (University of Washington), who collaborated with APS Past President Mahzarin Banaji (Harvard University) and APS Fellow Brian Nosek (University of Virginia) in the mid-1990s. Over the years, this tool has been used to explore unconscious and automatic thought processes in various contexts, including among employers, police officers, jurors, and voters.