Equally
Equally checks text for implicit biases, addressing SDGs 10 and 16 to help create a fairer world free of discrimination.
In recent months, issues of peace and justice have come to the fore around the globe. For centuries, systemic inequalities have affected the lives of billions. However, most discrimination isn't intentional; it's a byproduct of a society in which racism has become ingrained. Many people don't realize the implicit biases they possess, which nonetheless cause harm.
We aim to change this with our solution, Equally, software that uses artificial intelligence to detect implicit biases. We recognize equality will not be achieved in a few days; combating these long-held notions will require patience. But by repeatedly using our product, our customers will gradually eradicate their inherent biases and alter their overall outlook on different groups of people. Through Equally, we hope to begin the long-overdue work of tackling inequality, one word at a time.
In 1998, results from the Implicit Association Test showed that 68% of participants were faster at associating positive words with white faces than with Black faces, indicating that implicit biases operate at the subconscious level. Equally's primary purpose is to eliminate the widespread lack of awareness people have of the implicit biases they possess, and to fill the need for an effective, consistent tool to measure implicit bias in text-based communication. People of every socioeconomic status, gender, and ethnic identity are affected by implicit bias, as deeply ingrained stereotypes and false generalizations persist about people of all backgrounds. The absence of comprehensive education on implicit biases, along with ineffective existing solutions, prevents this problem from being fully addressed. By identifying, continuously monitoring, and correcting people's implicit biases in written form, Equally can mitigate discrimination and minimize negative perceptions of different groups of people.
Our target group is everyone. Everybody has implicit, underlying biases that are difficult to recognize on one's own. The problem is especially pertinent for educators, law enforcement, managers, and people in other public-facing roles, since they have the greatest need for language that is fair and equal. Educators' language, in particular, has a lasting impact on the formative minds they teach. By using Equally, we hope users can begin to understand the implicit biases they hold. We hope that mitigating discrimination in writing can serve as a first step toward mitigating discrimination in other forms, such as speech.
Through the comment section of our media coverage on the University of Pennsylvania Wharton School's Future of the Business World podcast, we heard about the experiences of implicit bias that high schoolers from around the world have had. For example, one girl from Korea mentioned that, "During my sophomore year, the final math exams had been relatively difficult...I had gotten a satisfactory mark, and when my friends asked me, I answered truthfully 'An A+'...a voice called out from behind me, 'well, she's Korean', as if that explained everything. From then, I heard this remark frequently in many of my classes. Classmates would recognize my achievements in school as something that I inherited as an Asian, and never seemed to give a thought about the effort that was put in to achieve what I did." Yet another student stated that "I remember sitting in my ninth-grade history class where students would point at me and say 'You eat dog' and my teacher would project caricatures of Asian men onto the whiteboard to represent traditional Chinese art." These comments from students around the world deepened our understanding of how other high schoolers experience implicit bias, and reinforced for us both its prevalence and the urgency of tackling racial inequities globally.
- Other: Addressing an unmet social, environmental, or economic need not covered in the four dimensions above
By addressing oft-overlooked stereotypes and facilitating education about them, our solution supports a reality in which such biases are nonexistent. It aligns with the United Nations' Sustainable Development Goals 10 (reduced inequalities) and 16 (peace, justice, and strong institutions). Combating these long-held notions will require patience, but through repeated use of our product we hope our customers will become better educated about the implicit biases they inherently possess. Ultimately, we hope this will help users gradually reduce those biases and alter their overall outlook on different groups of people in everyday life as well.
- Concept: An idea being explored for its feasibility to build a product, service, or business model based on that idea
We have developed the Equally concept, identifying the challenge of bias in text, validating the problem through research and conversations with Project Implicit representatives, and taking into account our target market. We've also spoken with NLP professors, a Microsoft researcher, and a Grammarly engineer to shape the Equally concept to be as feasible and impactful as possible. Furthermore, we created our website, Equally.world, as a platform to share the Equally concept and raise awareness about bias in text.
- A new use of an existing technology (e.g. application to a new problem or in a new location)
Only two kinds of solutions currently exist to combat implicit biases: training courses and tests. In both cases, however, implicit biases are assessed only over a short period; neither can check for the biases present in your daily life, which are the most important ones to unlearn.
Natural language processing (NLP) is a subfield of AI concerned with understanding human language. The technology has already been applied to many tasks: checking grammar, measuring the sentiment of product reviews, and detecting cyberbullying and misinformation. However, it has not yet been used to detect implicit biases. Our solution, Equally, would apply this existing technology, NLP, to correct and educate users about biases in their writing and speech whenever they enable our program. With a single click, Equally will help people recognize their subconscious prejudices and ultimately help create a world of peace and justice.
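As a rough illustration of the direction described above, the sketch below shows the simplest possible form of text flagging: matching hand-written patterns and explaining why each match may carry bias. The pattern list and function name here are hypothetical examples for illustration only; the Equally concept as described envisions trained NLP models rather than a fixed rule list.

```python
import re

# Hypothetical, hand-written examples of potentially biased phrasings.
# A real system would learn such signals from data instead of a fixed list.
FLAGGED_PATTERNS = {
    r"\bfor a (girl|woman)\b": "gendered qualifier",
    r"\byou people\b": "othering language",
    r"\bsurprisingly articulate\b": "backhanded compliment",
}

def check_text(text):
    """Return (matched_phrase, explanation) pairs for potentially biased wording."""
    findings = []
    for pattern, explanation in FLAGGED_PATTERNS.items():
        for match in re.finditer(pattern, text, flags=re.IGNORECASE):
            findings.append((match.group(0), explanation))
    return findings

print(check_text("She plays chess well, for a girl."))
```

Even this toy version captures the core user experience the concept aims for: the text is checked as it is written, and each flag comes with an explanation, so the tool educates rather than merely censors.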
- Artificial Intelligence / Machine Learning
- Behavioral Technology
- Software and Mobile Applications
- United States
The software for Equally is currently at the concept phase. However, we have already been able to reach people by creating a website that educates visitors about implicit bias. The website has attracted unique visitors from nine countries: the United States, Canada, China, Nigeria, Germany, Finland, Ireland, India, and Ukraine. We hope the website encourages more people to reflect on how they may have been affected by others' implicit biases in the past, as well as on the implicit biases they may hold themselves. Next year, we hope to reach and educate 1,000+ visitors through the website and to begin developing our idea so we can conduct a pilot study in which 100+ users test our software.
Our impact goals for next year are to educate more people about implicit bias through our website and to develop Equally and conduct a pilot with a prototype. For the first goal, we will add more educational information about implicit bias to the website and promote it on social media platforms such as Instagram and LinkedIn. For the second, we plan to continue researching AI algorithms and reaching out to experts in order to build a prototype people can use. We will then draw on our social media channels, website visitors, and other outreach to recruit users to test the first prototype and give feedback.
We plan to measure our progress against two targets: reaching 1,000+ visitors through our website, and testing a prototype of Equally with 100+ users in our pilot program. We also plan to create Instagram and LinkedIn accounts to promote the website and recruit potential users for the prototype.
Equally faces several development barriers. The data collection required to train its algorithms presents technical and financial challenges, since the data would have to be gathered from scratch to avoid bias in training. The lack of pre-existing algorithms that identify implicit bias in text, especially as such bias is difficult even for humans to identify, poses another barrier to developing a model for Equally. Cultural barriers also exist: we would need to address the possibility that Equally's goal of targeting one's implicit bias may offend those who need to recognize their bias the most. Though we all have implicit bias, many people refuse to acknowledge it, which could limit Equally's impact.
Our team consists of three high-school-aged women of color: Sora, Moniola, and Sualeha. Growing up in America, we've witnessed firsthand the racially charged biases that have exacerbated inequalities for minority groups in this country. The rise in violence against the communities we identify with, the Asian and Black communities, has shown that unless the issue of prejudice is resolved, we, along with our communities, will never be truly safe. Our exposure to everyday implicit bias fuels our passion for Equally as a way to prevent and alleviate racial inequity, and the recent rise in online social activism and media coverage of racial injustice has made us internalize, now more than ever, the urgent need for a deployable solution.
Along with researching the adverse effects of implicit bias in society, our team has prior experience developing creative, deployable solutions through The Knowledge Society, a 10-month social entrepreneurship program for driven teens. Through this program, we've developed vital skills for running a successful team, such as effective project management, open communication, and consistent weekly progress. We have the drive to help solve racial inequity and make a meaningful impact through innovative, implementable ideas, and we've seen how technology can be leveraged to solve challenging issues. This exposure, combined with our ambition to solve the world's most challenging problems, fuels our desire to attain the skills and resources needed to build out Equally.
While some may say that high-schoolers are too young, we believe that with our first-hand experience with racial inequity, urgency to eliminate discrimination as the rising generation of youth, and determination to solve racial inequity through leveraging technology, we are uniquely qualified to tackle this issue.
Equally has received validation from experienced people across multiple fields. An executive at Moody's Foundation, a charitable organization, stated: "I am blown away! I want to be on your test group when you have a prototype. [Equally] is a perfect product to raise funds through crowdfunding. Bravo!" Peace First, an organization that supports youth-led projects, also voiced its support: "Your project sounds very strong and seems like a relevant, timely resource for many different sectors to assess their bias." Finally, we consulted researchers at Project Implicit, an organization founded by scientists at Harvard, the University of Virginia, and Washington University that has used scientific research to inform the public about bias for decades, as well as a Dartmouth College professor specializing in AI.
- No
N/A
- Yes
Equally is led by three high school women of color who are committed to mitigating racial inequity. With first-hand experience of racial discrimination, we saw an urgent need for solutions that address this problem and hope that Equally can be a first step. However, we hope to eventually expand our concept to combat all forms of discrimination, including gender inequity. The Pozen Social Innovation Prize would help us develop our solution to detect not only implicit biases related to race but also those related to gender.