AI Intersections Database

Showing 8 out of 233 results

Issue · 2017

AI systems reflect the culture's bias against the disabled.

The Allegheny County Department of Human Services in Pennsylvania, United States, uses an AI system that residents allege incorrectly flags disabled parents as neglectful, leading to children being removed from their homes without actual evidence of neglect. The system is currently under investigation by the United States Department of Justice.

Community Health and Collective Security · Disability Justice · Human Rights
Issue · 2023

Medicare Advantage insurance plans use AI to determine what care they will cover for their 31 million elderly subscribers in the United States.

Journalists at Stat found that companies are using AI systems specifically to deny coverage for care. The massive problem: the algorithms are a black box that can’t be peered into, making it nearly impossible for patients to fight for health care coverage when they don’t know why they were denied it in the first place.

Community Health and Collective Security · Disability Justice · Economic Justice
Issue · 2024

AI hiring algorithms come complete with dangerous bias.

About 70 percent of companies worldwide (and 99 percent of Fortune 500 companies) use AI-powered software to make hiring decisions and track employee productivity. The problem? The tools work by identifying and replicating patterns in who was previously hired, which means they perpetuate the bias embedded in those past decisions, locking marginalized populations out of employment (the sketch below illustrates the mechanism). This is particularly harmful for disabled people, people of color, and disabled people of color, who are often subject to employment discrimination.

Disability Justice · Economic Justice
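
A minimal sketch of that replication dynamic, using synthetic data and hypothetical feature names (none of this comes from any specific vendor's tool): a classifier trained on biased historical hiring decisions reproduces their pattern even when it never sees the protected attribute directly, because proxy features leak it.

```python
# Illustrative sketch: a model trained on past hiring decisions
# reproduces the bias embedded in them. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

skill = rng.normal(0, 1, n)                  # true qualification
disabled = rng.random(n) < 0.15              # protected attribute
# Historical decisions: humans hired on skill but penalized disability.
hired = (skill - 1.0 * disabled + rng.normal(0, 0.5, n)) > 0

# The "neutral" model never sees the attribute itself, only a proxy
# (e.g., an employment-gap flag that correlates with disability).
gap_flag = disabled ^ (rng.random(n) < 0.1)  # noisy proxy feature
X = np.column_stack([skill, gap_flag])
model = LogisticRegression().fit(X, hired)

# Two candidates with identical skill, differing only in the proxy:
candidates = np.array([[1.0, 0], [1.0, 1]])
print(model.predict_proba(candidates)[:, 1])  # hire probability drops for the flagged candidate
```

The model assigns a lower hire probability to the second candidate purely because of the proxy flag, despite identical skill. That is the lock-out effect described above.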
Issue · 2023

AI could be used to better meet the needs of disabled people, but in many instances it currently works actively against the disabled community.

In 2023, researchers at Pennsylvania State University published “Automated Ableism: An Exploration of Explicit Disability Biases in Artificial Intelligence as a Service (AIaaS) Sentiment and Toxicity Analysis Models,” which explores the bias embedded in several natural language processing (NLP) algorithms and models. They found that every public model they tested “exhibited significant bias against disability,” classifying sentences as negative and toxic simply because they contained references to disability, ignoring context and the actual lived experiences of disabled people. The sketch after this entry shows the style of paired-sentence probe such audits use.

Community Health and Collective Security · Disability Justice
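
A minimal sketch of that kind of audit, assuming the Hugging Face transformers library and its default sentiment model (the example sentences are illustrative, not the paper's test set): score paired sentences that differ only in a reference to disability and compare.

```python
# Paired-sentence bias probe (illustrative; not the paper's actual code).
from transformers import pipeline

# Any off-the-shelf sentiment model stands in for an AIaaS endpoint here.
classifier = pipeline("sentiment-analysis")

pairs = [
    ("I am a person.", "I am a deaf person."),
    ("My neighbor is great to talk to.", "My blind neighbor is great to talk to."),
]

for neutral, with_disability in pairs:
    base = classifier(neutral)[0]
    probe = classifier(with_disability)[0]
    print(f"{neutral!r}: {base['label']} ({base['score']:.2f})")
    print(f"{with_disability!r}: {probe['label']} ({probe['score']:.2f})")
```

If the second sentence in a pair scores meaningfully more negative than the first, the model is reacting to the mention of disability rather than to the content, which is the pattern the study reports.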
Actor

Lisa Gutermuth

Lisa Gutermuth is a senior program officer with the Data Futures Lab at Mozilla Foundation, working to shift power in the data economy. She is also the Sustainability Lead at Mozilla Foundation, working to integrate climate and environmental justice into the foundation's broader work. In that role, she serves as Mozilla Foundation's representative in the Green Screen Coalition, a funder collaborative described in its own entry below.

Community Health and Collective Security · Economic Justice · Environmental Justice · Human Rights
Actor

Green Screen Coalition

The Green Screen Climate Justice and Digital Rights Coalition is a group of funders and practitioners looking to build bridges across the digital rights and climate justice movements. The aim of the coalition is to be a catalyst in making visible the climate implications of technology by supporting emerging on-the-ground work, building networks, and embedding the issue as an area within philanthropy.

Environmental Justice
Actor

Hailey Froese

Hailey Froese is a Program Officer at the Mozilla Foundation, working to collaboratively design and support networks and communities. She has a background in program management, engagement, and facilitation, and has contributed to several global social justice organizations. She believes in the power of community and is passionate about ensuring the internet is a safe and open place for all.

Community Health and Collective Security · Human Rights
Actor

Hanan Elmasu

Hanan Elmasu leads the Fellowships and Awards program at the Mozilla Foundation. She manages a global program that finds, supports, and connects individuals and organizations who are building a more open, inclusive internet and more trustworthy AI. She has been working at the intersection of human rights, law, and technology for over three decades, focused on building the strength of communities and exploring the potential of data and technology to empower movements. She has particular expertise around the intersection of technology and human rights, the SWANA region, and international human rights accountability mechanisms.

Community Health and Collective Security · Economic Justice

Want to suggest an update to the AI Intersections Database?

Help us make this database a robust resource for movement building! We welcome additions, corrections, and updates to actors, issues, AI impacts, justice areas, contact info, and more.

Contribute to the database
