Your Undivided Attention

Spotlight — Coded Bias

Center for Humane Technology

4.8 • 1.5K Ratings

🗓️ 8 April 2021

⏱️ 24 minutes

Summary

The film Coded Bias follows MIT Media Lab researcher Joy Buolamwini through her investigation of algorithmic discrimination, after she accidentally discovers that facial recognition technologies do not detect darker-skinned faces. Joy is joined on screen by experts in the field, researchers, activists, and involuntary victims of algorithmic injustice. Coded Bias premiered at the Sundance Film Festival last year, was released on Netflix on April 5, 2021, and has been called "'An Inconvenient Truth' for Big Tech algorithms" by Fast Company magazine. We talk to director Shalini Kantayya about the impetus for the film and how to tackle the threats these technologies pose to civil rights while working toward more humane technology for all.

Transcript

0:00.0

Welcome to Your Undivided Attention. Today our guest is Shalini Kantayya, and she is the director of the new film

0:07.0

Coded Bias, coming out on Netflix on April 5th. We actually originally saw Shalini's film Coded Bias

0:14.0

at the same Sundance Film Festival where The Social Dilemma premiered, and we're just excited to have her on to talk about her

0:21.0

incredibly important film. Shalini, what is Coded Bias about, and what made you decide to make this film?

0:27.0

Well first of all thanks so much for having me. It's such an honor to be in conversation around these issues.

0:33.0

Coded Bias follows the work of Joy Buolamwini, who's an MIT researcher, and she stumbles upon the fact that

0:42.0

facial recognition doesn't see dark faces or women accurately and stumbles down the rabbit hole of

0:50.0

the ways in which algorithms, machine learning, and AI are increasingly becoming a gatekeeper of opportunity.

0:59.0

Deciding such important things as who gets a job, who gets what quality of health care, what communities

1:05.0

get undue police scrutiny, sometimes even how long a prison sentence someone may serve.

1:11.0

These same systems that we're trusting so implicitly have not been vetted for

1:16.0

racial bias or for gender bias or that they won't discriminate or have unintended consequences.

1:23.0

And these systems are black boxes that we can't question.

1:27.0

Oftentimes we don't even know when we've been denied an opportunity because of this kind of automated gatekeeping.

1:35.0

And that's when I really realized that we could essentially roll back 50 years of civil rights advances

1:43.0

in the name of these machines being neutral when they're not.

1:47.0

Everything that we love in our democracy is being transformed by AI, fair housing, fair employment,

1:54.0

access to information, so many things. And I think the urgency of that is kind of what inspired me to make the film.

2:01.0

And I really believe this is where civil rights gets fought in the 21st century.

2:06.0

You know, in Silicon Valley, there's this fixation on the singularity.

2:11.0

The place where technology gets better than the things that human beings are best at.

2:15.0

And that's the place we should focus all of our attention. What that misses

...
