⭐️ 4 • 645 Ratings
🗓️ 5 January 2023
⏱️ 41 minutes
Facial recognition technology is here. Whether we like it or not, cameras all across the world are scanning faces and building databases.
There’s a popular misconception that technology is objective and unbiased. But that’s not true. All systems carry the biases of the people who created them, and nowhere is that more evident than in facial recognition systems.
Today’s show is about how those biases come to bear, and the dangers of running recklessly forward without considering the consequences. All the way back in 2013, the University of North Carolina Wilmington published a dataset meant for facial recognition systems. It contained more than 1 million images of trans people, pulled from YouTube, showing them at various stages of their transition.
This was done without the permission of the original posters. Why? Because terrorists might take hormones to alter their faces and beat border control systems.
It gets weirder from there.
Here to help us tell the story is Os Keyes. Keyes is a researcher and PhD candidate at the University of Washington’s Department of Human Centered Design & Engineering. They’re also the co-author of Feeling fixes: Mess and emotion in algorithmic audits, which is a scientific audit of the dataset we’re going to be talking about today.
Stories discussed in this episode:
We’re recording CYBER live on Twitch and YouTube. Watch live during the week. Follow us there to get alerts when we go live. We take questions from the audience and yours might just end up on the show.
Subscribe to CYBER on Apple Podcasts or wherever you listen to your podcasts.
Hosted on Acast. See acast.com/privacy for more information.
0:00.0 | Tadden, it's got the code. It's going to launch. |
0:11.4 | It's a UNIX system. I know this. |
0:15.7 | It's all the files of the whole park. It tells you everything. |
0:19.7 | Sir, he's uploading the virus. |
0:22.4 | Eagle one. The package is being delivered. |
0:25.2 | Hello out there on the internet. I am Matthew Gault, and this is Cyber.
0:30.5 | 2023, baby. Let's do it. First stream and show of the year. It's all about facial recognition. |
0:37.1 | Facial recognition technology is here, |
0:39.4 | and whether we like it or not, cameras all across the world are scanning faces and building databases. |
0:44.3 | There's a popular misconception that technology is objective and unbiased. That's not true. All systems |
0:50.4 | carry the biases of the people who created them, and nowhere is that more true |
0:55.0 | than in facial recognition systems.
0:57.3 | And today's show is about how those biases come to bear and the dangers of running recklessly
1:01.8 | forward without considering the consequences. |
1:05.0 | All the way back in 2013, the University of North Carolina, Wilmington, published a |
1:09.2 | dataset meant for facial recognition systems.
1:14.4 | It was more than one million images of trans people pulled from YouTube, showing them at various stages of their transition.
1:20.5 | This was done without permission from the original posters. |
1:23.8 | Well, why? |
1:24.9 | Because terrorists might one day take hormones to alter their faces and beat border control systems.
1:30.8 | Story gets weirder from there. |
1:32.8 | Here to help us tell this story is Os Keyes.
... |