Sean Carroll's Mindscape: Science, Society, Philosophy, Culture, Arts, and Ideas

156 | Catherine D’Ignazio on Data, Objectivity, and Bias


Sean Carroll | Wondery

Society & Culture, Physics, Philosophy, Science, Ideas

4.8 · 4.4K Ratings

🗓️ 19 July 2021

⏱️ 88 minutes


Summary

How can data be biased? Isn’t it supposed to be an objective reflection of the real world? We all know that these are somewhat naive rhetorical questions, since data can easily inherit bias from the people who collect and analyze it, just as an algorithm can make biased suggestions if it’s trained on biased datasets. A better question is, how do biases creep in, and what can we do about them? Catherine D’Ignazio is an MIT professor who has studied how biases creep into our data and algorithms, and even into the expression of values that purport to protect objective analysis. We discuss examples of these processes and how to use data to make things better.

Support Mindscape on Patreon.

Catherine D’Ignazio received a Master of Fine Arts from Maine College of Art and a Master of Science in Media Arts and Sciences from the MIT Media Lab. She is currently an assistant professor of Urban Science and Planning and Director of the Data+Feminism Lab at MIT. She is the co-author, with Lauren F. Klein, of the book Data Feminism.



Transcript


0:00.0

Hello everyone, welcome to the Mindscape podcast. I'm your host Sean Carroll. Everyone knows, I think,

0:06.3

that even though words like data and algorithm carry a certain patina of objectivity with them,

0:14.1

in the real world, it's often the case that neither the collection of data, nor the analysis of data,

0:19.5

nor the use of algorithms are completely objective. They have biases built into them because all

0:25.5

of these facts about the world or ideas about how data should be analyzed are created by human

0:31.2

beings and human beings have their foibles, right? And we see this in action in ways both sort of

0:36.7

profound and trivial. There are algorithms that decide whom to hire and who should be

0:43.8

suspected of committing crimes. Something we'll talk about in this podcast is crash test dummies.

0:50.4

When car companies run crash tests for safety, it used to be that all of the

0:56.4

crash test dummies were modeled after men. None of them were in the shape or size of women,

1:01.9

and as a result, you could actually figure out, ex post facto, that the designs of seat belts and

1:08.0

things like that for cars were noticeably less effective for women than for men. So as objective

1:15.3

as we might try to be, we're going to fall a little bit short. Think of it this way. This is

1:20.1

one of the ways I like to think about it. You're standing somewhere right now, you're in a room

1:24.0

or you're outside or in your car, look around and imagine trying to describe your immediate

1:29.3

environment to somebody else in a completely objective way. You can imagine doing that, maybe you

1:35.2

think you can do that, but the fact is you can't. You can say objectively true things, right? There

1:40.5

are true things to say about the world. You're in a car, it's a Toyota, whatever it is, but you're

1:46.9

making choices along the way. There are an infinite number of things you could say that are objectively

1:53.6

true, but it's you who are always going to be a little bit fallible and have your biases, have

2:00.2

your history and your interests and so forth that choose for you what features of the environment

2:08.0

matter, right? How to divide up the environment into the interesting facts, the uninteresting facts,

...

