What Next | Daily News and Analysis

TBD | When A.I. Denies Your Health Care


Slate

News, Daily News, News Commentary, Politics

4.6 · 2.3K Ratings

🗓️ 31 March 2023

⏱️ 29 minutes


Summary

As Medicare Advantage plans have increased their reliance on software to determine what their customers require—and, therefore, receive—elderly patients are being denied coverage for care they need. What happens when an algorithm — not a doctor — decides how much care you need and it’s not enough?

Guest: Casey Ross, national technology correspondent at STAT

Host: Emily Peck

If you enjoy this show, please consider signing up for Slate Plus. Slate Plus members get benefits like zero ads on any Slate podcast, bonus episodes of shows like Slow Burn and Dear Prudence—and you’ll be supporting the work we do here on What Next TBD. Sign up now at slate.com/whatnextplus to help support our work.

Podcast production by Evan Campbell.

Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript


0:00.0

Why don't you tell me the story of Francis Walter? What happened to her?

0:08.4

Francis Walter had suffered an injury. She had fallen and broken her shoulder.

0:15.1

That's Casey Ross, national technology correspondent at STAT.

0:22.6

She had an acute hospital stay. So she stayed in the hospital for some period of time and

0:29.7

had a surgery to repair her shoulder and then was discharged to a nursing home.

0:36.3

And once she got into the nursing home, an algorithm was run about her care to suggest how long

0:44.9

she ought to be in the nursing home. This algorithm, a hidden aspect of Medicare Advantage Insurance

0:52.1

Plans, looked through Francis's past health records and considered how long other patients typically

0:58.3

need care when recovering from this kind of injury. It determined she only needed 16.6 days to recover.

1:06.0

And on the 17th day, she gets a notice from her insurer, Security Health Plan of Wisconsin,

1:14.8

that she no longer meets Medicare coverage criteria to continue to stay in the nursing home.

1:22.4

She doesn't know anything about this algorithm and neither does her family.

1:27.0

Francis had a number of other health complications. And despite what the algorithm determined,

1:32.8

17 days just was not enough time. At that point, she can't dress herself. She can't

1:39.9

push her walker without assistance. And she can't carry on sort of basic activities of daily living.

1:47.8

And you have to remember that this is a person who's 86 years old. You don't just bounce back

1:53.1

from an injury like that. You need to have some amount of rehabilitation care so that you can

2:01.0

get to the point where you can again live independently. More and more, insurance companies are

2:07.0

relying on algorithms like this to make life and death decisions for patients. What's meant to be

2:12.8

a reference point to estimate the level of care a person might need is increasingly being taken as

2:18.6

fact. You'd expect that an insurance company would be carefully reviewing the details of her care

2:26.0

and making a decision based on that, which is the law, by the way. That's, you know, you're supposed

...

