TED Talks Daily

When AI can fake reality, who can you trust? | Sam Gregory

TED

Creativity, TED Podcast, TED Talks Daily, Business, Design, Inspiration, Society & Culture, Science, Technology, Education, Tech Demo, TED Talks, TED, Entertainment, TEDTalks

4.1 • 11.9K Ratings

πŸ—“οΈ 20 December 2023

⏱️ 13 minutes

Summary

We're fast approaching a world where widespread, hyper-realistic deepfakes lead us to dismiss reality, says technologist and human rights advocate Sam Gregory. What happens to democracy when we can't trust what we see? Learn three key steps to protecting our ability to distinguish human from synthetic, and why fortifying our perception of truth is crucial to our AI-infused future.

Transcript


0:00.0

TED Audio Collective. TED Talks Daily, I'm your host, Elise Hu.

0:12.7

The deceptive images and fake footage created by AI are getting realer and realer.

0:18.8

It makes the truth more difficult to verify.

0:23.0

That's the work of technologist Sam Gregory.

0:26.0

In his 2023 talk at TED Democracy,

0:28.0

he shares how his task force is taking on the challenge

0:31.0

of protecting the truth while detecting the fakes after the break.

0:36.0

It's getting harder, isn't it, to spot real from fake, AI-generated from human-made.

0:43.0

With generative AI along with other advances in deepfakery,

0:48.0

it doesn't take many seconds of your voice,

0:50.0

many images of your face to fake you and the realism keeps increasing.

0:55.0

I first started working on deep fakes in 2017

0:59.0

when the threat to our trust in information was overhyped and the big harm in reality was falsified

1:05.3

sexual images. Now that problem keeps growing, harming women and girls worldwide,

1:10.7

but also with advances in generative AI, we're now also

1:16.0

approaching a world where it's broadly easier to make fake reality, but also

1:21.1

to dismiss reality as possibly faked.

1:25.0

Now, deceptive and malicious audiovisual AI

1:28.4

is not the root of our societal problems.

1:31.1

But it's likely to contribute to them.

1:33.4

Audio clones are proliferating in a range of electoral contexts.

1:37.7

"Is it, isn't it" claims.

...

Disclaimer: The podcast and artwork embedded on this page are from TED, and are the property of its owner and not affiliated with or endorsed by Tapesearch.

Generated transcripts are the property of TED and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.

Copyright © Tapesearch 2025.