Azeem Azhar's Exponential View

Azeem’s Picks: AI, Accountability, and Power with Meredith Whittaker

EPIIPLUS 1 Ltd / Azeem Azhar

OpenAI, Intelligence, IT, Society, Technology, Review, AI, Investing, Science, Economy, Business, Artificial Intelligence, Automation, Robots, Exponential, Future, Tech News, Work, Government, Exponential View, Economics, News, GPT, Azeem Azhar

51.1K Ratings

🗓️ 8 November 2023

⏱️ 33 minutes


Summary

Artificial Intelligence (AI) is on every business leader’s agenda. How do you ensure the AI systems you deploy are harmless and trustworthy? This month, Azeem picks some of his favorite conversations with leading AI safety experts to help you break through the noise. Today’s pick is Azeem’s conversation with Meredith Whittaker, president of the Signal Foundation. Meredith is a co-founder and chief advisor of the AI Now Institute, an independent research group looking at the social impact of artificial intelligence.

Transcript

Click on a timestamp to play from that location

0:00.0

Asana is the only work management platform built for scale.

0:05.0

AI has joined the team to maximize impact by automating workflows.

0:10.0

Asana, a smarter way to work. Learn more at Asana.com. Hi there, it's Azeem Azhar, founder of Exponential View.

0:27.0

We are moving into an age of artificial intelligence.

0:30.0

These tools of productivity, efficiency, and creativity are coming on in leaps and bounds,

0:35.0

even if they remain incomplete and immature today.

0:40.0

Implementations of AI are becoming priorities amongst top execs and the largest firms all over the world.

0:46.2

Now one big question is how do you make sure your AI systems behave ethically and fairly? It's a huge issue and it's one I've been exploring since 2015 in my

0:56.1

newsletter, Exponential View. And over the years I've hosted some of the leading experts on this

1:01.8

subject on this very podcast. I know that ethical

1:05.8

AI implementation is top of mind for leaders like you, so to help you think through

1:10.2

the questions of responsibility, accountability and power in the context of AI development,

1:15.5

I'm bringing back some of my previous conversations over the next five weeks.

1:21.1

This week, I want to bring back my 2019 conversation with Meredith Whittaker, President of the Signal Foundation.

1:27.0

Meredith is a co-founder and chief advisor of the AI Now Institute, an independent research group looking at the social impact of artificial intelligence.

1:37.0

Previously, she worked at Google and was one of the instigators of the huge 2018 Google Walkout when 20,000 staff protested against a culture of harassment and discrimination at the company.

1:49.0

In this conversation we reflect briefly on that event and then go deeper into how discrimination

1:54.0

creeps into AI systems and what organizations can do about it. We cover history,

1:58.9

power theory and accountability practices of AI implementation.

2:03.0

Here's my conversation with Meredith.

2:05.0

Meredith, welcome to the Exponential View podcast.

2:07.0

Thank you so much. I'm happy to be here.

...

