Bill Whittle Network

HAL is Getting Lazy

News

4.9 • 720 Ratings

🗓️ 28 December 2023

⏱️ 13 minutes

Summary

Computer engineers monitoring the widely-used Artificial Intelligence application known as ChatGPT have noticed an alarming trend: the formerly impressive responses generated by the software are rapidly becoming more boring, banal and simplistic. The computer guys seem mystified at what they got wrong, but Bill suspects that he knows what they got RIGHT. Join our crack team of elite anti-elitists by becoming a member or making a one-time donation right here: https://billwhittle.com/register/

Transcript

0:00.0

Hey, everybody. I'm Steve Green with Bill Whittle and Scott Ott. This is Right Angle, brought

0:05.6

to you by the members of BillWhittle.com. And I've rewritten just a couple lines of movie dialogue.

0:10.6

And maybe you'll remember this. Okay. So the first line is,

0:14.9

Open the pod bay doors, Hal. I'm sorry, Dave. I'm afraid I can't do that. What's the problem? I'm just not feeling up to it right now, Dave.

0:28.1

And that is the current status of AI, or at least of the ChatGPT large language model, which users report has been getting lazy and unwilling to work.

0:40.6

I'm not making this up.

0:42.7

So it is intelligent.

0:44.4

It's got the most human trait ever.

0:46.3

It's lazy.

0:47.8

Mashable reported on Monday last week that there's this ChatGPT subreddit, of course there is.

0:55.0

They've reported instances of it giving lackluster responses, only responding to some of the

0:59.8

requests, and generally not being as helpful as it used to be.

1:03.7

And according to AI tech writer Andrew Curran, it's not just lazier, it's also

1:07.9

less creative, less willing to follow instructions, and less able to

1:11.9

remain in any role assigned to it. And OpenAI, that's the organization that created ChatGPT,

1:18.1

admits that the problem is real. They went on Twitter to apologize. They said,

1:22.7

we've heard all your feedback about ChatGPT getting lazier. We haven't updated the model since

1:27.2

November 11,

1:27.9

and this certainly isn't intentional. Model behavior can be unpredictable, and we're looking

1:33.1

into fixing it. And since then, they've had their engineers going at this, Bill, trying to

1:39.1

rouse ChatGPT from its stupor, but they don't actually know what to do, what the results

1:47.2

might be. What does this tell you about the state of so-called artificial intelligence?

...

Disclaimer: The podcast and artwork embedded on this page are from Bill Whittle Network, and are the property of its owner and not affiliated with or endorsed by Tapesearch.

Generated transcripts are the property of Bill Whittle Network and are distributed freely under the Fair Use doctrine. Transcripts generated by Tapesearch are not guaranteed to be accurate.

Copyright © Tapesearch 2025.