Machine Learning Guide

MLG 023 Deep NLP 2

OCDevel

🗓️ 20 August 2017

⏱️ 43 minutes

Summary

Support my new podcast: Lefnire's Life Hacks

RNN review, bi-directional RNNs, LSTM & GRU cells. ocdevel.com/mlg/23 for notes and resources

Transcript

0:00.0

Welcome back to Machine Learning Guide. I'm your host, Tyler Renelle.

0:05.0

MLG teaches the fundamentals of machine learning and artificial intelligence.

0:09.0

It covers intuition, models, math, languages, frameworks, and more.

0:13.0

Where your other machine learning resources provide the trees, I provide the forest.

0:18.0

Visual is the best primary learning modality, but audio is a great supplement during exercise, commute, and chores.

0:25.6

Consider MLG your syllabus, with highly curated resources for each episode's details at ocdevel.com/mlg.

0:35.6

I'm also starting a new podcast which could use your support.

0:39.6

It's called Lefnire's Life Hacks and teaches productivity-focused tips and tricks,

0:44.0

some of which could prove beneficial in your machine learning education journey.

0:48.7

Find that at ocdevel.com/llh.

0:53.8

This is episode 23, Deep NLP Part 2.

0:57.5

In this episode, I will finally wrap up on NLP.

1:00.6

This is the second episode of Deep NLP, RNNs.

1:04.1

And we're going to cover a few fine points about RNNs.

1:07.7

We'll talk about bidirectional RNNs, the vanishing and exploding gradient problem

1:13.1

of backpropagation and its solution through LSTMs or GRUs. But let's start with a review of

1:20.9

RNNs because I think that last episode went a little bit fast and so I want to make sure that you

1:26.5

understand RNNs completely

1:28.2

before we move on. So in deep learning, we use neural networks, and there are various flavors of

1:34.6

neural networks for different tasks. The vanilla neural network, also called a multi-layer perceptron

1:40.8

or a feed-forward network, or usually just a neural network,

1:44.9

is used for sort of general tasks, general classification or regression problems.
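
To make that distinction concrete, here is a minimal sketch, assuming TensorFlow/Keras, of a vanilla feed-forward network like the one just described, next to a bidirectional LSTM of the kind this episode goes on to cover. The feature, sequence, and class sizes are arbitrary placeholders, not values from the episode.

from tensorflow.keras import layers, models

# Vanilla neural network (multi-layer perceptron / feed-forward network):
# a fixed-size input vector in, class probabilities out.
mlp = models.Sequential([
    layers.Input(shape=(20,)),              # 20 input features (placeholder size)
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(3, activation="softmax"),  # 3 output classes (placeholder size)
])
mlp.compile(optimizer="adam",
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])

# Recurrent counterpart for sequence input: a bidirectional LSTM reads the
# sequence both forward and backward. Swap layers.LSTM for layers.GRU to
# compare the two gating cells discussed in this episode.
rnn = models.Sequential([
    layers.Input(shape=(None, 50)),          # variable-length sequences of 50-dim vectors
    layers.Bidirectional(layers.LSTM(32)),   # gated cells help with vanishing gradients
    layers.Dense(3, activation="softmax"),
])
rnn.compile(optimizer="adam",
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])

The feed-forward model maps one fixed-size feature vector to a prediction; the recurrent model consumes a sequence of vectors step by step, which is the setting the rest of this episode builds on.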

...
