Machine Learning Guide

MLG 012 Shallow Algos 1

Machine Learning Guide

OCDevel

Artificial, Introduction, Learning, Courses, Technology, Ml, Intelligence, Ai, Machine, Education

4.9848 Ratings

🗓️ 19 March 2017

⏱️ 53 minutes

Summary

Support my new podcast: Lefnire's Life Hacks

Speed-run of some shallow algorithms: K Nearest Neighbors (KNN); K-means; Apriori; PCA; Decision Trees

ocdevel.com/mlg/12 for notes and resources
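
The episode itself is a speed-run, but as a quick reference, here is a minimal sketch (mine, not from the episode) of how the named algorithms are typically exposed in scikit-learn. Apriori is not in scikit-learn; libraries such as mlxtend provide it.

```python
# Sketch only: the shallow algorithms named in this episode, as exposed in scikit-learn.
# Apriori is not in scikit-learn; mlxtend's frequent_patterns module is one option.
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# K Nearest Neighbors: classify a point by majority vote of its k closest neighbors
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)

# K-means: unsupervised clustering into k groups
kmeans = KMeans(n_clusters=3, n_init=10).fit(X)

# PCA: project the features onto the directions of greatest variance
X_2d = PCA(n_components=2).fit_transform(X)

# Decision tree: learn a hierarchy of if/else splits on the features
tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

print(knn.score(X, y), tree.score(X, y), X_2d.shape, kmeans.cluster_centers_.shape)
```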

Transcript

0:00.0

Welcome back to Machine Learning Guide. I'm your host, Tyler Renelle.

0:05.0

MLG teaches the fundamentals of machine learning and artificial intelligence.

0:09.0

It covers intuition, models, math, languages, frameworks, and more.

0:13.0

Where your other machine learning resources provide the trees, I provide the forest.

0:18.0

Visual is the best primary learning modality, but audio is a great supplement during exercise, commute, and chores.

0:25.6

Consider MLG your syllabus with highly curated resources for each episode's details at OCdevel.com forward slash MLG.

0:35.6

I'm also starting a new podcast which could use your support. It's called

0:39.9

Lefnire's Life Hacks and teaches productivity-focused tips and tricks, some of which could prove

0:45.5

beneficial in your machine learning education journey. Find that at OCdevel.com forward slash

0:51.9

LLH. This is episode 12, Shallow Learning Algorithms, Part 1.

0:57.9

In this episode, I'm going to discuss various shallow learning algorithms. Shallow learning:

1:03.4

So remember, in a previous episode on deep learning, we talked about using neural networks, the quintessential deep learning concept,

1:12.7

as sort of a silver bullet approach.

1:15.4

You can use neural networks for classification, for regression.

1:19.3

You can use them for linear situations where the features don't need to combine,

1:23.2

and you can use them for non-linear situations where features may need to combine in some special way.

1:28.7

Maybe the system will learn x₃² + x₂·x₁.

1:33.5

So that's feature learning.
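
As a rough illustration of that feature-learning point (my own sketch, not code from the episode, assuming scikit-learn): a plain linear model fit on the raw features x1, x2, x3 cannot capture a target like x₃² + x₂·x₁, while a small neural network can learn the combination itself.

```python
# Sketch: linear regression vs. a small neural net on a target that requires
# combining features (x3 squared plus x2 times x1). Data is synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2000, 3))   # columns: x1, x2, x3
y = X[:, 2] ** 2 + X[:, 1] * X[:, 0]     # target = x3^2 + x2*x1

linear = LinearRegression().fit(X, y)
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                   random_state=0).fit(X, y)

print("linear R^2:", round(linear.score(X, y), 3))  # near 0: misses the interaction
print("mlp R^2:   ", round(mlp.score(X, y), 3))     # much closer to 1: learns it
```

The exact scores depend on the network size and training settings; the contrast between the two is the point.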

1:34.9

And the other piece of deep learning is the hierarchical representation of the data it's learning.

1:39.7

So breaking down a face into its subparts: eyes become lines and angles, and those become pixels.
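
As a loose illustration of that hierarchy (my analogy, not the episode's example), the stacked layers of a deep network can be read as levels of abstraction. The layer sizes and the pixels-to-face mapping below are illustrative placeholders, not a claim about what each layer actually learns.

```python
# Sketch: each hidden layer of a deep network can loosely be read as a level
# of abstraction, e.g. pixels -> edges/angles -> parts (eyes, nose) -> face.
from sklearn.neural_network import MLPClassifier

face_detector = MLPClassifier(
    hidden_layer_sizes=(
        256,  # layer 1: low-level features (edges, angles) from raw pixels
        64,   # layer 2: mid-level parts (eyes, nose, mouth)
        16,   # layer 3: high-level concept (face vs. not-face)
    ),
    max_iter=500,
)
# face_detector.fit(pixel_matrix, labels)  # pixel_matrix and labels are placeholders
```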

1:45.8

But I also said that while we may treat deep learning as sort of a silver bullet, it isn't necessarily

1:50.7

so. And in fact, if you talk to a machine learning expert in the field, they will very much

...
