MLG 032 Cartesian Similarity Metrics

Machine Learning Guide

OCDevel

Artificial, Introduction, Learning, Courses, Technology, Ml, Intelligence, Ai, Machine, Education

4.9 • 848 Ratings

🗓️ 8 November 2020

⏱️ 42 minutes


Summary

Support my new podcast: Lefnire's Life Hacks

Show notes at ocdevel.com/mlg/32.

L1/L2 norm, Manhattan, Euclidean, cosine distances, dot product


Normed distances link

  • A norm is a function that assigns a non-negative length to each vector in a vector space (zero only for the zero vector). link
  • Minkowski is the generalization: (sum(abs(xi - yi)^p))^(1/p). Setting p = 1, 2, ... gives the norms below (see the sketch after this list).
  • L1: Manhattan/city-block/taxicab. abs(x2-x1)+abs(y2-y1). Grid-like distance (triangle legs). Preferred for high-dim space.
  • L2: Euclidean. sqrt((x2-x1)^2 + (y2-y1)^2), i.e. the square root of the dot product of the difference vector with itself. Straight-line distance; the shortest distance (Pythagorean hypotenuse).
  • Others: Mahalanobis, Chebyshev (p=inf), etc
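
A minimal sketch of the formulas above, assuming NumPy and made-up example vectors; Minkowski with p=1 recovers Manhattan and p=2 recovers Euclidean:

```python
import numpy as np

def minkowski(x, y, p):
    """Generalized Minkowski distance: (sum |x_i - y_i|^p)^(1/p)."""
    return np.sum(np.abs(x - y) ** p) ** (1 / p)

# Hypothetical example vectors
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 0.0, 3.0])

manhattan = minkowski(x, y, p=1)   # L1: summed absolute differences -> 3 + 2 + 0 = 5
euclidean = minkowski(x, y, p=2)   # L2: straight-line distance -> sqrt(9 + 4 + 0) ≈ 3.61

# L2 is the square root of the dot product of the difference vector with itself
assert np.isclose(euclidean, np.sqrt(np.dot(x - y, x - y)))
```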

Dot product

  • A type of inner product: specifically, the inner product on a finite-dimensional Euclidean space. link
  • Outer product: the result lies outside the involved planes/axes (a matrix of all pairwise products). Inner product: the result lies inside the planes/axes involved (a single scalar). link See the sketch below.
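
A short sketch of the inner (dot) product versus the outer product, assuming NumPy and hypothetical vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])  # hypothetical vectors
b = np.array([4.0, 5.0, 6.0])

# Inner (dot) product: element-wise products summed into a single scalar
dot = np.sum(a * b)                # 1*4 + 2*5 + 3*6 = 32
assert np.isclose(dot, np.dot(a, b))

# Outer product: all pairwise products, a 3x3 matrix rather than a scalar
outer = np.outer(a, b)
print(dot, outer.shape)            # 32.0 (3, 3)
```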

Cosine (normalized dot)
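
Cosine similarity is the dot product after normalizing each vector to unit length, so it measures the angle between the vectors rather than their magnitudes; cosine distance is then 1 minus the similarity. A minimal sketch, assuming NumPy and hypothetical vectors:

```python
import numpy as np

def cosine_similarity(a, b):
    """cos(theta) = (a . b) / (||a|| * ||b||): 1 = same direction, 0 = orthogonal, -1 = opposite."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 4.0, 6.0])       # same direction as a, twice the magnitude

print(cosine_similarity(a, b))      # ~1.0: directions match even though lengths differ
print(1 - cosine_similarity(a, b))  # cosine distance ~0.0
```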

Transcript


0:00.0

Welcome back to Machine Learning Guide. I'm your host, Tyler Renelle.

0:05.0

MLG teaches the fundamentals of machine learning and artificial intelligence.

0:09.0

It covers intuition, models, math, languages, frameworks, and more.

0:13.0

Where your other machine learning resources provide the trees, I provide the forest.

0:18.0

Visual is the best primary learning modality, but audio is a great supplement during exercise, commute, and chores.

0:25.6

Consider MLG your syllabus with highly curated resources for each episode's details at OCdevel.com forward slash MLG.

0:35.6

I'm also starting a new podcast which could use your support. It's called

0:39.9

Lefnire's Life Hacks and teaches productivity-focused tips and tricks, some of which could prove

0:45.5

beneficial in your machine learning education journey. Find that at Ocdevel.com forward slash

0:51.9

LLH. Today we're going to be talking about Cartesian similarity metrics or Cartesian distance metrics.

1:00.4

The key words here you might have heard in working with machine learning are things like Euclidean distance, Manhattan distance, L1 and L2 norms, cosine distance, and things like this.

1:13.1

So we'll break these all down in this episode.

1:15.6

But before we get into the details,

1:17.6

I want to break down two words here.

1:20.1

One word being Cartesian,

1:22.1

and the other being the distinction between similarity and distance.

1:26.1

So when I say Cartesian, I'm talking about the Cartesian coordinate system.

1:31.7

René Descartes, Cartesian, the invention of points in space and how they relate to each

1:36.8

other.

1:37.8

So the Cartesian coordinate system is exactly what you'd expect.

1:41.0

It's an x, y axis plane, or an x, y, z space in three dimensions, and so on into infinite dimensions.

1:47.9

It's a space where vectors represent points, like stars in a galaxy, and then you can compare vectors

...

