
MLA 013 Tech Stack for Customer-Facing Machine Learning Products

Machine Learning Guide

OCDevel


4.9 • 848 Ratings

🗓️ 3 January 2021

⏱️ 48 minutes


Summary

Primary technology recommendations for building a customer-facing machine learning product include React and React Native for the front end, serverless platforms like AWS Amplify or GCP Firebase for authentication and basic server/database needs, and Postgres as the relational database of choice. Serverless approaches are encouraged for scalability and security, with traditional server frameworks and containerization recommended only for advanced custom backend requirements. When serverless options are inadequate, use Node.js with Express or FastAPI in Docker containers, and consider adding Redis for in-memory sessions and RabbitMQ or SQS for job queues, though many of these functions can be handled by Postgres. The machine learning server itself, including deployment strategies, will be discussed separately.

Links

Client Applications

  • React is recommended as the primary web front-end framework due to its compositional structure, best practice enforcement, and strong community support.
  • React Native is used for mobile applications, enabling code reuse and a unified JavaScript codebase for web, iOS, and Android clients.
  • Using React and React Native simplifies development by allowing most UI logic to be written in a single language; a shared-hook sketch follows this list.
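
The episode itself doesn't include code, but as a rough illustration of the code-reuse point above: business logic can be written once as a React hook and imported unchanged by both the web (React) and mobile (React Native) clients, with only the view layer differing per platform. The endpoint path and response shape below are hypothetical placeholders, not from the episode.

```typescript
// usePrediction.ts -- a framework-agnostic hook that both the React web app
// and the React Native app can import unchanged. The /predict endpoint and
// response shape are hypothetical placeholders.
import { useState } from "react";

interface PredictionResponse {
  label: string;
  confidence: number;
}

export function usePrediction(apiBase: string) {
  const [result, setResult] = useState<PredictionResponse | null>(null);
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);

  // Call the backend and stash the prediction; fetch exists in both
  // modern browsers and React Native.
  async function predict(text: string): Promise<void> {
    setLoading(true);
    setError(null);
    try {
      const res = await fetch(`${apiBase}/predict`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ text }),
      });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      setResult((await res.json()) as PredictionResponse);
    } catch (e) {
      setError(e instanceof Error ? e.message : String(e));
    } finally {
      setLoading(false);
    }
  }

  return { result, loading, error, predict };
}
```

Each client then only supplies its own presentation layer (HTML elements on the web, View/Text components in React Native) around the same hook.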

Server (Backend) Options

  • The episode encourages starting with serverless frameworks, such as AWS Amplify or GCP Firebase, for rapid scaling, built-in authentication, and security.
    • Amplify allows seamless integration with React and handles authentication, user management, and database access directly from the client.
    • When direct client-to-database access is insufficient, custom business logic can be implemented using AWS Lambda or Google Cloud Functions without managing entire servers.
  • Only when serverless frameworks are insufficient should developers consider managing their own server code.
    • Recommended traditional backend options include Node.js with Express for JavaScript environments or FastAPI for Python-centric projects, both offering strong concurrency support (a minimal Express sketch follows this list).
    • Using Docker to containerize server code and deploying via managed orchestration (e.g., AWS ECS/Fargate) provides flexibility and migration capability beyond serverless.
    • Python's FastAPI is advised for developers heavily invested in the Python ecosystem, especially if machine learning code is also in Python.
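
As a sketch of the "only when serverless is insufficient" path, here is a minimal Node.js + Express service in TypeScript of the kind that would be containerized with Docker and deployed to ECS/Fargate. The route names, environment variables, and the idea of proxying requests to a separate ML server are illustrative assumptions; the episode equally recommends FastAPI for Python-centric teams.

```typescript
// server.ts -- minimal Express API intended to run inside a Docker container
// (e.g., on ECS/Fargate). Routes and the ML_SERVER_URL variable are
// illustrative, not from the episode. Requires Node 18+ for global fetch.
import express, { Request, Response } from "express";

const app = express();
app.use(express.json());

// Health check used by the load balancer / orchestrator.
app.get("/health", (_req: Request, res: Response) => {
  res.json({ status: "ok" });
});

// Custom business logic that a pure client-to-database setup can't express:
// validate the request, then forward it to the separate ML job server.
app.post("/predict", async (req: Request, res: Response) => {
  const { text } = req.body as { text?: string };
  if (!text) {
    res.status(400).json({ error: "text is required" });
    return;
  }
  try {
    const mlResponse = await fetch(`${process.env.ML_SERVER_URL}/infer`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ text }),
    });
    res.status(mlResponse.status).json(await mlResponse.json());
  } catch {
    res.status(502).json({ error: "ML server unavailable" });
  }
});

const port = Number(process.env.PORT ?? 3000);
app.listen(port, () => console.log(`API listening on ${port}`));
```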

Database and Supporting Infrastructure

  • Postgres is recommended as the primary relational database, owing to its advanced features, community momentum, and versatility.
    • Postgres can serve multiple infrastructure functions beyond storage, including job queue management and pub/sub (publish-subscribe) messaging, via built-in features such as row locking and LISTEN/NOTIFY.
  • NoSQL options such as MongoDB are only recommended when hierarchical, non-tabular data models or specific performance optimizations are necessary.
  • For situations requiring in-memory session management or real-time messaging, Redis is suggested, but Postgres may suffice for many use cases.
  • Job queuing can be accomplished with external tools like RabbitMQ or AWS SQS, but Postgres also supports job queuing via transactional row locks (SELECT ... FOR UPDATE SKIP LOCKED); a sketch follows this list.
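
A minimal sketch of the Postgres-as-queue pattern referenced above, assuming a hypothetical jobs table (id, payload, status, created_at) and the node-postgres (pg) client. FOR UPDATE SKIP LOCKED lets multiple workers poll the same table without claiming the same row or blocking one another.

```typescript
// worker.ts -- sketch of using Postgres itself as a job queue via row locks.
// Table name and columns are hypothetical; requires the `pg` package.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Claim and process at most one pending job; returns false if the queue is empty.
async function claimAndRunJob(): Promise<boolean> {
  const client = await pool.connect();
  try {
    await client.query("BEGIN");
    const { rows } = await client.query(
      `SELECT id, payload
         FROM jobs
        WHERE status = 'pending'
        ORDER BY created_at
        LIMIT 1
        FOR UPDATE SKIP LOCKED`
    );
    if (rows.length === 0) {
      await client.query("COMMIT");
      return false; // nothing to do
    }
    const job = rows[0];
    // ... run the job here (e.g., send job.payload to the ML server) ...
    await client.query("UPDATE jobs SET status = 'done' WHERE id = $1", [job.id]);
    await client.query("COMMIT");
    return true;
  } catch (err) {
    await client.query("ROLLBACK");
    throw err;
  } finally {
    client.release();
  }
}
```

For many small products this avoids standing up RabbitMQ or Redis at all, at the cost of polling rather than push delivery.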

Cloud Hosting and Server Management

  • Serverless deployment abstracts away infrastructure operations, improving scalability and reducing ongoing server management and security burdens.
    • Serverless functions scale automatically and only incur charges during execution (a minimal Lambda sketch follows this list).
  • Amplify and Firebase offer out-of-the-box user authentication, database, and cloud function support, while custom authentication can be handled with tools like AWS Cognito.
  • Managed database hosting (e.g., AWS RDS for Postgres) simplifies backups, scaling, and failover but is distinct from full serverless paradigms.
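
To make the pay-per-execution model concrete, here is a minimal AWS Lambda handler in TypeScript sitting behind API Gateway. The Cognito-authorizer claims lookup and the response body are illustrative assumptions; the types come from the @types/aws-lambda package.

```typescript
// handler.ts -- sketch of a single serverless function behind API Gateway.
// The authorizer claims path assumes a Cognito authorizer is configured;
// treat the whole example as illustrative rather than a drop-in setup.
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // Cognito-authenticated requests arrive with claims in the request context.
  const userId = event.requestContext.authorizer?.claims?.sub ?? "anonymous";

  const body = event.body ? JSON.parse(event.body) : {};

  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ userId, received: body }),
  };
};
```

You pay only while this function runs, and scaling, patching, and process management are handled by the platform.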

Evolution of Web Architectures

  • The episode contrasts older monolithic frameworks (Django, Ruby on Rails) with current microservice and serverless architectures.
  • Developers are encouraged to leverage modern tools where possible, adopting serverless and cloud-managed components until advanced customization requires traditional servers.


Transcript


0:00.0

You're listening to Machine Learning Applied, and in this episode, we're going to talk about Tech Stack.

0:05.7

Now, I know I've talked tech stack a lot in the past, but we're going to get a little more specific this time, and we're going to cover a broader spectrum.

0:12.6

We're going to talk about client, mobile, server, database, and job server, the job server being your machine learning server.

0:21.6

And I'm going to make very specific technology recommendations in this episode. And it's intended for people who really

0:26.8

don't have an opinion maybe one way or another or aren't using some specific web front end framework

0:32.8

or cloud hosting provider. If you have your tried and true tech stack and you like what you like

0:39.2

and you're using what you're using, you can go ahead and skip this episode. But if you'd like some

0:43.1

recommendations on where to start building out a machine learning customer-facing product,

0:47.7

then this will be a good episode for you. So this episode assumes that we're talking about

0:52.5

a customer facing machine learning product.

0:54.9

If you're going to be developing machine learning for a research project where you have a

1:00.4

stakeholder who wants to know the predictions based on some data set or they want to see some

1:05.6

charts and graphs or some reports.

1:07.7

And then what you do is you'd develop your machine learning model.

1:10.3

You'd train it

1:10.9

on your workstation or do your own thing for parallelizing your machine learning training

1:15.9

across multiple servers or workstations. And then you'd get results back, be they reports or

1:21.9

predictions and hand that back to your stakeholder. That's not what this episode is about. This is

1:26.4

about a customer-facing machine learning product where you have a web front-end,

1:32.3

a website, or a mobile app, and you have customers signing up on your website, and so on.

1:38.8

Here I will talk about the technologies I use personally and that I find valuable and prefer over their competition.

1:46.2

So the full tech stack goes like this.

...

