One key chapter is about gradient descent. You’ll learn to define loss functions, compute derivatives, and update parameters step by step — just like training a model manually.
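Below is a minimal pure-Python sketch of that workflow, in the spirit of the book but not its exact listing: a squared-error loss for a toy linear model, hand-computed partial derivatives, and a loop that nudges the parameters against the gradient. The data, learning rate, and function names are illustrative assumptions.

```python
# Fit w and b of y = w*x + b to data generated from y = 2x + 5,
# using plain gradient descent on the mean squared error.

def predict(w: float, b: float, x: float) -> float:
    return w * x + b

def loss(w: float, b: float, data) -> float:
    """Mean squared error over the dataset."""
    return sum((predict(w, b, x) - y) ** 2 for x, y in data) / len(data)

def gradient(w: float, b: float, data):
    """Partial derivatives of the loss with respect to w and b."""
    n = len(data)
    dw = sum(2 * (predict(w, b, x) - y) * x for x, y in data) / n
    db = sum(2 * (predict(w, b, x) - y) for x, y in data) / n
    return dw, db

# Synthetic data drawn from y = 2x + 5.
data = [(x, 2 * x + 5) for x in range(-10, 11)]

w, b = 0.0, 0.0          # initial guess
learning_rate = 0.01

for step in range(1000):
    dw, db = gradient(w, b, data)
    w -= learning_rate * dw   # step against the gradient
    b -= learning_rate * db

print(w, b, loss(w, b, data))   # w and b approach 2 and 5, loss approaches 0
```

The same three pieces (loss, gradient, update rule) are what a full training loop automates; here each one is written out by hand so the mechanics stay visible.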
Discover how Data Science from Scratch by Joel Grus teaches you to build every core concept — from statistics to machine learning — using pure Python. No shortcuts, just deep understanding. Ideal for those who want to truly master data science fundamentals.
Similar ideas to 4. Gradient Descent: Learning by Optimization
Transfer learning consists of taking features learned on one problem, and leveraging them on a new, similar problem. For instance, features from a model that has learned to identify racoons may be useful to kick-start a model meant to identify tanukis.
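As a hedged illustration of that idea, here is a short TensorFlow/Keras sketch (my own assumption of framework and model choice, not code from the summarized source): an ImageNet-pretrained backbone is frozen and a small new head is trained for a hypothetical tanuki-vs-not classifier.

```python
from tensorflow import keras

# Reuse features learned on ImageNet; drop the original classification head.
base_model = keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights="imagenet"
)
base_model.trainable = False  # freeze the already-learned features

# Stack a small new head to be trained on the new, similar problem.
inputs = keras.Input(shape=(160, 160, 3))
x = base_model(inputs, training=False)        # keep the frozen layers in inference mode
x = keras.layers.GlobalAveragePooling2D()(x)
outputs = keras.layers.Dense(1)(x)            # single logit: tanuki or not
model = keras.Model(inputs, outputs)

model.compile(
    optimizer=keras.optimizers.Adam(1e-3),
    loss=keras.losses.BinaryCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
# model.fit(new_dataset, epochs=5)  # new_dataset is a placeholder for the new task's data
```

Only the new dense head is updated during training, which is why a modest amount of new data can be enough.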
Most of us have a preferred way of learning. Get to know the learning style you're most comfortable with and study in the ways you learn best.
Note that these styles are just a way to think about different studying techniques – they're not hard and fast rules that say you should only study in...