Math for Machine Learning I
MathML 1 is a foundational course in mathematics designed specifically for beginners in machine learning. It covers essential topics like numbers and notation, functions, vectors, matrices, and basic calculus—not through heavy theory, but through intuitive explanations and practical examples. The goal is to build your mathematical mindset so you can confidently approach machine learning, even if you’re starting from scratch.
Important Course Resources
- Essence of Linear Algebra by 3Blue1Brown
- The Manga Guide to Calculus
- Precalculus and Linear Algebra notes from MIT OpenCourseWare
How this Works
Formal education gives us structure, but it often skips over the kind of intuitive math understanding that machine learning actually relies on. MathML 1 exists to fill in those gaps. It’s a self-paced, digital-first companion course—free to explore anytime, anywhere—that prioritizes clarity over complexity.
This course blends conceptual lessons with real-world examples and optional hands-on exercises, encouraging you to explore, reflect, and apply what you learn as you go. Whether you’re coming from a chemistry lab, a coding bootcamp, or a creative field, this is your safe space to finally “get” the math behind ML.
You’re invited to slow down, think visually, ask questions, and engage with the material deeply. Use the resources provided, revisit lessons as needed, and most importantly—don’t rush. You’re here to build mastery, not just memorize formulas.
Course Content
- Introduction to Foundational Mathematics
  - What is Mathematical Maturity in Machine Learning
  - Role of Mathematics in Modern Algorithms and Representations
  - Key Mathematical Domains Used in ML (Linear Algebra, Calculus, Optimization)
  - Notation, Formalism, and Symbolic Conventions
- Functions and Algebraic Structures
  - Types of Functions (Polynomial, Exponential, Logarithmic, Rational, Piecewise)
  - Function Operations: Composition, Inversion, Transformation
  - Injective, Surjective, and Bijective Mappings
  - Function Growth Rates and Asymptotics
  - Algebraic Manipulation: Identities, Factoring, Rearranging, Substitution
  - Boolean Algebra and Function Logic (AND, OR, NOT, XOR for function combinations)
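To preview the function operations in this module, here is a minimal sketch in Python (the functions `f`, `f_inv`, and `g` are illustrative examples, not part of the course materials):

```python
def f(x):
    return 2 * x + 3        # an invertible (bijective) affine map

def f_inv(y):
    return (y - 3) / 2      # its inverse: f_inv(f(x)) == x

def g(x):
    return x ** 2           # not injective on all of R: g(-2) == g(2)

def compose(outer, inner):
    """Return the composition outer ∘ inner."""
    return lambda x: outer(inner(x))

g_after_f = compose(g, f)   # (g ∘ f)(x) = (2x + 3)^2
```

Composition order matters: `compose(g, f)` applies `f` first, then `g`, mirroring the notation g ∘ f.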
- Vectors and Normed Spaces
  - Definition and Notation of Vectors in ℝⁿ
  - Vector Addition and Scalar Multiplication
  - Inner Products and Angle Between Vectors
  - Norms: L₁, L₂, L∞ Norms and Their Interpretations
  - Distance Metrics in Vector Spaces
  - Projections and Orthogonality in ℝⁿ
  - Vector Normalization and Unit Vectors
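The norms and normalization covered in this module can be sketched in a few lines of plain Python (vectors represented as lists, purely for illustration):

```python
import math

def l1_norm(v):
    """Sum of absolute values (Manhattan length)."""
    return sum(abs(x) for x in v)

def l2_norm(v):
    """Euclidean length."""
    return math.sqrt(sum(x * x for x in v))

def linf_norm(v):
    """Largest absolute component."""
    return max(abs(x) for x in v)

def normalize(v):
    """Scale v to unit length under the L2 norm."""
    n = l2_norm(v)
    return [x / n for x in v]

def dot(u, v):
    """Standard inner product; zero for orthogonal vectors."""
    return sum(a * b for a, b in zip(u, v))
```

For the vector [3, 4] the three norms give 7, 5, and 4 respectively, which is a handy sanity check on the definitions.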
- Matrices and Basic Linear Operators
  - Definitions and Types of Matrices (Square, Diagonal, Symmetric, Sparse)
  - Matrix Addition, Scalar Multiplication, and Transposition
  - Matrix Multiplication and Composition of Linear Maps
  - Identity and Inverse Matrices (Conditions for Invertibility)
  - Matrix Representations of Linear Systems
  - Block Matrices and Partitioning Techniques
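Matrix multiplication as composition of linear maps, sketched with matrices as lists of rows (an illustrative toy implementation, not how you would do this in practice with a numerics library):

```python
def mat_mul(A, B):
    """Multiply matrices given as lists of rows; composes the linear maps."""
    rows, inner, cols = len(A), len(B), len(B[0])
    assert len(A[0]) == inner, "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

def identity(n):
    """n×n identity matrix: the neutral element of composition."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def transpose(A):
    """Swap rows and columns."""
    return [list(row) for row in zip(*A)]
```

Multiplying by `identity(n)` leaves a matrix unchanged, which is exactly the statement that the identity map composed with any linear map returns that map.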
- Calculus I: Differentiation in One Variable
  - Limits, Continuity, and the Notion of a Derivative
  - Differentiation Rules: Product, Quotient, Chain Rule
  - Higher-Order Derivatives
  - Taylor Series Expansion and Approximation
  - Optimization with Derivatives: Critical Points, Convexity, Concavity
  - Sensitivity Analysis Using Derivatives
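The notion of a derivative as a limit can be made concrete with a central finite-difference approximation (the function `f` below is an illustrative convex example):

```python
def derivative(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def f(x):
    return (x - 2) ** 2 + 1   # convex, with its minimum at x = 2

# Analytically f'(x) = 2(x - 2), so the derivative vanishes
# at the critical point x = 2 — the minimum of a convex function.
```

Evaluating `derivative(f, 2.0)` returns (numerically) zero, illustrating the first-order condition for a minimum covered later in the optimization module.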
- Calculus II: Multivariable Differentiation
  - Partial Derivatives and Mixed Derivatives
  - The Gradient and Its Geometric Interpretation
  - Directional Derivatives and Level Sets
  - The Jacobian Matrix and Function Transformations
  - Hessians and Second-Order Derivatives
  - Applications for Optimization and Gradient-Based Algorithms
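The gradient generalizes the one-variable derivative: one partial derivative per coordinate. A minimal numerical sketch (the `bowl` function is an illustrative example):

```python
def gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at point x."""
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h   # nudge coordinate i up
        xm = list(x); xm[i] -= h   # nudge coordinate i down
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

def bowl(p):
    x, y = p
    return x ** 2 + 3 * y ** 2     # analytic gradient: (2x, 6y)
```

At the point (1, 2) the numerical gradient is approximately (2, 12), matching the analytic formula and pointing in the direction of steepest ascent.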
- Elementary Real Analysis (For ML Fluency)
  - Supremum, Infimum, and Bounds
  - Sequences and Convergence
  - Series and Absolute Convergence
  - ε-δ Definition of a Limit
  - Continuity and Uniform Continuity
  - Differentiability and Mean Value Theorem
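As a taste of the ε-style arguments in this module, here is the classic worked example showing that the sequence 1/n converges to 0:

```latex
\lim_{n \to \infty} \frac{1}{n} = 0:\quad
\text{given } \varepsilon > 0,\ \text{choose } N > \frac{1}{\varepsilon};\
\text{then } n \ge N \implies
\left| \frac{1}{n} - 0 \right| = \frac{1}{n} \le \frac{1}{N} < \varepsilon.
```

The pattern — an adversary picks ε, you respond with a threshold N — is the same one behind the ε-δ definition of a limit for functions.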
- Vector Calculus and Higher Dimensions
  - Scalar and Vector Fields
  - Gradient, Divergence, Curl
  - Line Integrals and Surface Integrals
  - Change of Variables in Multiple Dimensions
  - Jacobians for Multivariable Transformation
  - Coordinate Systems and Geometric Transformations in ML
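A concrete instance of a Jacobian for a change of variables: the polar-to-Cartesian map, whose Jacobian determinant is r. A small numerical check (illustrative sketch only):

```python
import math

def polar_to_cartesian(r, theta):
    """The change of variables (r, θ) → (x, y)."""
    return (r * math.cos(theta), r * math.sin(theta))

def polar_jacobian_det(r, theta):
    """Determinant of the Jacobian of the polar map.

    Jacobian: [[cosθ, -r·sinθ],
               [sinθ,  r·cosθ]], so det = r(cos²θ + sin²θ) = r.
    """
    return (math.cos(theta) * r * math.cos(theta)
            - (-r * math.sin(theta)) * math.sin(theta))
```

That determinant equal to r is exactly the extra factor that appears in the familiar substitution dx dy = r dr dθ for double integrals.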
- Introduction to Optimization Theory
  - Basic Notions of Minima and Maxima in ℝ and ℝⁿ
  - Convex Functions and Convex Sets
  - First and Second Order Conditions for Optimality
  - Lagrange Multipliers and Constrained Optimization
  - Gradient Descent: Convergence and Stability Conditions
  - Line Search, Step Size Rules, and Descent Directions
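Gradient descent itself fits in a few lines. A minimal sketch with a fixed step size, minimizing an illustrative convex quadratic (the function and learning rate are examples, not prescriptions):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Plain gradient descent: repeatedly step against the gradient."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Minimize f(x, y) = (x - 1)^2 + (y + 2)^2,
# whose gradient is (2(x - 1), 2(y + 2)) and whose minimum is at (1, -2).
grad_f = lambda p: [2 * (p[0] - 1), 2 * (p[1] + 2)]
```

Starting from the origin, 100 steps with lr = 0.1 converge to (1, −2); choosing the step size well is exactly what the line-search and step-size-rule topics above address.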
- Final Notes and Roadmap
  - How MathML 1 Prepares You for Machine Learning Models
  - Transitioning to MathML 2: Linear Algebra, Spectral Theory, Advanced Optimization
  - Suggested Problems and Supplementary Readings
Contact Me
You may hit roadblocks while working through this course, and questions will come up along the way. Don’t worry. If you need help, clarification, or support, please don’t hesitate to reach out. I’m happy to help in any way I can so you have a successful and enjoyable experience.
- Email: martin@chemolytics.com
- Socials: Twitter | LinkedIn
I’ll do my best to respond to all inquiries within 24 to 48 hours on weekdays. Whether you have specific questions, need assistance with projects, or simply want to discuss your progress, I’m here 😊
Additionally, if you encounter technical issues or have feedback regarding the course, please let me know. Your input is valuable and helps improve the learning experience for everyone.
Remember, seeking help is an important part of the learning process, and I am here to support you every step of the way!