Beyond the Roadmap
Finished the roadmap? This section shows where you stand as an AI engineer, what gaps remain, and what to build next.
Start with Knowledge Gaps to see what areas need more depth, then check What's Left to plan next steps.
📐 What to Study Per Area
Exact gaps, required math, and what to explore next — per knowledge domain
📊 Do You Need Statistics?
Yes — but selectively. Statistics is the most important math topic for AI, but you only need 4 specific concepts:
Probability distributions
Understand how models sample outputs (softmax, temperature)
Cross-entropy loss
THE loss function for LLMs. You'll see it everywhere.
KL divergence
Used in RLHF, VAEs, and fine-tuning. Measures 'distance' between distributions (asymmetric, so not a true metric).
Bayes theorem
Conceptual understanding of how evidence updates beliefs — useful for RAG intuition.
You do NOT need: hypothesis testing, regression analysis, ANOVA, or most classical statistics. Those are for data science, not AI engineering.
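All four concepts above fit in a few lines of plain Python. This is a minimal sketch with toy numbers; the function names are illustrative, not from any library:

```python
import math

# Softmax with temperature: turns logits into a probability distribution.
# Higher temperature flattens it (more random sampling); lower sharpens it.
def softmax(logits, temperature=1.0):
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Cross-entropy loss for one token: -log(probability the model
# assigned to the correct next token). Lower is better.
def cross_entropy(probs, target_index):
    return -math.log(probs[target_index])

# KL divergence D_KL(p || q): how far q diverges from p. Zero when they
# are identical; used in RLHF to keep the tuned model close to the base.
def kl_divergence(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B).
def bayes(prior, likelihood, evidence):
    return likelihood * prior / evidence

logits = [2.0, 1.0, 0.1]
p_cold = softmax(logits, temperature=0.5)  # sharp distribution
p_hot = softmax(logits, temperature=2.0)   # flat distribution
print(p_cold[0] > p_hot[0])                # True: low temp concentrates mass
print(cross_entropy(p_hot, 0) > cross_entropy(p_cold, 0))  # True: flatter = higher loss
print(kl_divergence(p_cold, p_hot) > 0)    # True: distributions differ
```

Playing with the temperature value here is the fastest way to build intuition for why sampling settings change model behavior.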
LLM Concepts & Internals
Light math (conceptual)
Prompt Engineering
No math needed
RAG Systems
Light math (conceptual)
Agentic AI / Tool Use
No math needed
Fine-Tuning / Training
Medium math (applied)
Multimodal AI
Light math (conceptual)
ML Research / Math
Structured math study needed
Production / MLOps
No math needed
AI Safety & Ethics
No math needed
🎯 Recommended Study Order (For Your Profile)
Statistics (4 concepts only)
Probability distributions, cross-entropy, KL divergence, Bayes. Use StatQuest.
Linear Algebra (visually)
3Blue1Brown Essence of Linear Algebra. 3 hrs. Unlocks LLM math intuition.
LoRA mechanics
Read the original LoRA paper after linear algebra; it will make sense then.
Multimodal — vision prompting
Immediate ROI. Claude and GPT-4 vision are already in your hands.
LangSmith / observability
Install it in your next project. Transforms how you build and debug.
Calculus (chain rule only)
3Blue1Brown Essence of Calculus. 2 hrs. Makes backprop click.
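As a preview of the LoRA mechanics in step 3: LoRA freezes the pretrained weight W and learns a low-rank correction B @ A. A minimal NumPy sketch, assuming the scaling convention from the paper (alpha / r); the dimensions and variable names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 512, 512, 8            # full weight is d x k; adapter rank r << d

W = rng.standard_normal((d, k))         # frozen pretrained weight
A = rng.standard_normal((r, k)) * 0.01  # trainable, small random init
B = np.zeros((d, r))                    # trainable, zero init: training starts at W

def lora_forward(x, alpha=16):
    # Base output x @ W.T plus the low-rank correction x @ (B @ A).T,
    # scaled by alpha / r as in the LoRA paper.
    return x @ W.T + (alpha / r) * (x @ (B @ A).T)

x = rng.standard_normal((1, k))
# Before any training, B is all zeros, so the adapter changes nothing:
print(np.allclose(lora_forward(x), x @ W.T))  # True

# Parameter savings: full fine-tuning updates d*k weights,
# LoRA only trains r*(d + k).
print(d * k, r * (d + k))  # 262144 vs 8192
```

The parameter count at the end is the whole point: the adapter trains ~3% of the weights here, and the gap widens as d and k grow.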
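The chain rule from step 6 is all backprop is: multiply local derivatives along the path from loss to weight. A hand-computed gradient for a one-neuron network, checked against a numerical estimate (all values are toy numbers for illustration):

```python
import math

# Forward pass: y = sigmoid(w*x + b), loss = (y - t)^2
x, t = 2.0, 1.0
w, b = 0.5, 0.1

z = w * x + b
y = 1 / (1 + math.exp(-z))   # sigmoid
loss = (y - t) ** 2

# Backward pass: one local derivative per step, multiplied together.
dloss_dy = 2 * (y - t)
dy_dz = y * (1 - y)          # derivative of sigmoid
dz_dw = x
dloss_dw = dloss_dy * dy_dz * dz_dw   # chain rule

# Numerical check: nudge w slightly and measure how the loss moves.
eps = 1e-6
z2 = (w + eps) * x + b
y2 = 1 / (1 + math.exp(-z2))
numeric = ((y2 - t) ** 2 - loss) / eps
print(abs(dloss_dw - numeric) < 1e-5)  # True: analytic gradient matches
```

Deep learning frameworks automate exactly this bookkeeping across millions of parameters; once this click happens, autograd stops being magic.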