Six-month study plan

A Grade 9 ramp from "knows Python" to "can fine-tune a small transformer on a custom dataset." Roughly 6 focused hours per week, with a notebook portfolio that grows month by month.

Working artifact. Keep one Git repository with a folder per month. Every week ends with at least one runnable notebook checked in. By month 6 you'll have a real portfolio.
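A minimal sketch of that layout; the repo and folder names here are illustrative, not prescribed by the plan:

```shell
# One Git repo, one folder per month (names are illustrative)
mkdir -p ml-study-plan
git -C ml-study-plan init -q
for m in 01 02 03 04 05 06; do
  mkdir -p "ml-study-plan/month-$m"
  touch "ml-study-plan/month-$m/.gitkeep"   # keep empty folders tracked in Git
done
ls ml-study-plan
```

Each week's notebook then lands in its month's folder, so the portfolio accumulates in one place.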

Targets by phase

| Phase | Outcome |
| --- | --- |
| Months 1–2 | Math fluency + Python data stack; can manipulate any tabular dataset. |
| Months 3–4 | Classical ML mastery; can baseline any tabular contest problem with sklearn in one sitting. |
| Month 5 | PyTorch fluency; can write a training loop and an MLP from scratch. |
| Month 6 | Transformer fundamentals; can implement scaled dot-product attention from memory and fine-tune a small pretrained model. |
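As a concrete target for the Month 6 outcome, here is a sketch of scaled dot-product attention in plain NumPy (shapes and the toy sizes are illustrative; PyTorch ships its own implementation):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)   # (..., seq_q, seq_k)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)        # suppress masked positions
    # Numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Toy check: 2 query positions, 3 key/value positions, d_k = 4
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4))
k = rng.normal(size=(3, 4))
v = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)        # (2, 4)
print(w.sum(axis=-1))   # each attention row sums to 1
```

If you can write this from memory and explain why the 1/sqrt(d_k) scaling is there (it keeps the dot products from saturating the softmax as d_k grows), the Month 6 bar is within reach.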

Month 1 · Math + Python warm-up

Month 2 · Data wrangling + visualization

Month 3 · Classical ML (supervised)

Month 4 · Classical ML (unsupervised + ensembles)

Month 5 · PyTorch + deep learning

Month 6 · Transformers + fine-tuning

Practice cadence

The notebook rule. Every week ends with one notebook checked into Git that runs end-to-end on a fresh machine. No half-finished notebooks, no out-of-order cells.
Don't skip the math. The temptation when fine-tuning a transformer is to treat it as a black box. The contest's theory section punishes that — derive gradients on paper at least once per major topic.
Kaggle Getting Started cadence. One Kaggle dataset per month, with a real submission and a leaderboard score. The feedback loop is faster than any textbook.

Weekly checklist template