
6.0 Study Guide and Task Sheet: Deep Learning and Transformer Basics

Figure: the deep learning training loop.

The main study route is now the Chapter 6 entry. Use this page only as a quick checklist while you practice.

One-Line Mental Model

batch data -> model forward -> loss -> backward gradients -> optimizer step -> curves

If the code feels long, find these six steps first.
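The six steps above can be sketched as a minimal PyTorch loop. This is an illustrative toy (random data, a single linear layer), not the chapter's actual `train.py`:

```python
import torch
from torch import nn

# Toy data and model, chosen only to make the six steps visible.
torch.manual_seed(0)
X = torch.randn(64, 10)                 # batch data
y = torch.randn(64, 1)

model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

losses = []
for epoch in range(20):
    pred = model(X)                     # model forward
    loss = loss_fn(pred, y)             # loss
    opt.zero_grad()
    loss.backward()                     # backward gradients
    opt.step()                          # optimizer step
    losses.append(loss.item())          # curves (log a point per epoch)

print(f"first loss {losses[0]:.3f}, last loss {losses[-1]:.3f}")
```

The loss should fall across epochs; if you can point at each of the six steps in this loop, a longer script will read the same way.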

Practice Checklist

| Check | Evidence |
| --- | --- |
| I can explain forward, loss, backward, optimizer | training-loop note |
| I can run a minimal PyTorch script | train.py |
| I can print tensor shapes through a model | shape trace |
| I can compare training and validation curves | curve image or CSV |
| I can explain what Attention changes | attention note |
| I can finish the evidence-pack workshop | deep_learning_workshop_run/ |
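For the shape-trace item, forward hooks are one lightweight way to record how shapes change through a model. A small sketch with an assumed toy `nn.Sequential` model:

```python
import torch
from torch import nn

# Hypothetical small model; the hook records each layer's output shape.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 4))

shapes = []
for layer in model:
    layer.register_forward_hook(lambda mod, inp, out: shapes.append(tuple(out.shape)))

model(torch.randn(8, 10))               # batch of 8, feature dim 10
print(shapes)                           # one (batch, features) tuple per layer
```

Swap in your own model and a real batch; the resulting list is exactly the shape-trace evidence the checklist asks for.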

Evidence Rubric

| Artifact | It should answer |
| --- | --- |
| Training-loop note | What happens in forward, loss, backward, and optimizer step? |
| Shape trace | How do tensor shapes change through the model? |
| Curve image or CSV | Is the model underfitting, overfitting, or improving steadily? |
| Attention note | What information does attention add, and what remains hard? |
| Failure sample note | Which sample fails, and what does that tell you about data, model, or labels? |
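For the attention note, the core computation you should be able to explain is scaled dot-product attention: each query produces a softmax-weighted mix of all values. A minimal single-head sketch (no mask, no learned projections):

```python
import torch
import torch.nn.functional as F

def attention(q, k, v):
    # Scores compare every query with every key, scaled by sqrt(dim).
    scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5
    weights = F.softmax(scores, dim=-1)   # rows sum to 1: a mix over positions
    return weights @ v, weights

q = torch.randn(1, 5, 16)                 # (batch, sequence, dim)
out, w = attention(q, q, q)               # self-attention: q = k = v
print(out.shape, w.shape)                 # output per position, plus a 5x5 weight map
```

What attention adds is visible in `w`: every output position can draw information from every input position, weighted by similarity, rather than only from a fixed neighborhood.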

Ready To Continue

Continue to Chapter 7 when you can train one small model, save the training log, inspect failure cases, and explain why the model improved or failed.
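Saving the training log can be as simple as writing per-epoch losses to a CSV. A sketch with made-up example values (your real numbers come from the loop's loss lists); here the validation loss turning upward at the end is the classic overfitting signature:

```python
import csv

# Hypothetical logged values for illustration only.
train_losses = [1.2, 0.9, 0.7, 0.6]
val_losses = [1.3, 1.0, 0.9, 0.95]    # rises at the end: possible overfitting

with open("curves.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["epoch", "train_loss", "val_loss"])
    for epoch, (t, v) in enumerate(zip(train_losses, val_losses)):
        writer.writerow([epoch, t, v])
```

A file like this is enough evidence for the curve item: plot it or eyeball the columns to say whether the model is underfitting, overfitting, or still improving.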