The Time I Tried to Train a Model on My GPU and Everything Caught Fire

Note to self: when people say "don't train large models on a laptop," believe them.

I ignored this. Spun up a fine-tuning job on my local 3060. It ran for about 12 minutes before the fans went crazy, the power brick got hot enough to fry an egg, and the loss went to NaN.
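For what it's worth, the NaNs were probably as much an fp16 problem as a thermal one. Here's a minimal sketch of the guard I should have had in the loop, assuming a PyTorch fine-tune with a HuggingFace-style model where `scaler` is a `torch.cuda.amp.GradScaler`; the `model(**batch).loss` call and all the names here are illustrative, not my actual script:

```python
import torch

def train_step(model, batch, optimizer, scaler, max_grad_norm=1.0):
    """One mixed-precision step with NaN guards."""
    optimizer.zero_grad(set_to_none=True)

    # Run the forward pass in fp16 where it's safe to do so.
    with torch.autocast(device_type="cuda", dtype=torch.float16):
        loss = model(**batch).loss

    # Bail out before a non-finite loss poisons the weights.
    if not torch.isfinite(loss):
        raise RuntimeError(f"non-finite loss: {loss.item()}")

    # GradScaler fights fp16 gradient underflow; unscale before clipping.
    scaler.scale(loss).backward()
    scaler.unscale_(optimizer)
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_grad_norm)
    scaler.step(optimizer)  # skips the update if grads came back inf/NaN
    scaler.update()
    return loss.item()
```

Wouldn't have saved the power brick, but it would have killed the run before the weights turned to garbage.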

Ended up just renting compute like a normal person. But I still feel like every ML dev has to do this once. Like touching the stove as a kid. Gotta learn.