Commit Graph

6 Commits

Author SHA1 Message Date
rehman
09bdfd6628 fix: truncate conversation tokens to model context window in chat_cli
Fixes #581. When conversation_tokens grows beyond model.config.sequence_len,
engine.generate() received a zero-dimension tensor and crashed with a matmul
shape error. Add a sliding window guard before each generate() call that keeps
the most recent (sequence_len - max_new_tokens) tokens, re-inserts bos to
preserve a well-formed sequence, and notifies the user when truncation occurs.
2026-05-06 01:29:20 +05:00
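The truncation guard described in this commit can be sketched as follows. This is a minimal illustration, not the actual nanochat code: the names `conversation_tokens`, `sequence_len`, `max_new_tokens`, and `bos` are assumptions taken from the commit message.

```python
def truncate_to_context(conversation_tokens, sequence_len, max_new_tokens, bos):
    """Sliding-window guard: keep only the most recent tokens that fit the
    model's context window, leaving room for generation, and re-insert BOS
    at the front so the sequence stays well-formed.

    Returns (tokens, truncated_flag); the caller can use the flag to notify
    the user that older turns were dropped.
    """
    # Room left for the prompt once generation headroom is reserved.
    budget = sequence_len - max_new_tokens
    if len(conversation_tokens) <= budget:
        return conversation_tokens, False  # nothing to do
    # Keep the last (budget - 1) tokens and prepend BOS.
    truncated = [bos] + conversation_tokens[-(budget - 1):]
    return truncated, True
```

Calling this before each `engine.generate()` invocation prevents the zero-dimension tensor (and the resulting matmul shape error) that the commit fixes.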
Andrej Karpathy
1076f97059 delete autocast, an unnecessary thorn in my side, manage dtypes directly 2026-03-04 23:55:30 +00:00
Sofie Van Landeghem
72b9064f9d remove leftover mid references (#491) 2026-02-02 08:33:46 -08:00
Andrej Karpathy
1ddaad1c1c nuke midtraining from orbit, it's not as needed now that we have a BOS-aligned dataloader. Also change the README a lot. Midtraining is not yet fully erased across the board, but good enough for step 1 2026-01-31 19:12:25 +00:00
karpathy
2e9669e03a upgrading all other files to be able to use cpu/mps as well as cuda. Various other minor changes, e.g. changing max_iterations to num_iterations in the sft script for naming consistency 2025-10-20 10:15:17 -07:00
karpathy
3a5e0bc50b initial commit 2025-10-13 06:49:24 -07:00