nanochat/dev
William Thurston 76227f70d3 Add MOE debug interval and logging for gradient statistics
- Introduced `MOE_DEBUG_INTERVAL` parameter in `runmps.sh` to control debug logging frequency during training.
- Enhanced `base_train.py` to log gradient statistics for routed and shared expert weights at the configured interval, aiding in monitoring training dynamics.
- Updated `gpt.py` to adjust router bias calculations, improving load balancing among experts.
- Added unit tests in `test_moe.py` to validate the behavior of the MoE implementation and ensure correctness of gradient calculations.
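The interval-gated gradient logging described above can be sketched as follows. This is a hedged illustration only: the toy model, the `grad_stats` helper, and the exact statistics logged are assumptions modeled on the commit description, not the actual `base_train.py` code (which wires `MOE_DEBUG_INTERVAL` in via `runmps.sh`).

```python
# Hedged sketch: periodic gradient-statistics logging for MoE-style weights.
# The routed/shared split and MOE_DEBUG_INTERVAL name mirror the commit
# description; the model and helper below are illustrative stand-ins.
import torch
import torch.nn as nn

MOE_DEBUG_INTERVAL = 2  # log every N training steps

def grad_stats(named_params):
    """Return {name: (grad_norm, grad_mean)} for params that have gradients."""
    stats = {}
    for name, p in named_params:
        if p.grad is not None:
            stats[name] = (p.grad.norm().item(), p.grad.mean().item())
    return stats

# Toy stand-in for routed vs. shared expert weights.
model = nn.ModuleDict({
    "routed": nn.Linear(4, 4),
    "shared": nn.Linear(4, 4),
})

logged_steps = []
for step in range(1, 5):
    x = torch.randn(8, 4)
    loss = (model["routed"](x) + model["shared"](x)).pow(2).mean()
    model.zero_grad()
    loss.backward()
    if step % MOE_DEBUG_INTERVAL == 0:
        stats = grad_stats(model.named_parameters())
        logged_steps.append(step)
        for name, (norm, mean) in stats.items():
            print(f"step {step} {name}: grad_norm={norm:.4f} grad_mean={mean:+.4e}")
```

With an interval of 2 over 4 steps, stats are emitted at steps 2 and 4 only, keeping the debug output sparse relative to the training loop.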
2025-11-13 16:22:20 -08:00
gen_synthetic_data.py add personality to nanochat. breaks previous code on git pull and requires downloading a new file from s3, but there is a helpful error message, so hopefully it's ok 2025-10-21 15:04:58 +00:00
generate_logo.html initial commit 2025-10-13 06:49:24 -07:00
nanochat.png add nanochat logo png 2025-10-13 06:59:59 -07:00
repackage_data_reference.py initial commit 2025-10-13 06:49:24 -07:00
runcpu.sh Add scripts for running evaluations and training with W&B integration 2025-11-05 11:49:50 -08:00
runmps_evals.sh Add scripts for running evaluations and training with W&B integration 2025-11-05 11:49:50 -08:00
runmps.sh Add MOE debug interval and logging for gradient statistics 2025-11-13 16:22:20 -08:00