| Name | torchdistill |
| --- | --- |
| Description | A modular, configuration-driven framework for knowledge distillation. Trained models, training logs, and configurations are available to ensure reproducibility. |
| Repository | https://github.com/yoshitomo-matsubara/torchdistill |
| Version | torchdistill-1.0.0 |
| License | MIT |
| Supported architectures | amd64, x86 |
| Distribution | PyPI |
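Since the package is distributed on PyPI, it can be installed with `pip install torchdistill`. The snippet below is a minimal, generic sketch of the softened-logit knowledge distillation objective (Hinton et al.) that configuration-driven frameworks of this kind automate; it is not torchdistill's own API, and the temperature and weighting values are illustrative assumptions that such a framework would normally read from a declarative config file.

```python
# Generic sketch of the vanilla knowledge distillation loss; NOT torchdistill's API.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=4.0, alpha=0.5):
    """Weighted sum of soft-target KL divergence and hard-label cross-entropy.

    `temperature` and `alpha` are illustrative hyperparameters; in a
    configuration-driven framework they would come from a YAML/JSON config.
    """
    # Soften both distributions with the temperature before comparing them.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scale by T^2 so gradients keep a comparable magnitude across temperatures.
    kd_term = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (temperature ** 2)
    # Standard supervised term on the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, targets)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```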