
KnowledgeSharing-Pytorch

This repository implements several state-of-the-art knowledge distillation and knowledge transfer methods.

ToDo List

Knowledge Distillation (KD)

Knowledge distillation was proposed to distill knowledge from a large teacher network into a smaller student network. KD can help the student model achieve higher generalization performance, and its applications include model compression.
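
As a rough sketch (not code from this repository), the classic soft-target distillation loss of Hinton et al. can be written in PyTorch as follows; the temperature `T` and mixing weight `alpha` are illustrative defaults, not values used here:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between the temperature-softened
    # teacher and student distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    kd_term = F.kl_div(log_student, soft_teacher, reduction="batchmean") * (T * T)

    # Hard-label term: ordinary cross-entropy on the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, labels)

    # Weighted combination of soft and hard targets.
    return alpha * kd_term + (1.0 - alpha) * ce_term
```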

Knowledge Transfer (KT)

Model List

  • Basic knowledge distillation
  • Born-again Neural Networks
  • Knowledge Transfer with Jacobian Matching
  • Deep Mutual Learning
  • Co-teaching
  • On-the-fly Native Ensemble
  • MentorNet
