[1:03:40] LLM Fine-Tuning 10: LLM Knowledge Distillation | How to Distill LLMs (DistilBERT & Beyond) Part 1 · Sunny Savita · 4.3K views · 7 months ago
[29:14] Knowledge Distillation Simplified | Teacher to Student Model for LLMs (Step-by-Step with Demo) #ai · Unfold Data Science · 1.7K views · 6 months ago
[1:00:11] EfficientML.ai Lecture 9 - Knowledge Distillation (MIT 6.5940, Fall 2023) · MIT HAN Lab · 11.8K views · 2 years ago
[24:11] Knowledge Distillation in Machine Learning: Full Tutorial with Code · MLWorks · 3.6K views · 10 months ago
[13:01] Teacher-Student Neural Networks: The Secret to Supercharged AI · Computing For All · 7.2K views · 2 years ago
[7:21] Knowledge Distillation in Deep Learning - DistilBERT Explained · Dingu Sagar · 19.4K views · 4 years ago
[12:09] How to Distill LLM? LLM Distilling [Explained] Step-by-Step using Python Hugging Face AutoTrain · FreeBirds Crew - Data Science and GenAI · 7.1K views · 1 year ago
[18:42] Image Classification: Convolution, Attention (Image Transformers & Distillation w/Attention, DeiT) · Benjamin Ricard, PhD · 4.5K views · 5 years ago
[19:46] Quantization vs Pruning vs Distillation: Optimizing NNs for Inference · Efficient NLP · 61.3K views · 2 years ago
[16:49] Better not Bigger: Distilling LLMs into Specialized Models · Snorkel AI · 11.7K views · 2 years ago
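The videos above all revolve around the same core technique: training a small student model to imitate a larger teacher by matching the teacher's temperature-softened output distribution. As a minimal, self-contained sketch (plain Python, not taken from any of the videos; function names are illustrative), the classic soft-label distillation loss of Hinton et al. (2015) can be written as:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: a higher temperature yields a softer
    (higher-entropy) distribution, exposing the teacher's 'dark knowledge'
    about which wrong classes are nearly right."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence KL(teacher || student) between the two softened
    distributions, scaled by T^2 so gradient magnitudes stay comparable
    across temperatures (as in Hinton et al., 2015)."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that reproduces the teacher's logits exactly incurs zero loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # -> 0.0
```

In practice this soft-label term is combined with the ordinary cross-entropy on the hard ground-truth labels, weighted by a mixing coefficient; several of the tutorials above (e.g. the DistilBERT walkthroughs) demonstrate that combined objective on real models.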