From 44213f400b5304daf239b1d637bf771d9bece9a8 Mon Sep 17 00:00:00 2001
From: Minsoo Kim
Date: Sun, 26 Feb 2023 00:12:31 +0900
Subject: [PATCH] Update README.md

add arxiv link
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 3d23662..599e7de 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@
 # Teacher Intervention: Improving Convergence of Quantization Aware Training for Ultra-Low Precision Transformers
-This Repository provides a Pytorch implementation of **Teacher Intervention: Improving Convergence of Quantization Aware Training for Ultra-Low Precision Transformers** (EACL 2023 Main Track)
+This Repository provides a Pytorch implementation of **Teacher Intervention: Improving Convergence of Quantization Aware Training for Ultra-Low Precision Transformers** (EACL 2023 Main Track) [Paper](https://arxiv.org/abs/2302.11812v1)

 - This work proposes a proactive knowledge distillation method (Figure 3-(b)) called ***Teacher Intervention*** (TI) for fast converging QAT (Figure 3-c) of ultra-low precision pre-trained Transformers.