Meta-pretraining Then Meta-learning (MTM) Model for Few-Shot NLP Tasks

The source code for the papers "Improving Few-shot Text Classification via Pretrained Language Representations" and "When Low Resource NLP Meets Unsupervised Language Model: Meta-pretraining Then Meta-learning for Few-shot Text Classification".

If you use the code, please cite the following papers:

@inproceedings{zhang2019fewshot,
  title={Improving Few-shot Text Classification via Pretrained Language Representations},
  author={Ningyu Zhang and Zhanlin Sun and Shumin Deng and Jiaoyan Chen and Huajun Chen},
  year={2019}
}

@inproceedings{zhang2019mtm,
  title={When Low Resource NLP Meets Unsupervised Language Model: Meta-pretraining Then Meta-learning for Few-shot Text Classification},
  author={Shumin Deng and Ningyu Zhang and Zhanlin Sun and Jiaoyan Chen and Huajun Chen},
  year={2019}
}
