MiniRBT: A Small Chinese Pre-trained Model

Pre-trained language models have become a fundamental technology in natural language processing. To further advance research and development in Chinese information processing, the Joint Laboratory of HIT and iFLYTEK Research (HFL) has released MiniRBT, a small Chinese pre-trained model built with its self-developed knowledge distillation toolkit TextBrewer, combining Whole Word Masking and Knowledge Distillation.

Related models: Chinese LERT | Chinese and English PERT | Chinese MacBERT | Chinese ELECTRA | Chinese XLNet | Chinese BERT
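To illustrate the Whole Word Masking idea mentioned above, here is a minimal sketch in plain Python. It assumes word boundaries are already provided by an external Chinese word segmenter (the real MiniRBT/HFL pipeline combines segmentation with WordPiece tokenization; the function name and interface below are illustrative, not the actual training code):

```python
import random

def whole_word_mask(tokens, word_spans, mask_ratio=0.15, seed=0):
    """Toy sketch of Whole Word Masking (WWM).

    Instead of masking individual subword/character tokens, WWM picks
    whole words and masks every token inside each chosen word.
    `word_spans` is a list of (start, end) token-index ranges, one per
    word, assumed to come from an external word segmenter.
    """
    rng = random.Random(seed)
    # Mask at least one word; 15% is the conventional BERT-style ratio.
    n_to_mask = max(1, int(len(word_spans) * mask_ratio))
    chosen = rng.sample(range(len(word_spans)), n_to_mask)
    out = list(tokens)
    for i in chosen:
        start, end = word_spans[i]
        for j in range(start, end):
            out[j] = "[MASK]"  # every token of the word is masked together
    return out

# Example: "哈尔滨" is one word spanning three character tokens, so it is
# masked as a unit rather than character by character.
tokens = ["哈", "尔", "滨", "很", "冷"]
word_spans = [(0, 3), (3, 4), (4, 5)]
masked = whole_word_mask(tokens, word_spans)
```

The key property is that no word is ever partially masked: a span is either fully replaced by `[MASK]` tokens or left untouched, which is what distinguishes WWM from plain token-level masking.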
