Abstract: Building on the pre-trained BERT model and the UniLM sequence-to-sequence attention mask, this paper proposes a generative BERT that can be applied to question generation for traditional Chinese medicine texts. Combined with a multi-strategy training mechanism based on label smoothing, adversarial perturbation, and knowledge distillation, together with an ensemble strategy based on soft voting over multiple models, the performance and generalization ability of the generative BERT are further improved. This helps the traditional Chinese medicine question generation task achieve better results and make fuller use of traditional Chinese medicine text data.
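The core idea of turning BERT into a generator is the UniLM-style attention mask: source tokens attend bidirectionally within the source, while target tokens attend to the full source and only to earlier target tokens. The sketch below is illustrative only and not from the paper; the function name and the NumPy-based construction are assumptions.

```python
import numpy as np

def seq2seq_attention_mask(src_len: int, tgt_len: int) -> np.ndarray:
    """UniLM-style seq2seq mask (illustrative sketch).

    Source tokens attend bidirectionally to the source; target tokens
    attend to the full source plus earlier target tokens (causal),
    which lets a single BERT encoder act as a generator.
    1 = may attend, 0 = masked.
    """
    n = src_len + tgt_len
    mask = np.zeros((n, n), dtype=np.int64)
    # All positions may attend to every source token.
    mask[:, :src_len] = 1
    # Within the target segment, apply a lower-triangular (causal) mask.
    mask[src_len:, src_len:] = np.tril(
        np.ones((tgt_len, tgt_len), dtype=np.int64)
    )
    return mask

print(seq2seq_attention_mask(2, 3))
```

In practice this mask is added (as large negative values at the 0 positions) to the attention logits of each transformer layer, so no architectural change to BERT is needed.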