Abstract: To address the poor performance of entity relation extraction caused by the complexity and diversity of entity relations in traditional Chinese medicine (TCM), we propose a relation extraction model based on an attention mechanism and multi-model fusion (r-BERT-BiLSTM-attention-TextCNN, RBBAT). The model combines a relation-oriented pre-trained model (r-BERT), a bidirectional long short-term memory network (BiLSTM), an attention layer, and a TextCNN. For the experiments, gastroenterology medical records published on various medical record platforms in recent years were collected, and five types of entity relations were extracted: symptom-disease name, symptom-syndrome, tongue-syndrome, pulse-syndrome, and syndrome-treatment method. The experimental results show that, compared with several commonly used relation extraction models, the proposed fusion model achieves the best extraction performance on four of the relation types: symptom-disease name, symptom-syndrome, tongue-syndrome, and syndrome-treatment method.
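
To make the named architecture concrete, the following is a minimal sketch of how an r-BERT-BiLSTM-attention-TextCNN style fusion could be wired together. It is an illustrative assumption only: the abstract does not specify layer sizes, the attention formulation, or how r-BERT feeds the downstream layers, so the encoder is stubbed with a plain embedding layer and all dimensions (e.g. `lstm_hidden`, `kernel_sizes`, the five relation classes) are hypothetical placeholders, not the authors' implementation.

```python
import torch
import torch.nn as nn


class RBBATSketch(nn.Module):
    """Illustrative r-BERT -> BiLSTM -> attention -> TextCNN fusion.

    The embedding layer stands in for the r-BERT encoder; sizes are
    placeholders chosen only to make the sketch runnable.
    """

    def __init__(self, vocab_size=21128, emb_dim=768, lstm_hidden=256,
                 num_filters=128, kernel_sizes=(2, 3, 4), num_relations=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)       # stand-in for r-BERT output
        self.bilstm = nn.LSTM(emb_dim, lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * lstm_hidden, 1)             # token-level attention scores
        self.convs = nn.ModuleList(
            nn.Conv1d(2 * lstm_hidden, num_filters, k) for k in kernel_sizes
        )
        self.classifier = nn.Linear(len(kernel_sizes) * num_filters, num_relations)

    def forward(self, token_ids):
        x = self.embed(token_ids)                             # (B, T, E)
        h, _ = self.bilstm(x)                                 # (B, T, 2H)
        weights = torch.softmax(self.attn(h), dim=1)          # (B, T, 1)
        h = h * weights                                       # attention-reweighted sequence
        h = h.transpose(1, 2)                                 # (B, 2H, T) for Conv1d
        feats = [torch.relu(conv(h)).max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(feats, dim=1))       # logits over relation types


# Usage: classify a batch of two 64-token sequences into 5 relation types.
model = RBBATSketch()
logits = model(torch.randint(0, 21128, (2, 64)))
print(logits.shape)  # torch.Size([2, 5])
```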