Abstract
Entity relation extraction, the task of identifying relations between entities in text, is one of the important tasks of natural language processing. At present, data are scarce in some specialized fields, such as agriculture and the metallurgical industry, and there is a lack of an effective model for entity relation recognition under such low-data conditions. Motivated by this, we constructed a suitable small, balanced data set and propose a multi-neural-network collaborative model, RBF (RoBERTa-Bidirectional Gated Recurrent Unit-Fully Connected), together with several optimizations. The model uses RoBERTa as the encoding layer to extract word-level features of the text, and BiGRU (Bidirectional Gated Recurrent Unit)-FC (Fully Connected) as the decoding layer to obtain the optimal relation for the text. To further improve performance, the input layer is optimized by feature fusion and the learning rate is optimized with the cosine annealing algorithm. The experimental results on the small balanced data set show that the F1 score of the proposed RBF model is 25.9% higher than that of the traditional Word2vec-BiGRU-FC model and 18.6% higher than that of the recent BERT-BiLSTM (Bidirectional Long Short-Term Memory)-FC model, demonstrating that our model is effective.
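As a rough illustration of the architecture described above (a minimal sketch, not the authors' released code), the following PyTorch snippet stacks a RoBERTa encoder, a BiGRU decoding layer, and a fully connected classification head, and pairs the optimizer with a cosine annealing schedule. The checkpoint name `roberta-base`, the hidden size, the pooling strategy, and the number of relation classes are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of an RBF-style (RoBERTa-BiGRU-FC) relation classifier.
# Model name, hidden sizes, pooling, and class count are assumptions for illustration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class RBFModel(nn.Module):
    def __init__(self, pretrained="roberta-base", gru_hidden=256, num_relations=10):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(pretrained)       # RoBERTa encoding layer
        self.bigru = nn.GRU(
            input_size=self.encoder.config.hidden_size,
            hidden_size=gru_hidden,
            batch_first=True,
            bidirectional=True,
        )                                                           # BiGRU decoding layer
        self.fc = nn.Linear(2 * gru_hidden, num_relations)         # fully connected output layer

    def forward(self, input_ids, attention_mask):
        # Word-level features from RoBERTa
        token_feats = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        gru_out, _ = self.bigru(token_feats)
        # Mean-pool the BiGRU outputs and score the relation classes
        pooled = gru_out.mean(dim=1)
        return self.fc(pooled)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = RBFModel()
batch = tokenizer(["The farmer planted wheat in the field."],
                  return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])        # (1, num_relations)

# Cosine annealing learning-rate schedule, as mentioned in the abstract
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10)
```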