Abstract
Open-domain event extraction is a fundamental task that aims to extract events of non-predefined types from news clusters. Researchers have observed that its performance can be improved by exploiting dependency relations. Recently, graph convolutional networks (GCNs) have been widely used to integrate dependency syntactic information into neural networks; however, they often introduce noise and hurt generalization. To tackle this issue, we propose using a Bi-LSTM to obtain semantic representations of BERT intermediate-layer features and to incorporate dependency syntactic information. Compared with current methods, the Bi-LSTM is more robust and relies less on word vectors and hand-crafted features. Experiments on public datasets show that our approach is effective for open-domain event extraction.
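The core architectural idea in the abstract, encoding BERT intermediate-layer features with a Bi-LSTM, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the hidden sizes (768 for BERT, 256 per LSTM direction) and the use of a random tensor in place of real BERT intermediate-layer output are assumptions for demonstration.

```python
import torch
import torch.nn as nn

class BiLSTMEncoder(nn.Module):
    """Hypothetical Bi-LSTM encoder over BERT intermediate-layer features."""

    def __init__(self, bert_dim: int = 768, lstm_dim: int = 256):
        super().__init__()
        # Bidirectional LSTM reads the feature sequence left-to-right
        # and right-to-left, concatenating both directions per token.
        self.lstm = nn.LSTM(bert_dim, lstm_dim,
                            batch_first=True, bidirectional=True)

    def forward(self, bert_features: torch.Tensor) -> torch.Tensor:
        # bert_features: (batch, seq_len, bert_dim), e.g. hidden states
        # taken from an intermediate BERT layer rather than the last one.
        out, _ = self.lstm(bert_features)
        return out  # (batch, seq_len, 2 * lstm_dim)

# Stand-in for real BERT intermediate-layer output: batch of 2
# sentences, 16 tokens each, 768-dimensional features.
features = torch.randn(2, 16, 768)
encoded = BiLSTMEncoder()(features)
print(encoded.shape)  # torch.Size([2, 16, 512])
```

In practice the per-token Bi-LSTM outputs would then be combined with dependency syntactic information and fed to the event-extraction head; those components are specific to the paper and are not sketched here.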