Osiris Juárez, Salvador Godoy-Calderon and Hiram Calvo
This work proposes a working set of rules for translating English sentences into the formal language of non-axiomatic logic (NAL). The proposed translation takes advantage of several linguistic tools for pre-processing and can be used for commonsense rea...
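As a purely illustrative sketch (the paper's actual rule set and linguistic pre-processing pipeline are not reproduced in this snippet), a single toy rule could map a copular English sentence to a Narsese inheritance judgment; the regular-expression pattern and function below are hypothetical.

```python
import re

# Hypothetical single rule: map "X is (a) Y" to the NAL inheritance
# judgment "<x --> y>.". The paper's full rule set and the linguistic
# pre-processing tools it relies on are not shown here.
COPULA = re.compile(r"^(?:an?\s+)?(\w+)\s+is\s+(?:an?\s+)?(\w+)\.?$", re.IGNORECASE)

def to_narsese(sentence: str) -> str | None:
    match = COPULA.match(sentence.strip())
    if match is None:
        return None  # sentence not covered by this toy rule
    subject, predicate = (w.lower() for w in match.groups())
    return f"<{subject} --> {predicate}>."

print(to_narsese("A robin is a bird"))  # <robin --> bird>.
```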
Zhihao Zhou, Tianwei Yue, Chen Liang, Xiaoyu Bai, Dachi Chen, Congrui Hetang and Wenping Wang
Harnessing commonsense knowledge poses a significant challenge for machine comprehension systems. This paper primarily focuses on incorporating a specific subset of commonsense knowledge, namely, script knowledge. Script knowledge is about sequences of a...
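The snippet cuts off before the definition, but script knowledge conventionally denotes stereotypical sequences of events, with the "restaurant script" as the textbook example; the minimal representation below is illustrative only and is not the paper's data structure.

```python
from dataclasses import dataclass, field

@dataclass
class Script:
    """Illustrative script: an ordered sequence of stereotypical events."""
    scenario: str
    events: list[str] = field(default_factory=list)

    def expected_after(self, observed: str) -> list[str]:
        # Events that typically follow the observed one in this scenario.
        if observed not in self.events:
            return []
        return self.events[self.events.index(observed) + 1:]

restaurant = Script(
    scenario="eating at a restaurant",
    events=["enter", "get seated", "order", "eat", "pay", "leave"],
)
print(restaurant.expected_after("order"))  # ['eat', 'pay', 'leave']
```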
Huajie Wang and Yinglin Wang
The natural language model BERT uses a large-scale unsupervised corpus to accumulate rich linguistic knowledge during its pretraining stage, and the model is then fine-tuned for specific downstream tasks, which greatly improves the understanding c...
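To make the pretrain-then-fine-tune paradigm described above concrete, a minimal fine-tuning step might look as follows, assuming the Hugging Face transformers and PyTorch APIs as stand-ins; this is a sketch under those assumptions, not the authors' implementation.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load pretrained BERT weights and attach a fresh task head.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Tiny illustrative downstream batch (single-sentence classification).
batch = tokenizer(
    ["The knowledge is accumulated during pretraining."],
    padding=True, truncation=True, return_tensors="pt",
)
labels = torch.tensor([1])

# One fine-tuning step: the pretrained weights are updated for the task.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)  # loss computed against the task head
outputs.loss.backward()
optimizer.step()
```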
Mehul Bhatt and Jan Oliver Wallgrün