Weijun Pan, Peiyuan Jiang, Zhuang Wang, Yukun Li and Zhenlong Liao
In recent years, the emergence of large-scale pre-trained language models has made transfer learning possible in natural language processing, overturning the traditional model architecture based on recurrent neural networks (RNNs). In this study, we c...
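As a rough illustration of the shift this abstract describes, the sketch below contrasts a randomly initialised RNN sentence encoder, which learns only from task data, with a pre-trained Transformer encoder reused for transfer learning. The checkpoint name, toy sentences, and vector sizes are assumptions for demonstration and are not taken from the paper.

```python
# Illustrative sketch (not the paper's method): an RNN encoder trained from
# scratch versus a pre-trained Transformer whose weights already encode
# linguistic knowledge. Checkpoint and sentences are assumptions.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

sentences = ["transfer learning reuses pre-trained weights",
             "an RNN is trained from scratch on the target task"]

# (a) Traditional RNN encoder: randomly initialised embeddings and LSTM.
vocab = {w: i + 1 for i, w in enumerate(sorted({w for s in sentences for w in s.split()}))}
ids = [torch.tensor([vocab[w] for w in s.split()]) for s in sentences]
rnn_embed = nn.Embedding(len(vocab) + 1, 128)
rnn = nn.LSTM(input_size=128, hidden_size=128, batch_first=True)
with torch.no_grad():
    _, (h_n, _) = rnn(rnn_embed(ids[0]).unsqueeze(0))  # last hidden state as sentence vector
print("RNN sentence vector:", h_n.shape)               # (1, 1, 128)

# (b) Pre-trained Transformer encoder: reused weights, no task data needed yet.
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
batch = tok(sentences, padding=True, return_tensors="pt")
with torch.no_grad():
    out = bert(**batch)
print("BERT [CLS] vectors:", out.last_hidden_state[:, 0, :].shape)  # (2, 768)
```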
Huajie Wang and Yinglin Wang
The natural language model BERT uses a large-scale unsupervised corpus to accumulate rich linguistic knowledge during its pretraining stage; the model is then fine-tuned for specific downstream tasks, which greatly improves the understanding c...
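A minimal sketch of the pretrain-then-fine-tune workflow this abstract refers to, assuming the Hugging Face transformers API, a generic bert-base-uncased checkpoint, and a hypothetical two-class task; the paper's actual downstream task and data are not shown here.

```python
# Illustrative fine-tuning sketch only: the task, labels, and data below are
# hypothetical stand-ins, not the paper's experimental setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

texts = ["the service was excellent", "the product broke after one day"]
labels = torch.tensor([1, 0])  # hypothetical sentiment labels
batch = tok(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):                          # a few steps stand in for full fine-tuning
    out = model(**batch, labels=labels)     # loss computed internally from the labels
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
print("final loss:", out.loss.item())
```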
Jun Xu and Lei Hu
Place descriptions record qualitative information related to places and their spatial relationships; thus, the geospatial semantics of a place can be extracted from place descriptions. In this study, geotagged microblog short texts recorded in 2017 from ...
Yu Wang, Yining Sun, Zuchang Ma, Lisheng Gao and Yang Xu
Named Entity Recognition (NER) is a fundamental task in Natural Language Processing (NLP) and an initial step in building a Knowledge Graph (KG). Recently, BERT (Bidirectional Encoder Representations from Transformers), which is a pre-trained model,...
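To make the NER setting concrete, here is an illustrative token-classification sketch with a BERT encoder; the checkpoint, tag set, and example sentence are assumptions, and an untrained classification head produces arbitrary tags until it is fine-tuned on labelled entity data.

```python
# Sketch of BERT-based token classification for NER (not the paper's exact
# architecture). Tag names, checkpoint, and the example sentence are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

tags = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]
tok = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained("bert-base-cased", num_labels=len(tags))

sentence = "Ada Lovelace was born in London"
batch = tok(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**batch).logits          # shape: (1, seq_len, num_tags)
pred = logits.argmax(-1)[0]

# Print one predicted tag per word piece; fine-tuning on labelled entity spans
# is what would make these predictions meaningful.
for token, tag_id in zip(tok.convert_ids_to_tokens(batch["input_ids"][0]), pred):
    print(f"{token:12s} {tags[tag_id.item()]}")
```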
Ernie Hendrawaty (Department of Management, Faculty of Economics and Business, Universitas Lampung, Indonesia)
Pp. 71-80
Frederick H. Peens, Ernie H.G. Langner
1 page
Gojanovic, Tony; Jimenez, Ernie
Jim Underwood, Ernie Jordan
This paper examines Information Systems in the universities of New South Wales, Australia's most populous state. Rather than offering comprehensive coverage of all Information Systems courses in the sta...
Scott Contini; Ernie Croot; Igor E. Shparlinski
Pp. 983-996