ARTICLE
Applied Sciences, Vol. 13, No. 3 (2023)

SelfCCL: Curriculum Contrastive Learning by Transferring Self-Taught Knowledge for Fine-Tuning BERT

Somaiyeh Dehghan and Mehmet Fatih Amasyali    

Abstract

BERT, the most popular deep learning language model, has yielded breakthrough results in various NLP tasks. However, the semantic representation space learned by BERT is anisotropic, so BERT must be fine-tuned for certain downstream tasks such as Semantic Textual Similarity (STS). To overcome this problem and improve the sentence representation space, several contrastive learning methods have been proposed for fine-tuning BERT. However, existing contrastive learning models do not consider the importance of input triplets, in terms of easy and hard negatives, during training. In this paper, we propose SelfCCL, a Curriculum Contrastive Learning model that transfers self-taught knowledge for fine-tuning BERT. It mimics two ways in which humans learn about the world around them: contrastive learning and curriculum learning. The former learns by contrasting similar and dissimilar samples; the latter is inspired by the way humans progress from the simplest concepts to the most complex ones. Our model performs this training by transferring self-taught knowledge: it determines which triplets are easy or hard based on previously learned knowledge, and then trains on those triplets in curriculum order using a contrastive objective. We apply the proposed model to the BERT and Sentence-BERT (SBERT) frameworks. Evaluation of SelfCCL on the standard STS and SentEval transfer learning tasks shows that combining curriculum learning with contrastive learning yields a modest improvement in average performance.
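
Below is a minimal PyTorch sketch of the training scheme the abstract describes, not the authors' implementation: the TinyEncoder stand-in, the random toy triplets, the triplet margin loss, and the batch size are all illustrative assumptions (SelfCCL itself fine-tunes BERT/SBERT with its own contrastive objective and data). The sketch shows the two key steps: a frozen copy of the pre-fine-tuning model scores the difficulty of each (anchor, positive, negative) triplet, and the trainable model then consumes the triplets in easy-to-hard order.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyEncoder(nn.Module):
    """Stand-in for BERT/SBERT: maps token-id sequences to one unit-norm embedding."""
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.emb = nn.EmbeddingBag(vocab_size, dim)  # mean-pools token embeddings
        self.proj = nn.Linear(dim, dim)

    def forward(self, ids):
        return F.normalize(self.proj(self.emb(ids)), dim=-1)

def margin_score(scorer, anchors, positives, negatives):
    """Self-taught difficulty: sim(a, p) - sim(a, n) under the frozen scorer.
    A small or negative margin means the negative is hard; a large one, easy."""
    with torch.no_grad():
        a, p, n = scorer(anchors), scorer(positives), scorer(negatives)
        return (a * p).sum(-1) - (a * n).sum(-1)

# Toy data: 32 triplets of token-id "sentences" of length 8 (hypothetical).
torch.manual_seed(0)
anchors   = torch.randint(0, 1000, (32, 8))
positives = torch.randint(0, 1000, (32, 8))
negatives = torch.randint(0, 1000, (32, 8))

scorer = TinyEncoder()  # frozen model supplying the "self-taught" knowledge
model  = TinyEncoder()  # model being fine-tuned
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.TripletMarginWithDistanceLoss(
    distance_function=lambda x, y: 1 - F.cosine_similarity(x, y), margin=0.5)

# Curriculum: order triplets from largest margin (easy) to smallest (hard).
order = torch.argsort(margin_score(scorer, anchors, positives, negatives),
                      descending=True)
for idx in order.split(8):  # mini-batches in curriculum order
    a, p, n = model(anchors[idx]), model(positives[idx]), model(negatives[idx])
    loss = loss_fn(a, p, n)
    opt.zero_grad()
    loss.backward()
    opt.step()

Using the frozen scorer's similarity margin as the difficulty signal mirrors the abstract's idea of deciding which triplets are easy or hard from previously learned knowledge; any monotone ranking of triplet hardness would slot into the same loop.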

Similar articles

Xiaodong Cui, Zhuofan He, Yangtao Xue, Keke Tang, Peican Zhu and Jing Han    
Underwater Acoustic Target Recognition (UATR) plays a crucial role in underwater detection devices. However, due to the difficulty and high cost of collecting data in the underwater environment, UATR still faces the problem of small datasets. Few-shot le...

 
Dawei Luo, Heng Zhou, Joonsoo Bae and Bom Yun    
Reliability and robustness are fundamental requisites for the successful integration of deep-learning models into real-world applications. Deployed models must exhibit an awareness of their limitations, necessitating the ability to discern out-of-distrib...
Journal: Applied Sciences

 
Yubo Zheng, Yingying Luo, Hengyi Shao, Lin Zhang and Lei Li    
Contrastive learning, as an unsupervised technique, has emerged as a prominent method in time series representation learning tasks, serving as a viable solution to the scarcity of annotated data. However, the application of data augmentation methods duri...
Journal: Applied Sciences

 
Esmaeil Zahedi, Mohamad Saraee, Fatemeh Sadat Masoumi and Mohsen Yazdinejad    
Unsupervised anomalous sound detection, especially self-supervised methods, plays a crucial role in differentiating unknown abnormal sounds of machines from normal sounds. Self-supervised learning can be divided into two main categories: Generative and C...
Journal: Algorithms

 
Ji Zhang, Xiangze Jia, Zhen Wang, Yonglong Luo, Fulong Chen, Gaoming Yang and Lihui Zhao    
Skeleton-based action recognition depends on skeleton sequences to detect categories of human actions. In skeleton-based action recognition, the recognition of action scenes with more than one subject is named interaction recognition. Different from t...
Journal: Algorithms