ARTICLE
TITLE

Contrastive Learning for Graph-Based Vessel Trajectory Similarity Computation

Sizhe Luo, Weiming Zeng and Bowen Sun

Abstract

With the increasing popularity of automatic identification system (AIS) devices, mining latent vessel motion patterns from AIS data has become a hot topic in water transportation research. Trajectory similarity computation is fundamental to many maritime applications such as trajectory clustering, prediction, and anomaly detection. However, current non-learning-based methods face performance and efficiency issues, while learning-based methods are limited by the lack of labeled samples and of explicit spatial modeling, making it difficult to achieve optimal performance. To address these issues, we propose CLAIS, a contrastive learning framework for graph-based vessel trajectory similarity computation. A combined parameterized trajectory augmentation scheme is proposed to generate similar trajectory sample pairs, and a constructed spatial graph of the study region is pretrained to help model the input trajectory graph. A graph neural network encoder extracts spatial dependencies from the trajectory graph to learn better trajectory representations. Finally, a contrastive loss function is used to train the model in an unsupervised manner. We also propose an improved experimental protocol and three related metrics, and conduct extensive experiments to evaluate the performance of the proposed framework. The results validate the efficacy of the proposed framework in trajectory similarity computation.
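The abstract describes a three-step pipeline: generate augmented views of each trajectory, encode them with a graph neural network, and train with a contrastive loss in an unsupervised manner. Below is a minimal Python/PyTorch sketch of that general idea. It is not the authors' CLAIS implementation: the jitter-and-drop augmentation, the mean-pooling MLP standing in for the graph encoder, the NT-Xent loss, and all names and hyperparameters are illustrative assumptions.

# Minimal sketch of contrastive trajectory representation learning.
# Everything below is an illustrative placeholder, not the CLAIS model.
import torch
import torch.nn as nn
import torch.nn.functional as F

def augment(traj: torch.Tensor, noise_std: float = 0.01, drop_prob: float = 0.1) -> torch.Tensor:
    """Hypothetical augmentation: jitter trajectory points and randomly drop some of them."""
    keep = torch.rand(traj.shape[0]) > drop_prob
    keep[0] = True  # never drop every point
    jittered = traj + noise_std * torch.randn_like(traj)
    return jittered[keep]

class TrajectoryEncoder(nn.Module):
    """Stand-in for the graph neural network encoder: mean-pool points, then an MLP."""
    def __init__(self, in_dim: int = 2, hidden: int = 64, out_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, out_dim))

    def forward(self, traj: torch.Tensor) -> torch.Tensor:
        return self.net(traj.mean(dim=0))

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """NT-Xent contrastive loss over a batch of paired embeddings (2N views in total)."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, d), unit-norm embeddings
    sim = z @ z.t() / temperature                        # pairwise cosine similarities
    n = z1.shape[0]
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool), float("-inf"))  # ignore self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])    # positive index per view
    return F.cross_entropy(sim, targets)

if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy random-walk "trajectories" of 50 points in 2D (lon/lat stand-ins).
    trajectories = [torch.cumsum(torch.randn(50, 2) * 0.01, dim=0) for _ in range(8)]
    encoder = TrajectoryEncoder()
    optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)

    # One unsupervised training step: two augmented views per trajectory,
    # pulled together (and pushed away from other trajectories) by the contrastive loss.
    optimizer.zero_grad()
    z1 = torch.stack([encoder(augment(t)) for t in trajectories])
    z2 = torch.stack([encoder(augment(t)) for t in trajectories])
    loss = nt_xent_loss(z1, z2)
    loss.backward()
    optimizer.step()
    print(f"contrastive loss: {loss.item():.4f}")

In the paper's setting the encoder would operate on trajectory graphs built over the pretrained spatial graph of the study region; the stand-in above only keeps the two-view contrastive training loop recognizable.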

Similar Articles

 
Xiaodong Cui, Zhuofan He, Yangtao Xue, Keke Tang, Peican Zhu and Jing Han    
Underwater Acoustic Target Recognition (UATR) plays a crucial role in underwater detection devices. However, due to the difficulty and high cost of collecting data in the underwater environment, UATR still faces the problem of small datasets. Few-shot le...

 
Somaiyeh Dehghan and Mehmet Fatih Amasyali    
BERT, the most popular deep learning language model, has yielded breakthrough results in various NLP tasks. However, the semantic representation space learned by BERT has the property of anisotropy. Therefore, BERT needs to be fine-tuned for certain down...
Journal: Applied Sciences

 
Dawei Luo, Heng Zhou, Joonsoo Bae and Bom Yun    
Reliability and robustness are fundamental requisites for the successful integration of deep-learning models into real-world applications. Deployed models must exhibit an awareness of their limitations, necessitating the ability to discern out-of-distrib...
Journal: Applied Sciences

 
Yubo Zheng, Yingying Luo, Hengyi Shao, Lin Zhang and Lei Li    
Contrastive learning, as an unsupervised technique, has emerged as a prominent method in time series representation learning tasks, serving as a viable solution to the scarcity of annotated data. However, the application of data augmentation methods duri...
Journal: Applied Sciences

 
Esmaeil Zahedi, Mohamad Saraee, Fatemeh Sadat Masoumi and Mohsen Yazdinejad    
Unsupervised anomalous sound detection, especially self-supervised methods, plays a crucial role in differentiating unknown abnormal sounds of machines from normal sounds. Self-supervised learning can be divided into two main categories: Generative and C...
Journal: Algorithms