On-line Access: 2024-08-27

Received: 2023-10-17

Revision Accepted: 2024-05-08


Frontiers of Information Technology & Electronic Engineering 

Accepted manuscript available online (unedited version)


Toward an accurate mobility trajectory recovery using contrastive learning


Author(s):  Yushan LIU, Yang CHEN, Jiayun ZHANG, Yu XIAO, Xin WANG

Affiliation(s):  Shanghai Key Lab of Intelligent Information Processing, School of Computer Science, Fudan University, China; more

Corresponding email(s):  chenyang@fudan.edu.cn

Key Words:  Human mobility; Mobility trajectory recovery; Contrastive learning



Yushan LIU, Yang CHEN, Jiayun ZHANG, Yu XIAO, Xin WANG. Toward an accurate mobility trajectory recovery using contrastive learning[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2300647

@article{FITEE.2300647,
title="Toward an accurate mobility trajectory recovery using contrastive learning",
author="Yushan LIU, Yang CHEN, Jiayun ZHANG, Yu XIAO, Xin WANG",
journal="Frontiers of Information Technology & Electronic Engineering",
year="in press",
publisher="Zhejiang University Press & Springer",
doi="https://doi.org/10.1631/FITEE.2300647"
}

%0 Journal Article
%T Toward an accurate mobility trajectory recovery using contrastive learning
%A Yushan LIU
%A Yang CHEN
%A Jiayun ZHANG
%A Yu XIAO
%A Xin WANG
%J Frontiers of Information Technology & Electronic Engineering
%P
%@ 2095-9184
%D in press
%I Zhejiang University Press & Springer
doi="https://doi.org/10.1631/FITEE.2300647"

TY - JOUR
T1 - Toward an accurate mobility trajectory recovery using contrastive learning
A1 - Yushan LIU
A1 - Yang CHEN
A1 - Jiayun ZHANG
A1 - Yu XIAO
A1 - Xin WANG
JO - Frontiers of Information Technology & Electronic Engineering
SP -
EP -
SN - 2095-9184
Y1 - in press
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.2300647
ER -


Abstract: 
Human mobility trajectories are fundamental resources for analyzing mobile behaviors in urban computing applications. However, these trajectories, typically collected from location-based services, often suffer from sparsity and irregularity in time. To support the development of mobile applications, there is a need to recover or estimate missing locations of unobserved time slots in these trajectories at a fine-grained spatial-temporal resolution. Existing methods for trajectory recovery rely on either individual user trajectories or collective mobility patterns from all users. The potential to combine individual and collective patterns for precise trajectory recovery remains unexplored. Additionally, current methods are sensitive to the heterogeneous temporal distributions of the observable trajectory segments. In this paper, we propose CLMove (where CL stands for contrastive learning), a novel model designed to capture multilevel mobility patterns and enhance robustness in trajectory recovery. CLMove features a two-stage location encoder that captures collective and individual mobility patterns. The graph neural network (GNN)-based networks in CLMove explore location transition patterns within a single trajectory and across various user trajectories. We also design a trajectory-level contrastive learning task to improve the robustness of the model. Extensive experimental results on three representative real-world datasets demonstrate that our CLMove model consistently outperforms state-of-the-art methods in terms of trajectory recovery accuracy.
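As context for the trajectory-level contrastive learning task mentioned in the abstract, the sketch below shows a generic InfoNCE-style objective over two augmented views of each trajectory embedding. This is a minimal illustration based on common contrastive-learning practice, not the paper's actual formulation; the PyTorch framework, the function name trajectory_info_nce, and the masking/augmentation scheme are all assumptions made for the example.

# Illustrative sketch (not the authors' code): a trajectory-level
# contrastive (InfoNCE-style) objective over two augmented views
# of each trajectory embedding.
import torch
import torch.nn.functional as F

def trajectory_info_nce(z1: torch.Tensor, z2: torch.Tensor,
                        temperature: float = 0.1) -> torch.Tensor:
    """z1, z2: (batch, dim) embeddings of two views of the same trajectories.
    Positive pairs are (z1[i], z2[i]); all other rows in the batch act as negatives."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature              # (batch, batch) cosine-similarity logits
    targets = torch.arange(z1.size(0), device=z1.device)
    # Symmetric cross-entropy: each view predicts its counterpart in the other view.
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

# Usage: in this sketch, z1 and z2 would come from encoding two masked or otherwise
# augmented versions of the same sparse trajectory with a (hypothetical) trajectory encoder.
loss = trajectory_info_nce(torch.randn(32, 128), torch.randn(32, 128))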



