We are excited to announce AgentMove, an LLM-based agentic framework designed for zero-shot mobility prediction. Leveraging the world knowledge and sequential modeling capabilities of LLMs, AgentMove paves the way for a promising new direction in mobility prediction.
DeepMove
PyTorch implementation of the WWW'18 paper "DeepMove: Predicting Human Mobility with Attentional Recurrent Networks" (paper link).
Datasets
The sample data used to evaluate our model can be found in the data folder; it contains 800+ users and is ready for direct use. Raw mobility data similar to that used in the paper can be found at this public link.
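For a quick first look at the sample data, the snippet below loads it with pickle and prints its top-level structure. The file name `data/foursquare.pk` and the assumption that the file is a Python pickle are guesses for illustration; adjust the path to whatever file is actually shipped in the data folder.

```python
import pickle

# Hypothetical path: change this to the actual file name in the data folder.
DATA_PATH = "data/foursquare.pk"

with open(DATA_PATH, "rb") as f:
    # If the pickle was created under Python 2, the latin-1 encoding
    # avoids decode errors when loading it under Python 3.
    dataset = pickle.load(f, encoding="latin-1")

# Inspect the top-level structure before training: per-user trajectories
# and the location vocabulary are typically stored in a dictionary.
print(type(dataset))
if isinstance(dataset, dict):
    print(list(dataset.keys()))
```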
The code contains four network models (simple, simple_long, attn_avg_long_user, attn_local_long) and a baseline model (Markov). The parameter settings for these models can be found in their respective res.txt files; a minimal sketch of the Markov baseline idea is shown below.
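The following is a minimal, self-contained sketch of the first-order Markov baseline idea: predict the most frequently observed successor of the current location. It illustrates the technique only; it is not the repository's implementation, and the function names are hypothetical.

```python
from collections import Counter, defaultdict

def fit_markov(trajectories):
    """Count location-to-location transitions across all training trajectories."""
    transitions = defaultdict(Counter)
    for traj in trajectories:
        for prev_loc, next_loc in zip(traj, traj[1:]):
            transitions[prev_loc][next_loc] += 1
    return transitions

def predict_next(transitions, current_loc):
    """Return the most frequent successor of current_loc, or None if unseen."""
    if current_loc not in transitions:
        return None
    return transitions[current_loc].most_common(1)[0][0]

# Toy usage: three short location-id trajectories for one user.
history = [[1, 2, 3, 2], [2, 3, 4], [1, 2, 3]]
model = fit_markov(history)
print(predict_next(model, 2))  # -> 3, the most common successor of location 2
```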
If you find this work helpful, please cite our paper.
@inproceedings{feng2018deepmove,
title={Deepmove: Predicting human mobility with attentional recurrent networks},
author={Feng, Jie and Li, Yong and Zhang, Chao and Sun, Funing and Meng, Fanchao and Guo, Ang and Jin, Depeng},
booktitle={Proceedings of the 2018 World Wide Web Conference},
pages={1459--1468},
year={2018}
}