SITUATE: Indoor Human Trajectory Prediction through Geometric Features and Self-Supervised Vision Representation

1 Department of Engineering for Innovation Medicine, University of Verona, Italy
2 Visvesvaraya National Institute of Technology, Nagpur, India
🎉 Accepted @ ICPR 2024 🎉
SITUATE teaser
Examples of different trajectories from the Supermarket [10] dataset, illustrating the difficulty of the indoor trajectory prediction task. In particular, the dataset showcases long trajectories (Person 4), self-loops (Person 1 and Person 3), and confusing movements (Person 2), all performed in an environment that strongly constrains people's paths. In each trajectory, the red circle marks the starting point and the yellow star marks the final point.

Abstract

Patterns of human motion in outdoor and indoor environments are substantially different due to the scope of the scene and the typical intentions of people therein. While outdoor trajectory forecasting has received significant attention, indoor forecasting is still an underexplored research area. This paper proposes SITUATE, a novel approach to cope with indoor human trajectory prediction by leveraging equivariant and invariant geometric features and self-supervised vision representation. The geometric learning modules model the intrinsic symmetries and human movements inherent in indoor spaces. This concept becomes particularly important because self-loops at various scales and rapid direction changes often characterize indoor trajectories. On the other hand, the vision representation module is used to acquire spatial-semantic information about the environment to predict users' future locations more accurately. We evaluate our method through comprehensive experiments on the two most famous indoor trajectory forecasting datasets, i.e., THÖR and Supermarket, obtaining state-of-the-art performance. Furthermore, we also achieve competitive results in outdoor scenarios, showing that indoor-oriented forecasting models generalize better than outdoor-oriented ones.
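To make the pipeline described above concrete, the snippet below is a minimal sketch of the overall idea: translation-invariant geometric features (per-step displacements) of the observed trajectory are encoded by a recurrent module, fused with a scene embedding obtained from a frozen self-supervised vision backbone (e.g., a DINO-style ViT), and decoded autoregressively into future positions. All module names, layer sizes, and the fusion scheme are our assumptions for illustration, not the authors' actual SITUATE implementation.

import torch
import torch.nn as nn


class SITUATESketch(nn.Module):
    """Illustrative sketch (not the official model): geometric motion
    encoding + self-supervised scene embedding + recurrent decoder."""

    def __init__(self, scene_dim=384, hidden=128, pred_len=12):
        super().__init__()
        self.pred_len = pred_len
        # Encode per-step displacements (translation-invariant geometry,
        # which also captures self-loops and sharp direction changes).
        self.motion_encoder = nn.GRU(input_size=2, hidden_size=hidden, batch_first=True)
        # Project a precomputed self-supervised scene embedding
        # (assumed to come from a frozen ViT) into the same latent space.
        self.scene_proj = nn.Linear(scene_dim, hidden)
        self.decoder = nn.GRU(input_size=2, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, past_xy, scene_emb):
        # past_xy: (B, T_obs, 2) observed positions; scene_emb: (B, scene_dim)
        displ = past_xy[:, 1:] - past_xy[:, :-1]            # translation-invariant
        _, h = self.motion_encoder(displ)                   # (1, B, hidden)
        h = h + self.scene_proj(scene_emb).unsqueeze(0)     # fuse scene context
        # Autoregressively roll out future displacements.
        last = past_xy[:, -1:]                              # last observed position
        step = displ[:, -1:]                                # seed with last displacement
        preds = []
        for _ in range(self.pred_len):
            out, h = self.decoder(step, h)
            step = self.head(out)                           # next displacement
            last = last + step
            preds.append(last)
        return torch.cat(preds, dim=1)                      # (B, pred_len, 2)


if __name__ == "__main__":
    model = SITUATESketch()
    past = torch.randn(4, 8, 2)         # 8 observed steps for 4 people
    scene = torch.randn(4, 384)         # stand-in for a frozen ViT CLS embedding
    print(model(past, scene).shape)     # torch.Size([4, 12, 2])

The key design choice reflected here is that motion is encoded relative to itself (displacements rather than absolute coordinates), while absolute spatial-semantic context enters only through the scene embedding; the equivariant/invariant machinery of the actual paper is richer than this placeholder.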

BibTeX

@InProceedings{capogrosso2024situate,
    author    = {Capogrosso, Luigi and Toaiari, Andrea and Avogaro, Andrea and Khan, Uzair and Jivoji, Aditya and Fummi, Franco and Cristani, Marco},
    booktitle = {27th International Conference on Pattern Recognition (ICPR)},
    title     = {{SITUATE: Indoor Human Trajectory Prediction Through Geometric Features and Self-supervised Vision Representation}},
    year      = {2024},
    doi       = {10.1007/978-3-031-78444-6_24},
}