MetaTraj: Meta-Learning for Cross-Scene Cross-Object Trajectory Prediction
2023 (English). In: IEEE Transactions on Intelligent Transportation Systems, ISSN 1524-9050, E-ISSN 1558-0016. Article in journal (Refereed). Published.
Abstract [en]
Long-term pedestrian trajectory prediction in crowds is highly valuable for safe driving and social robot navigation. Recent research on trajectory prediction usually focuses on modeling social interactions, physical constraints, and the multi-modality of futures, without considering the generalization of prediction models to other scenes and objects, which is critical for real-world applications. In this paper, we propose a general framework that enables trajectory prediction models to transfer well across unseen scenes and objects by quickly learning the prior information of trajectories. Trajectory sequences are closely related to the surrounding environment (e.g., exits, roads, buildings, entrances) and to the objects themselves (e.g., pedestrians, bicycles, vehicles). We argue that this trajectory information varies across scenes and objects, causing a trained prediction model to perform poorly on unseen target data. To address this, we introduce MetaTraj, which contains carefully designed sub-tasks and meta-tasks to learn prior information of trajectories related to scenes and objects, which then contributes to accurate long-term future prediction. Both sub-tasks and meta-tasks are generated from trajectory sequences effortlessly and can be easily integrated into many prediction models. Extensive experiments over several trajectory prediction benchmarks demonstrate that MetaTraj can be applied to multiple prediction models and enables them to generalize well to unseen scenes and objects.
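The abstract describes adapting a predictor to a new scene or object class via sub-tasks and meta-tasks drawn from trajectory sequences. MetaTraj's exact task construction and model architecture are given in the paper; below is only a generic, first-order MAML-style sketch of the idea, using a toy linear next-step predictor and NumPy. Each "task" here is a hypothetical scene with its own motion bias, split into a support segment (sub-task, used for fast inner-loop adaptation) and a query segment (meta-task, used for the outer meta-update). All function names and the data generator are illustrative assumptions, not the authors' code.

```python
import numpy as np

def predict(w, x):
    # Toy linear predictor: next-step displacement = features @ w
    return x @ w

def loss(w, x, y):
    # Mean squared error between predicted and observed displacements
    return np.mean((predict(w, x) - y) ** 2)

def grad(w, x, y):
    # Analytic gradient of the MSE loss w.r.t. w
    return 2.0 * x.T @ (predict(w, x) - y) / len(x)

def maml_step(w, tasks, inner_lr=0.05, outer_lr=0.05, inner_steps=1):
    """One meta-update: adapt on each task's support (sub-task) segment,
    then accumulate the query (meta-task) gradient at the adapted weights
    (first-order approximation, ignoring second derivatives)."""
    meta_grad = np.zeros_like(w)
    for (x_s, y_s), (x_q, y_q) in tasks:
        w_adapted = w.copy()
        for _ in range(inner_steps):
            w_adapted -= inner_lr * grad(w_adapted, x_s, y_s)
        meta_grad += grad(w_adapted, x_q, y_q)
    return w - outer_lr * meta_grad / len(tasks)

rng = np.random.default_rng(0)

def make_task(bias):
    # Hypothetical "scene": 2-D motion features -> next displacement,
    # with a scene-specific bias on the second feature
    x = rng.normal(size=(20, 2))
    y = x @ np.array([1.0, bias]) + 0.01 * rng.normal(size=20)
    return (x[:10], y[:10]), (x[10:], y[10:])  # support / query split

# Meta-train over many scenes with varying biases
w_meta = np.zeros(2)
for _ in range(200):
    batch = [make_task(rng.uniform(-1.0, 1.0)) for _ in range(4)]
    w_meta = maml_step(w_meta, batch)

# Unseen scene: a few inner steps on the support set should cut query loss
(support_x, support_y), (query_x, query_y) = make_task(bias=0.8)
loss_before = loss(w_meta, query_x, query_y)
w_new = w_meta.copy()
for _ in range(5):
    w_new -= 0.05 * grad(w_new, support_x, support_y)
loss_after = loss(w_new, query_x, query_y)
```

The point of the sketch is only the two-level loop: the inner loop mimics quickly absorbing scene-specific trajectory priors from a sub-task, while the outer loop moves the shared initialization so that this quick adaptation works on held-out meta-task segments.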
Place, publisher, year, edition, pages
IEEE, 2023.
Keywords [en]
Trajectory prediction, transfer learning, cross-scene, cross-object, meta learning
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:mdh:diva-64166
DOI: 10.1109/TITS.2023.3299112
ISI: 001051283900001
Scopus ID: 2-s2.0-85167800997
OAI: oai:DiVA.org:mdh-64166
DiVA id: diva2:1794626
2023-09-06. Bibliographically approved.