TAIL

A Terrain-Aware Multi-Modal SLAM Dataset in Deformable Granular Environments

TAIL (Terrain-Aware MultI-ModeL) SLAM dataset

Terrain-aware perception holds the potential to improve the robustness and accuracy of autonomous robot navigation in unstructured environments, thereby facilitating effective off-road traversal. The TAIL (Terrain-Aware MultI-ModeL) dataset is proposed to support research on navigation across deformable, granular terrains using wheeled and legged robots. The overall goal of the TAIL dataset is to help develop SLAM techniques for different robot platforms in unstructured, deformable sandy terrains.

Main characteristics

  • The recorded data come from multiple sensors, including a 3D LiDAR, a stereo frame camera, three ground-pointing RGB-D cameras, an IMU, and an RTK-GPS receiver. Moreover, the dataset provides kinematic parameters of both wheeled and quadruped robots within similar scenes while accounting for their distinct motion characteristics. Detailed information is shown in [Dataset System].

  • The data were collected on two beaches at Double-Moon Bay, covering a wide range of environmental perception data (surrounding and ground-pointing), terrain complexities (texture-less sandy soil, fine sand, coarse sand), and scene changes (illumination, moving objects, flowing sand). More related information is shown in [Dataset Description].

  • Several state-of-the-art (SOTA) SLAM algorithms are benchmarked on the TAIL dataset, and their performance is analyzed against the provided ground truth.

  • The TAIL dataset and related resources will be released publicly on this website.

News

  • 2024-5-15: The TAIL dataset has been accepted to IEEE Robotics and Automation Letters.
  • 2024-5-13: The TAIL-Plus dataset has been presented at ICRA 2024 Workshop on Field Robotics.
  • 2024-5-5: The download links are available.
  • 2024-3-26: The preprint of our paper is available on arXiv.
  • 2024-3-25: Our dataset website is released.
  • 2023-6-25: Dataset collection begins.

Publications

If you use our dataset, we would appreciate it if you cite our papers.

TAIL:

@article{tail2023yao,
  title = {TAIL: A Terrain-Aware Multi-Modal SLAM Dataset for Robot Locomotion in Deformable Granular Environments},
  author = {Yao, Chen and Ge, Yangtao and Shi, Guowei and Wang, Zirui and Yang, Ningbo and Zhu, Zheng and Wei, Hexiang and Zhao, Yuntian and Wu, Jing and Jia, Zhenzhong},
  journal = {arXiv preprint arXiv:2403.16875},
  year = {2024}
}

TAIL-Plus:

@inproceedings{plus2024wang,
  title = {Are We Ready for Planetary Exploration Robots? The TAIL-Plus Dataset for SLAM in Granular Environments},
  author = {Wang, Zirui and Yao, Chen and Ge, Yangtao and Shi, Guowei and Yang, Ningbo and Zhu, Zheng and Dong, Kewei and Wei, Hexiang and Jia, Zhenzhong and Wu, Jing},
  booktitle = {IEEE ICRA 2024 Workshop on Field Robotics},
  year = {2024}
}

License

This work is released under the GPLv3 license. For commercial use, please contact Chen Yao.