DiTer: Diverse Terrain and Multimodal Dataset for Field Robot Navigation in Outdoor Environments

IEEE Sensors Letters
Spatial AI and Robotics (SPARO) Lab, Inha University, South Korea

TL;DR Multimodal, multi-terrain dataset for navigation in challenging field environments!

Abstract

Field robots require autonomy across diverse environments to navigate and map their surroundings efficiently. However, the lack of diverse and comprehensive datasets hinders the development and evaluation of autonomous field robots. To address this gap, we present a multimodal, multi-session, diverse-terrain dataset for ground mapping with field robots. We use a quadrupedal robot as the base platform, and the dataset covers various terrain types, such as sandy roads, vegetation, and slopes. The sensor suite comprises a ground-facing RGB-D camera, an RGB camera, a thermal camera, light detection and ranging (LiDAR), an inertial measurement unit (IMU), and a global positioning system (GPS) receiver. In addition to reference trajectories for each sequence, we provide global maps generated with LiDAR-based simultaneous localization and mapping (SLAM). Finally, we assess the dataset from a terrain perspective and build fusion maps, such as thermal-LiDAR and RGB-LiDAR maps, to exploit information beyond the visible spectrum.
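As an illustration of how a thermal-LiDAR (or RGB-LiDAR) fusion map might be assembled, the sketch below projects LiDAR points into a calibrated camera image via a pinhole model and samples a per-point value. The extrinsic transform, intrinsics, and depth threshold here are illustrative placeholders, not the dataset's actual calibration.

```python
import numpy as np

def colorize_points(points_lidar, T_cam_lidar, K, image):
    """Project LiDAR points into a camera image and sample per-point values.

    points_lidar : (N, 3) points in the LiDAR frame
    T_cam_lidar  : (4, 4) extrinsic transform, LiDAR frame -> camera frame
    K            : (3, 3) pinhole intrinsic matrix
    image        : (H, W) image (e.g. thermal) to sample from
    Returns the camera-frame points that land in the image and their values.
    """
    n = points_lidar.shape[0]
    # Homogeneous coordinates, then transform into the camera frame.
    pts_h = np.hstack([points_lidar, np.ones((n, 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    # Keep only points in front of the camera (0.1 m is an assumed cutoff).
    front = pts_cam[:, 2] > 0.1
    pts_cam = pts_cam[front]
    # Pinhole projection to pixel coordinates.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    px = np.round(uv).astype(int)
    # Discard projections that fall outside the image bounds.
    h, w = image.shape[:2]
    inside = (px[:, 0] >= 0) & (px[:, 0] < w) & (px[:, 1] >= 0) & (px[:, 1] < h)
    values = image[px[inside, 1], px[inside, 0]]
    return pts_cam[inside], values
```

Accumulating these colorized points across a SLAM-estimated trajectory would yield a global fusion map in the spirit described above.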