Datasets of an indoor scenario with artificial smoke

We present a dataset obtained in an underground location under reduced-visibility conditions (artificial smoke). The set consists of logs recorded with different levels of smoke. In addition, we provide a partial ground truth and localization baselines for the platforms, which can be used to test localization and SLAM algorithms.

The datasets are timestamped and stored using the well-known Robot Operating System (ROS) bag format. The contents of the different bags are detailed in the contents section. Localizing the platform in such an environment is a hard challenge, as it is GPS-denied; we therefore provide a ground-truth reference for comparing the results of different methods.


All datasets and benchmarks on this page are copyright by us and published under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. This means that you must attribute the work in the manner specified by the authors, you may not use this work for commercial purposes and if you alter, transform, or build upon this work, you may distribute the resulting work only under the same license.


Please cite our paper on radar data fusion if you use the dataset in your research; it would be more than welcome!

Plain text

Alejo, D., Rey, R., Cobano, J. A., Merino, L., and Caballero, F. "Data fusion of low-cost RADAR and LIDAR for reliable ground robot localization under low-visibility conditions". Submitted to the International Conference on Robotics and Automation (ICRA 2022), pp. 1-7.

NIx System Overview

The NIx robotic platform was used to record the presented datasets. It was designed specifically by the Portuguese company IdMind for the Esmera Energy Challenge E2.A1, "Control and Inspection system for offshore platform crawler". It is a Raposa tracked robot with a tilting mechanism that allows it to climb stairs. Additionally, it is equipped with a NIx ATEX navigation tower. The NIx platform has the following components:

  • Encoders in the robot base for odometry computation.

  • Two RGB cameras, one in the front and one in the rear of the platform.

  • Four Texas Instruments IWR6843 intelligent mmWave antenna-on-package RADAR sensors.

  • One LiDAR sensor: Ouster OS-1-16.

  • Weight: 40 kg

  • Battery autonomy: 3 hours

  • Maximum velocity: 0.75 m/s

  • Acceleration: 1 m/s²

  • Dimensions of the platform (Height x Width x Length): 60 x 40 x 20 cm

  • Dimensions of the navigation tower (Height x Width x Length): 16 x 16 x 40 cm


The experiments presented in this paper were carried out in the basements of Pablo de Olavide University, in a rectangular area of 12 x 6 meters.

In order to obtain an external and accurate estimate of the position of the robot, we installed an array of Augmented Reality (AR) markers on the floor of the experimental area, placed approximately 70 cm apart and forming a quasi-regular grid.

With this method, we can estimate the pose of the robot each time an AR marker is detected, with a worst-case precision of 10 centimeters and 0.1 radians. In addition, to obtain a more continuous estimate, we use graph optimization to estimate the poses between AR detections from odometry information.
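The ground truth itself comes from graph optimization; as a much simpler illustration of the underlying idea, the sketch below (an assumption, not the authors' implementation) anchors an odometry segment between two consecutive AR fixes by distributing the endpoint drift linearly along the segment.

```python
def correct_segment(odom_xy, fix_start, fix_end):
    """Anchor a run of odometry (x, y) positions between two AR fixes.

    Simplified stand-in for the graph optimization used for the dataset's
    ground truth: the segment is shifted to start at the first AR fix and
    the residual drift at the end is spread proportionally along it.
    """
    n = len(odom_xy)
    # Shift the whole segment so it starts exactly at the first AR fix.
    dx0 = fix_start[0] - odom_xy[0][0]
    dy0 = fix_start[1] - odom_xy[0][1]
    shifted = [(x + dx0, y + dy0) for x, y in odom_xy]
    # Residual drift accumulated by the odometry at the segment end.
    ex = fix_end[0] - shifted[-1][0]
    ey = fix_end[1] - shifted[-1][1]
    # Spread the residual proportionally over the segment.
    return [(x + ex * i / (n - 1), y + ey * i / (n - 1))
            for i, (x, y) in enumerate(shifted)]
```

A full pose-graph formulation would additionally optimize orientation and weight each constraint by its uncertainty; this linear distribution only captures the position-correction intuition.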

Data Description

Each of the provided datasets is stored entirely in a single large bag file, which includes the sensor measurements and the odometry. All datasets are logged and processed with ROS tools such as rosbag. Most of the information stored in the logs uses standard ROS messages and follows the main ROS development guidelines, so a reader with a minimal ROS background can easily understand the dataset.

Each bag file stores the following data:

  • Robot odometry

  • Compressed RGB images

  • LiDAR measurements

  • RADAR measurements

  • Transform information describing the disposition of the sensors

  • Ground-truth trajectory (estimated from the deployed AR markers)
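A bag can be inspected programmatically with the ROS1 rosbag Python API. The sketch below iterates over one topic and counts its messages; the topic name is a placeholder (check the real names with `rosbag info <file>.bag`), and the rosbag import requires a ROS1 installation, so it is done lazily.

```python
def stamp_to_sec(secs, nsecs):
    """Convert a ROS header stamp (secs, nsecs) to float seconds."""
    return secs + nsecs * 1e-9

def count_messages(bag_path, topic):
    """Count the messages on one topic of a bag file."""
    import rosbag  # requires a ROS1 installation

    count = 0
    with rosbag.Bag(bag_path) as bag:
        for _topic, _msg, _t in bag.read_messages(topics=[topic]):
            count += 1
    return count

# Example usage (hypothetical file and topic names):
# n_odom = count_messages("experiment1.bag", "/odom")
```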


Experiment 1

Has a duration of 15:42 (mm:ss) and contains 26,000+ images, 15,000+ RADAR scans per sensor, and 9,000+ LiDAR scans.

Contents Download

Experiment 2

Has a duration of 11:48 (mm:ss) and contains 17,000+ images, 11,000+ RADAR scans per sensor, and 7,000+ LiDAR scans.

Contents Download

Experiment 3

Has a duration of 8:06 (mm:ss) and contains 12,000+ images, 8,000+ RADAR scans per sensor, and 4,800+ LiDAR scans.

Contents Download

Experiment 4

Has a duration of 7:34 (mm:ss) and contains 11,000+ images, 6,000+ RADAR scans per sensor, and 4,500+ LiDAR scans.

Contents Download

Experiment 5

Has a duration of 5:35 (mm:ss) and contains 8,000+ images, 5,800+ RADAR scans per sensor, and 3,300+ LiDAR scans.

Contents Download

Experiment 6

Has a duration of 4:27 (mm:ss) and contains 7,000+ images, 4,000+ RADAR scans per sensor, and 2,600+ LiDAR scans.

Contents Download

Experiment 7

Has a duration of 2:48 (mm:ss) and contains 4,300+ images, 1,600+ RADAR scans per sensor, and 1,600+ LiDAR scans.

Contents Download
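As a quick sanity check on the figures above, the average message rate of each sensor can be derived from the counts and durations (read as mm:ss). The helper below is a small illustrative snippet, not part of the dataset tooling; for Experiment 1, the 9,000+ LiDAR scans over 15:42 work out to roughly 9.6 Hz, consistent with a 10 Hz LiDAR configuration.

```python
def mmss_to_seconds(mmss):
    """Convert a 'mm:ss' duration string to seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

def approx_rate_hz(message_count, mmss):
    """Approximate average message rate over an experiment."""
    return message_count / mmss_to_seconds(mmss)

# Experiment 1: 9,000+ LiDAR scans over a 15:42 run.
print(round(approx_rate_hz(9000, "15:42"), 1))  # ~9.6 Hz
```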


  • 2021-09-22: First version