Datasets for localization and human-robot interaction

We present a novel dataset for the evaluation of visual place recognition in both indoor and outdoor environments, together with sensor information to evaluate human-robot interaction in crowded areas. The datasets were recorded in the Royal Alcázar of Seville (Spain). We recorded a large set of image sequences from a stereo camera and scan measurements from three lasers mounted on a moving robot.

The datasets are timestamped and stored using the well-known Robot Operating System (ROS) logging functionality. The robot traveled more than one kilometer in each experiment, and every trial was performed at a different time of day so that we could capture the evolution of lighting conditions in the images. Tourist attendance also depends on the hour, so the datasets provide many examples for modeling, from a social point of view, different situations such as corridors, gates, queues, and groups of people.

A data paper describing these datasets is available here.

Data Description

Each of the 10 datasets is stored separately in three files: one for sensor measurements, another for the raw RGB images, and a third with the rectified grayscale images. All the datasets are logged and processed using ROS tools such as rosbag. The information stored in the logs also follows standard ROS interfaces and development guidelines, so that a reader with minimal ROS background can easily understand the dataset.

The three log files per dataset contain the following information:

  • Robot Odometry

  • Laser Measurements

  • Transformations between Sensors

  • Camera Info

  • Raw RGB Image

  • Rectified Image

System and Experiments Overview

The robot platform used for the datasets is a Pioneer 3AT with a simple aluminum structure to place the sensors and the computer:

  • A stereo camera facing forward at 1.2 m height.
  • Two Hokuyo UTM-30LX lasers mounted parallel to the floor, facing forward and backward.
  • A Hokuyo URG-04LX tilted 30º in front of the robot.
  • Encoders in the robot base for odometry computation.
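The base encoders feed a standard dead-reckoning pose estimate. A minimal sketch of that computation, using a differential-drive approximation and hypothetical wheel geometry and encoder resolution (the true Pioneer 3AT values are not given in this description):

```python
import math

# Hypothetical parameters -- the actual Pioneer 3AT wheel geometry
# and encoder resolution are not stated in the dataset description.
WHEEL_RADIUS = 0.11    # wheel radius (m)
TRACK_WIDTH = 0.40     # distance between left and right wheels (m)
TICKS_PER_REV = 500    # encoder ticks per wheel revolution

def update_odometry(pose, left_ticks, right_ticks):
    """Dead-reckon a new (x, y, theta) pose from one encoder interval."""
    x, y, theta = pose
    dist_per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = left_ticks * dist_per_tick
    d_right = right_ticks * dist_per_tick
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / TRACK_WIDTH
    # Advance along the heading at the midpoint of the turn.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return (x, y, theta)

# Driving straight: equal tick counts on both wheels.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = update_odometry(pose, 50, 50)
```

In the logs this estimate appears as the standard ROS odometry topic; the sketch only illustrates the underlying computation.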

The datasets presented in this work represent a valuable set of images and range measurements that can be used to study several human-robot behaviours and interactions.

The scenario considered in this work is a crowded, mostly planar area. The tilted laser is included so that small steps and ramps can be detected and the software can determine whether the robot is able to cross the area. Images gathered by both the left and right cameras make the dataset interesting for testing loop-closure and kidnapping-recovery algorithms based on scene recognition. These datasets also offer the possibility of studying daylight variation in a mixed indoor-outdoor structured environment and of using this information to improve localization accuracy. The dataset is also useful for testing algorithms that remove illumination effects from scenes.
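The step/ramp check with the tilted laser can be sketched by projecting each return onto the vertical axis and flagging deviations from the floor plane. This assumes the 30º tilt points downward and uses a hypothetical mounting height, which is not given in this description:

```python
import math

# Assumed geometry: the URG-04LX tilt is 30 degrees downward and the
# mounting height is a hypothetical value for illustration.
TILT = math.radians(30.0)  # downward tilt of the front laser
SENSOR_HEIGHT = 0.30       # assumed mounting height above the floor (m)

def hit_height(range_m, tilt=TILT, sensor_height=SENSOR_HEIGHT):
    """Height of a laser return above the floor plane (positive = step up)."""
    return sensor_height - range_m * math.sin(tilt)

def is_step(range_m, threshold=0.05):
    """Flag returns deviating more than `threshold` from the floor plane."""
    return abs(hit_height(range_m)) > threshold

# On flat ground the expected range is SENSOR_HEIGHT / sin(TILT) = 0.6 m;
# shorter returns indicate a step up, longer returns a step down.
flat_range = SENSOR_HEIGHT / math.sin(TILT)
```

A real traversability check would accumulate these heights over consecutive scans and robot poses, but the per-beam geometry is as above.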


| Dataset | Distance (m) | Duration (mm:ss) | Images | Range measurements (frontal laser) |
|---------|--------------|------------------|--------|------------------------------------|
| 1       | 1899.88      | 30:15            | 8170   | 72644                              |
| 2       | 1820.02      | 29:10            | 7876   | 70046                              |
| 3       | 1666.00      | 26:42            | 7212   | 64238                              |
| 4       | 1845.66      | 29:41            | 8015   | 71417                              |
| 5       | 1970.69      | 31:43            | 8567   | 76088                              |
| 6       | 1824.06      | 29:09            | 7870   | 70062                              |
| 7       | 1569.47      | 25:16            | 6825   | 60758                              |
| 8       | 1843.57      | 29:29            | 7963   | 70923                              |
| 9       | 1894.00      | 30:16            | 8176   | 72833                              |
| 10      | 1864.08      | 29:57            | 8087   | 71945                              |

In dataset 8, the back laser crashed after 914 s of execution.
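As a quick sanity check on the per-run statistics, the average speed of each run can be computed from the logged distance and duration (given as mm:ss). A minimal sketch:

```python
def avg_speed(distance_m, duration):
    """Average speed in m/s from a distance and an 'mm:ss' duration string."""
    minutes, seconds = duration.split(":")
    total_s = int(minutes) * 60 + int(seconds)
    return distance_m / total_s

# First run: 1899.88 m in 30:15, roughly 1.05 m/s.
speed = avg_speed(1899.88, "30:15")
```

All runs come out close to 1 m/s, consistent with a robot moving at walking pace through crowds.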
