Here you can find some relevant projects in which I am or have been involved.

Currently Funded Projects as PI


SIAR: Sewer Inspection Autonomous Robot (ECHORD++)

The SIAR project will develop a fully autonomous ground robot able to navigate and inspect the sewer system with minimal human intervention, while allowing manual control of the vehicle or the sensor payload when required. The project uses IDMind's robot platform RaposaNG as its starting point. A new robot will be built based on this know-how, with the following key steps beyond the state of the art required to properly address the challenge: a robust IP67 robot frame designed to work in the harshest environmental conditions, with increased power autonomy and flexible inspection capabilities; robust and increased communication capabilities; onboard autonomous navigation and inspection capabilities; and usability and cost-effectiveness of the developed solution.

UPO leads the navigation tasks in the project.

  • Running period: 2016-2018
  • Funding agency: European Commission, FP7


TERESA: Telepresence Reinforcement-Learning Social Agent

TERESA aims to develop a telepresence robot of unprecedented social intelligence, thereby helping to pave the way for the deployment of robots in settings such as homes, schools, and hospitals that require substantial human interaction. In telepresence systems, a human controller remotely interacts with people by guiding a remotely located robot, allowing the controller to be more physically present than with standard teleconferencing. We will develop a new telepresence system that frees the controller from low-level decisions regarding navigation and body pose in social settings. Instead, TERESA will have the social intelligence to perform these functions automatically.

The project’s main result will be a new partially autonomous telepresence system with the capacity to make socially intelligent low-level decisions for the controller. Sometimes this requires mimicking the human controller (e.g., nodding the head) by translating human behavior to a form suitable for a robot. Other times, it requires generating novel behaviors (e.g., turning to look at the speaker) expected of a mobile robot but not exhibited by a stationary controller. TERESA will semi-autonomously navigate among groups, maintain face-to-face contact during conversations, and display appropriate body-pose behavior.

Achieving these goals requires advancing the state of the art in cognitive robotic systems. The project will not only generate new insights into socially normative robot behavior, it will produce new algorithms for interpreting social behavior, navigating in human-inhabited environments, and controlling body poses in a socially intelligent way.

The project culminates in the deployment of TERESA in an elderly day center. Because such day centers are a primary social outlet, many people become isolated when they cannot travel to them, e.g., due to illness. TERESA’s socially intelligent telepresence capabilities will enable them to continue social participation remotely.

I lead WP4 within TERESA, "Socially Navigating the Environment", which deals with the development of the navigation stack (including localization, path planning, and execution) required to enable enhanced, socially normative, autonomous navigation of the telepresence robot.

  • Running period: 2013-2016
  • Funding agency: European Commission, FP7

OCELLIMAV: New sensing and navigation systems based on Drosophila's ocelli for Micro Aerial Vehicles

Micro unmanned Aerial Vehicles (MAVs) may open up a plethora of new applications for aerial robotics, both indoors and outdoors. However, the limited payload of these vehicles restricts the sensors and processing power that MAVs can carry, and thus the level of autonomy they can achieve without relying on external sensing and processing. Flying insects like Drosophila, on the other hand, can carry out impressive maneuvers with a relatively small neural system. This project will explore the biological fundamentals of Drosophila's ocelli sensory-motor system, one of the mechanisms most likely related to fly stabilization, and the possibility of deriving new sensing and navigation systems for MAVs from it.

The project will do this through a complete reverse engineering of the ocelli system, estimating the structure and functionality of its neural processing network and then modeling it through the interaction of biological and engineering research. Novel genetics-based neural tracing methods will be employed to extract the topology of the neural network, and behavioral experiments will be devised to determine the functionalities of specific neurons. The findings will be used to derive a model, which will then be used to characterize the relevant aspects from the point of view of estimation and control. The model will also serve to determine how this sensory-motor system can be adapted to current MAV platforms, and to design a proof-of-concept sensing and navigation system.

The expected results of the project are two-fold: to corroborate or challenge current assumptions on the use of ocelli in flies, and to create a proof-of-concept device for a MAV system based on the project's findings.

  • Running period: 2015-2017
  • Funding agency: Spanish Ministry of Economy

PAIS-MultiRobot: Perception and Action under Uncertainties in Multi-Robot Systems

The project's main objective is the development of efficient methods for dealing with uncertainties in robotic systems, and in particular in teams of multiple robots. One of the objectives is the development of a scalable cooperative perception system able to reason about the uncertainties associated with the sensing system in the multi-robot platform. The efficient application of online Partially Observable Markov Decision Processes (POMDPs) to this problem will be analyzed. In the project, we aim to demonstrate these methods in real robotic systems for applications such as surveillance.

  • Running period: 2013-2016
  • Funding agency: Andalusian Regional Government

Other Current Projects

EC-SAFEMOBIL: Estimation and control for safe wireless high mobility cooperative industrial systems

Autonomous systems and unmanned aerial vehicles (UAVs) can play an important role in many applications, including disaster management and the monitoring and measurement of events, such as the volcanic ash cloud of April 2010. Currently, many missions cannot be accomplished, or involve a high level of risk for the people involved (pilots and drivers), because unmanned vehicles are either not available or not permitted. This also applies to search and rescue missions, particularly in stormy conditions, where pilots need to risk their lives. These missions could be performed or facilitated by using autonomous helicopters with accurate positioning and the ability to land on mobile platforms such as ship decks. These applications strongly depend on the UAVs' reliability in reacting in a predictable and controllable manner in spite of perturbations, such as wind gusts. On the other hand, the cooperation, coordination, and traffic control of many mobile entities are relevant issues for applications such as the automation of industrial warehousing, surveillance using aerial and ground vehicles, and transportation systems. EC-SAFEMOBIL is devoted to the development of sufficiently accurate common motion estimation and control methods and technologies to reach the levels of reliability and safety needed to facilitate unmanned vehicle deployment in a broad range of applications. It also includes the development of a secure architecture and the middleware to support the implementation. Two different kinds of applications are included in the project:

  • Very accurate coupled motion control of two mobile entities. The technologies will be demonstrated in two challenging applications dealing with the landing on mobile platforms and launching of unmanned aerial vehicles from a manned vehicle.
  • Distributed safe reliable cooperation and coordination of many high mobility entities. The aim is to precisely control hundreds of entities efficiently and reliably and to certify developed techniques to support the exploitation of unmanned platforms in non-restricted areas. This development will be validated in two scenarios: industrial warehousing involving a large number of autonomous vehicles and surveillance also involving many mobile entities.
UPO's team is involved through a subcontract, dealing with the development of multi-vehicle planning under uncertainty and decentralized data fusion algorithms, as well as the control of rotary-wing helicopters for landing (the latter work carried out by my colleague Manuel Bejar).

  • Running period: 2011-2015
  • Funding agency: European Commission, FP7

Some Past Projects


FROG: Fun Robotic Outdoor Guide

FROG proposes to develop a guide robot with a winning personality and behaviours that will engage tourists in a fun exploration of outdoor attractions. The work encompasses innovation in the areas of vision-based detection, robotics design and navigation, human-robot interaction, affective computing, intelligent agent architecture and dependable autonomous outdoor robot operation.

The FROG robot's fun personality and social visitor-guide behaviours aim to enhance the user experience. FROG's behaviours will be designed based on the findings of systematic social-behavioural studies of human interaction with robots. FROG adapts its behaviour to users through vision-based detection of human engagement and interest. Interactive augmented-reality overlay capabilities in the body of the robot will enhance the visitors' experience and increase knowledge transfer, as information is offered through multi-sensory interaction. Gesture detection capabilities further allow visitors to manipulate the augmented-reality interface to explore specific interests.

FROG is a unique project with respect to Human-Robot Interaction in that it considers the development of a robot's personality and behaviours to engage users and optimise the user experience. We will design and develop robot behaviours that complement a guide robot's personality so that users experience and engage with the robot truly as a guide.

I lead WP2 within FROG, dealing with robot localization and social navigation. We are developing algorithms for robust localization and navigation in outdoors scenarios. Furthermore, we are working on models and tools for social navigation.

  • Running period: 2011-2014
  • Funding agency: European Commission, FP7

CONET: Cooperating Objects Network of Excellence

The vision of Cooperating Objects is relatively new and needs to be understood in more detail and extended with inputs from the relevant individual communities that compose it. This will enable us to better understand the impact on the research landscape and to steer the available resources in a meaningful way.

The main goal of CONET is to build a strong community in the area of Cooperating Objects capable of conducting the needed research to achieve the vision of Mark Weiser.

Therefore, the CONET Project Objectives are the following:

  • Create a visible and integrated community of researchers on the topics related to Cooperating Objects capable of driving the domain in the coming years.
  • Identify and raise awareness of industry-relevant issues, and steer academic research efforts towards them without forgetting fundamental scientific issues; make the community more reactive to novel issues and approaches, and coordinate its efforts; establish tight relationships with European industry, leveraging interactions with leading US institutions in the field.
  • Stimulate cooperation between researchers in order to achieve a lasting and sustainable architecture that is able to cope with the vision of Cooperating Objects.
I lead the work of UPO in CONET (Associated Member of the Network). We are working on the research cluster on Mobility of Cooperating Objects, mainly in the tasks of multi-robot planning under uncertainty and aerial objects coordination and control.

  • Running period: 2008-2012
  • Funding agency: European Commission, FP7


URUS: Ubiquitous networking Robotics in Urban Settings

The URUS project focused on designing a network of robots that interact cooperatively with human beings and the environment for tasks of assistance, transportation of goods, and surveillance in urban areas. Specifically, the objective of the project was to design and develop a cognitive network robot architecture that integrates cooperating urban robots, intelligent sensors, intelligent devices, and communications.

The specific technologies developed in the project include: navigation coordination; cooperative perception; cooperative map building; task negotiation; human-robot interaction; and wireless communication strategies between users (mobile phones), the environment (cameras), and the robots. Moreover, to ease these tasks in the urban environment, commercial platforms specifically designed to navigate and assist humans in such urban settings were given autonomous mobility capabilities.

Proof-of-concept tests of the systems developed took place on the UPC campus, a car-free area of Barcelona.

My participation in the project was two-fold. On the one hand, we developed the navigation algorithms for our robot Romeo, allowing it to navigate safely in a pedestrian environment. Some videos and results can be found at

On the other hand, I participated in the development of a decentralized fusion system for collaborative person tracking and estimation using all the elements of the system (robots, camera network and sensor network).

  • Running period: 2006-2009
  • Funding agency: European Commission, FP6


COMETS: Real-Time Coordination and Control of Multiple Heterogeneous Unmanned Aerial Vehicles

The main objective of COMETS is to design and implement a distributed control system for cooperative activities using heterogeneous Unmanned Aerial Vehicles (UAVs). Particularly, both helicopters and airships are included.

The technologies involved in the COMETS project include:

  • Architecture and techniques for real-time coordination and control.
  • Helicopter and airship autonomous control.
  • Cooperative environment perception: detection and monitoring perception tools, cooperative terrain mapping, etc.

A key aspect of this project is experimentation: local UAV experiments and general multi-UAV demonstrations. I was directly involved in the project, where I developed most of my PhD thesis.

  • Running period: 2002-2005
  • Funding agency: European Commission, FP5


SPREAD

SPREAD provides a framework for the development and implementation of an integrated forest fire management system for Europe. It will develop an end-to-end solution with inputs from Earth observation and meteorological data, information on the human dimension of fire risk, and assimilation of these data into fire prevention and fire behaviour models.
It will provide new tools for fire management, in close collaboration with regional and national forest agencies, and new approaches to post-fire landscape management.
The role of our group is the development of vision-based functions for fire detection and monitoring, using visual and infrared images gathered by fixed and aerial cameras.