Projects

 

INSERTION: Inspection and maiNtenance in harSh EnviRonments by mulTI-robot cooperatiON

Subproject: Robust Localization, Mapping and Planning in Harsh Environments (PID2021-127648OB-C31)

Inspection and Maintenance (I&M) robotics is a growing application area with great potential social and economic impact, particularly in hazardous and dangerous environments. Bringing robot automation to these environments will significantly reduce the risks associated with the operation and improve working conditions. Indeed, numerous I&M tasks performed periodically in both industry and service sectors expose workers to serious risks, or are tedious and carried out in harsh conditions: sewers, off-shore Oil&Gas platforms, wind turbines, gas/power transportation tunnels, mines, and industrial storage tanks are some good examples among many others.

While the deployment of autonomous robots for I&M has evolved significantly in the last decade, most of the advances have taken place in scenarios where robot localization can be estimated precisely: scenarios with well-known characteristics and geometry, in which autonomous robot operation is also facilitated. However, there are many I&M tasks in several industry sectors that cannot be tackled with today's robotic technologies due to the harsh environmental conditions involved. A harsh environment can be defined as a scenario that is challenging for agents to operate in; this includes environments that are remote, unknown, cluttered, dynamic, unstructured and limited in visibility. This project focuses on application areas like off-shore platforms and windmills, large cargo ships, or construction sites. These applications are especially challenging due to their environmental conditions, including fog, dust, rain, varying levels of illumination and/or insufficient texture, rough terrain, wind gusts, turbulence, and high degrees of clutter, such as hanging cables, moving obstacles and human workers.

In addition to dealing with harsh environments from the robot perception and control points of view, automating I&M tasks in these kinds of application scenarios requires a heterogeneous team of robots to cope with such complex and diverse environments: UGVs for floor-level tasks; UAVs for vertical, under-platform and facade inspection tasks; and USVs for sea-level inspection and team support. Furthermore, some tasks will require tight cooperation among robots, like UAVs taking off from and landing on USVs, or UAVs tethered to UGVs/USVs to perform long-term UAV I&M tasks. Thus, the main goal of the project is to advance the state of the art of the technologies required for the safe operation of teams of robots, comprising UAVs, UGVs and USVs, for I&M in harsh environments. The project will develop new techniques for localization in such low-visibility scenarios; navigation in these complex, cluttered environments; manipulation/intervention; cooperation between UAVs, UGVs and USVs; and control strategies for safe operation of UAVs and USVs in complex environments.

To this end, the project will coordinate three sub-projects and will exploit the synergies between the expertise on robot localization, mapping and planning of the Service Robotics Laboratory (SRL) of the Universidad Pablo de Olavide (UPO, coordinator); the expertise on UAV control and perception techniques of the Computer Vision and Aerial Robotics group (CVAR) of the Universidad Politécnica de Madrid (UPM); and the expertise on USV and robot coordination of the “Ingeniería de Sistemas, Control, Automatización y Robótica” (ISCAR) group of the Universidad Complutense de Madrid (UCM).

  • Running period: 2022-2025
  • Funding agency: Spanish Ministry of Science, Innovation and Universities

NHoA: Never Home Alone

The long-term vision of the Never Home Alone (NHoA) project is the development of a robotic system that helps elderly people to live independently in their homes and prevents loneliness and isolation. Loneliness is a direct consequence of the demographic shift towards an older population, and negatively affects the well-being and mental health of the elderly, leading to an increasing incidence of depression and social exclusion. Inspired by the scientific experience of caregivers, we aim to develop robots that mitigate loneliness by encouraging contact within a network of related people, emulating group therapies.
NHoA brings adequate multi-disciplinary expertise to tackle this challenge. The consortium combines partners with expertise in digital health and elder care, scientists involved in social, affective, and human-aware robotics, and two companies with business and research experience in the use of robots for social and medical care.
The developments will be rooted in an i) iterative co-design process involving the healthcare partners. Then, we will push the state of the art on key robot capabilities that are still at low readiness levels. In particular, we hypothesize that social intelligence is key to success in terms of robustness and adaptability. Therefore, the project objectives aim to produce ii) novel multi-modal emotion recognition methods for affective interaction; iii) robust contextual and semantic scene understanding in homes; and iv) adaptive, legible, and emotionally expressive behaviours, considering both tabletop and mobile robots. The robot system will incorporate v) novel cloud-based health monitoring and behaviour change recognition that interacts with the healthcare professionals. All these elements will be integrated by vi) a socially situated decision-making component to turn the robots into embodied social actors that sense the social and emotional environment and proactively intervene to build an affective relationship with the user.

  • Running period: 2021-2024
  • Funding agency: Ministry of Science, Strategic Lines Projects
 

Research and Development for a Social Tabletop Robot

In the project, we are contributing to the development of the ROS-based supporting software architecture for a social tabletop robot. This architecture integrates robot drivers and sensor interfacing, motion control and high-level decision making, multi-modal perception for social interaction, dialogue and non-verbal communication, and memory for long-term interaction. As part of the project, we are also carrying out research on social motion planning and control, including motion generation for robot expressivity, transferring motion from animation, social robot decision making, and the application of the robot to different scenarios.
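As an illustration of how the pieces of such an architecture are typically wired together, the following minimal ROS node sketch (ours, with hypothetical topic names and behaviour vocabulary, not the project's actual code) shows how a high-level behaviour decision can be turned into low-level motion commands:

    #!/usr/bin/env python
    # Minimal sketch of one node in a ROS-based social robot architecture.
    # Topic names and the behaviour vocabulary are hypothetical examples.
    import rospy
    from std_msgs.msg import String
    from geometry_msgs.msg import Twist

    class ExpressiveMotionNode:
        def __init__(self):
            # High-level decision making publishes the active behaviour;
            # this node translates it into motion commands.
            self.cmd_pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
            rospy.Subscriber('behaviour', String, self.on_behaviour)

        def on_behaviour(self, msg):
            cmd = Twist()
            if msg.data == 'greet':
                cmd.angular.z = 0.5  # small expressive turn towards the user
            self.cmd_pub.publish(cmd)

    if __name__ == '__main__':
        rospy.init_node('expressive_motion')
        ExpressiveMotionNode()
        rospy.spin()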

  • Running period: 2017-2023
  • Funding agency: Honda Research Institute Japan
 

DeepBot: Novel Perception and Navigation Techniques for Service Robots based on Deep Learning

The last decade has seen unprecedented advances in the development of machine learning techniques with the advent of so-called deep networks. The enormous potential of deep learning has not gone unnoticed in the field of robotics and, thus, it has recently been successfully applied, especially to environment perception tasks.
However, deep learning is not exempt from problems in its application to robotics: the learning process requires large amounts of data and/or interactions to converge to generalizable solutions, which in many cases are not feasible to obtain. Furthermore, in most cases, the available data does not cover the environments and conditions of robotic systems. Additionally, the approaches must consider both the dynamic and computational constraints to which robotic systems are subject. In this sense, there are two promising lines that allow addressing these problems. On the one hand, the use of self-supervised approaches significantly reduces the labeling effort. On the other hand, new model-based differentiable layers, capable of carrying out analytical operations known a priori, also reduce the requirements for labeled data and the training time thanks to the reduction of trainable parameters, while explicitly considering system constraints.
The DeepBot project moves in this direction. The project pursues the idea of obtaining mixed solutions that combine the best of the classical techniques for localization, planning and people perception with the latest advances in deep learning, to create solutions that go beyond the current state of the art in terms of precision, speed or generality, thus giving rise to new possible applications for service robots.
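As a toy illustration of the second line above, the following sketch (our own, not the project's code) shows a model-based differentiable layer in PyTorch: the analytic form of a 2D rigid-body transform is fixed a priori, leaving only three pose parameters to train, e.g. for self-supervised scan alignment without labels:

    # Illustrative sketch of a model-based differentiable layer: a 2D
    # rigid-body transform whose analytic form is known a priori, with
    # only the pose (3 parameters) left trainable.
    import torch
    import torch.nn as nn

    class Rigid2DLayer(nn.Module):
        """Applies a differentiable SE(2) transform to a batch of 2D points."""
        def __init__(self):
            super().__init__()
            self.theta = nn.Parameter(torch.zeros(1))   # rotation
            self.t = nn.Parameter(torch.zeros(2))       # translation

        def forward(self, pts):                         # pts: (N, 2)
            c, s = torch.cos(self.theta), torch.sin(self.theta)
            R = torch.stack([torch.cat([c, -s]), torch.cat([s, c])])
            return pts @ R.T + self.t

    # Fitting the 3 pose parameters by gradient descent needs no labels:
    layer = Rigid2DLayer()
    src = torch.randn(100, 2)
    tgt = src @ torch.tensor([[0.0, -1.0], [1.0, 0.0]]) + torch.tensor([1.0, 2.0])
    opt = torch.optim.Adam(layer.parameters(), lr=0.1)
    for _ in range(200):
        opt.zero_grad()
        loss = ((layer(src) - tgt) ** 2).mean()  # self-supervised alignment loss
        loss.backward()
        opt.step()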

  • Running period: 2021-2023
  • Funding agency: Andalusian Regional Government
 

TELEPoRTA: Machine Learning Techniques for Assistive Telepresence Robots

The project focuses on the development of new technologies for telepresence robots based on machine learning, with special emphasis on assistive robotics applications. The main objective is to develop a semi-autonomous telepresence robot that allows human-aware and natural interaction with people, as well as methods that enable cognitively impaired people to interact through telepresence, based on automatic scene description and image captioning. In this way, the project considers the following subgoals: 1) new machine learning techniques for social robot navigation; 2) new cognitive computing techniques for the perception of social robots; 3) development of new semi-autonomous capabilities for telepresence assistive robots.

  • Running period: 2020-2022
  • Funding agency: Andalusian Regional Government
 

Collaboration with the Joint Research Centre, EU Commission, HUMAINT project

The HUMAINT project aims to provide a multidisciplinary understanding of the state of the art and future evolution of machine intelligence and its potential impact on human behaviour, with a focus on cognitive and socio-emotional capabilities and decision making. The project has three main goals: a) Advance the scientific understanding of machine and human intelligence; b) study the impact of algorithms on humans, focusing on cognitive and socio-emotional development and decision making; c) provide insights to policy makers with respect to the previous issues.

Our group collaborates with the core team in HUMAINT on robotics-related technologies, and in studies on child-robot interaction.

  • Running period: 2018-2023
  • Funding agency: European Commission
 

COMplex Coordinated Inspection and SEcurity missions by UAVs in cooperation with UGVs (COMCISE)

The COMCISE project proposes a smart multi-robot system for inspection in GPS-denied areas, composed of a ground robot with ample computation and battery capacity, and a detachable tethered flying robot equipped with sensors for inspection and navigation. The main hypothesis is that such a system offers the main benefits of both platforms: long inspection cycles and high manoeuvrability. UPO's role is centered on cooperative planning, perception and navigation in the coordinated UAV-UGV system.
The specific objectives are:

  • Precise multi-robot cooperative estimation, localization and mapping. The objective is to localize the UGV+UAV system in GPS-denied areas based on the local sensors installed in both vehicles. The approaches will take advantage of the synergies of both systems, and also of the chance to estimate the relative position of one robot with respect to the other.
  • Multi-robot motion planning and control for tethered and untethered configurations. The objective is to develop new methods for motion planning and reactive collision avoidance for the whole UGV-UAV system, taking into account the restrictions induced by the tether, communications, sensor payloads, etc. (see the sketch below).
  • Cooperative planning and perception for efficient inspection applications. The objective is to develop new active sensing techniques for multi-robot perception tasks, able to handle uncertainties and constraints and to improve the perception results.
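To make the tether restriction concrete, here is a minimal feasibility check a planner could run on each candidate UAV pose; it is our illustrative sketch under the assumption of a straight tether and spherical obstacles, not COMCISE code:

    # Sketch: a candidate UAV pose is valid only if the straight tether to
    # the UGV is short enough and clears known obstacles (spheres here).
    import numpy as np

    def tether_feasible(ugv, uav, obstacles, max_len, clearance=0.3):
        """ugv, uav: 3D points; obstacles: list of (center, radius)."""
        seg = uav - ugv
        length = np.linalg.norm(seg)
        if length > max_len:
            return False
        for center, radius in obstacles:
            # Distance from the obstacle center to the tether segment.
            t = np.clip(np.dot(center - ugv, seg) / (length**2 + 1e-9), 0.0, 1.0)
            closest = ugv + t * seg
            if np.linalg.norm(center - closest) < radius + clearance:
                return False
        return True

    # Example: this sample is rejected because the tether would pass
    # too close to the obstacle at (1, 0, 1).
    print(tether_feasible(np.zeros(3), np.array([2.0, 0.0, 2.0]),
                          [(np.array([1.0, 0.0, 1.0]), 0.5)], max_len=5.0))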

  • Running period: 2019-2021
  • Funding agency: Spanish Ministry of Science, Innovation and Universities
 

NIx: ATEX Certifiable Navigation Module for Ground Robotic Inspection (ESMERA)

The NIx project proposes an ATEX-certifiable navigation module for ground robotic inspection. The navigation module will be developed so that it can be easily integrated into different robot platforms. NIx development is divided into two stages. Phase I will focus on the implementation and validation in real experiments of the whole localization and navigation stack. Phase II will focus on the industrialization of the localization and navigation system in accordance with the ATEX certification requirements, and on the usability of the system.

  • Running period: 2019-2020
  • Funding agency: European Commission, H2020
 

MBZIRC: Mohamed Bin Zayed International Robotics Challenge

Our lab will participate in the competition as part of the CVAR-UPM / SRLab-UPO / UAVRG-PUT team, with our colleagues from the Universidad Politécnica de Madrid and Poznan University of Technology.

  • Running period: 2018-2020
  • Funding agency: Khalifa University
 

ARCO: Autonomous Robot CO-worker (HORSE)

The objective of this project is the development and integration of a human-robot co-working system for warehouse picking applied to production line feeding. The proposed system will bring out the best of both: the extreme flexibility and adaptability of humans, and the safety and control of ground robots. This technology is particularly interesting for small and medium factories, where fully robotized warehouses are unaffordable or overkill, while still optimizing the picking task with respect to manual operation.
The project is based on two main pillars to reach its objective: i) robust person detection and tracking for safe robot navigation: this new HORSE software component for detection and tracking of workers in the factory will integrate ultra-wide-band localization systems, image-based people detection and LIDAR-based people detection to build resilient software that exploits the synergies between the different sensor modalities (a sketch of the underlying fusion idea follows); ii) increased AGV autonomy to perform autonomous navigation with people awareness in dynamic/changing scenarios.
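The fusion idea behind pillar i) can be sketched as follows; the measurements and noise levels are invented for illustration, and the real HORSE component is considerably more elaborate:

    # Sketch: each sensor modality updates a common Kalman estimate of a
    # worker's 2D position, weighted by its own measurement noise.
    import numpy as np

    def kf_update(x, P, z, R):
        """Position-only Kalman update (measurement model H = I)."""
        K = P @ np.linalg.inv(P + R)            # Kalman gain
        return x + K @ (z - x), (np.eye(2) - K) @ P

    x, P = np.zeros(2), np.eye(2) * 4.0         # prior over (x, y)
    # UWB gives an accurate fix; camera and LIDAR detections are noisier.
    for z, sigma in [(np.array([1.0, 2.1]), 0.1),   # ultra-wide-band
                     (np.array([1.2, 1.9]), 0.5),   # image-based detector
                     (np.array([0.9, 2.0]), 0.3)]:  # LIDAR-based detector
        x, P = kf_update(x, P, z, np.eye(2) * sigma**2)
    print(x, np.diag(P))   # fused position and remaining uncertainty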

  • Running period: 2018-2019
  • Funding agency: European Commission, H2020

SIAR: Sewer Inspection Autonomous Robot (ECHORD++)

The SIAR project will develop a fully autonomous ground robot able to navigate and inspect the sewage system with minimal human intervention, while retaining the possibility of manually controlling the vehicle or the sensor payload when required.
The project uses IDMind's robot platform RaposaNG as its starting point. A new robot will be built based on this know-how, with the following key steps beyond the state of the art required to properly address the challenge: a robust IP67 robot frame designed to work in the hardest environmental conditions, with increased power autonomy and flexible inspection capabilities; robust and increased communication capabilities; onboard autonomous navigation and inspection capabilities; and usability and cost-effectiveness of the developed solution. UPO leads the navigation tasks in the project.

  • Running period: 2016-2018
  • Funding agency: European Commission, FP7

TERESA: Telepresence Reinforcement-Learning Social Agent

TERESA aims to develop a telepresence robot of unprecedented social intelligence, thereby helping to pave the way for the deployment of robots in settings such as homes, schools, and hospitals that require substantial human interaction. In telepresence systems, a human controller remotely interacts with people by guiding a remotely located robot, allowing the controller to be more physically present than with standard teleconferencing. We will develop a new telepresence system that frees the controller from low-level decisions regarding navigation and body pose in social settings. Instead, TERESA will have the social intelligence to perform these functions automatically.

The project's main result will be a new partially autonomous telepresence system with the capacity to make socially intelligent low-level decisions for the controller. Sometimes this requires mimicking the human controller (e.g., nodding the head) by translating human behavior to a form suitable for a robot. Other times, it requires generating novel behaviors (e.g., turning to look at the speaker) expected of a mobile robot but not exhibited by a stationary controller. TERESA will semi-autonomously navigate among groups, maintain face-to-face contact during conversations, and display appropriate body-pose behavior.

Achieving these goals requires advancing the state of the art in cognitive robotic systems. The project will not only generate new insights into socially normative robot behavior; it will also produce new algorithms for interpreting social behavior, navigating in human-inhabited environments, and controlling body poses in a socially intelligent way.

The project culminates in the deployment of TERESA in an elderly day center. Because such day centers are a primary social outlet, many people become isolated when they cannot travel to them, e.g., due to illness. TERESA’s socially intelligent telepresence capabilities will enable them to continue social participation remotely.

UPO leads WP4 within TERESA, “Socially Navigating the Environment”, which deals with the development of the navigation stack (including localization, path planning and execution) required to enable enhanced, socially normative, autonomous navigation of the telepresence robot.
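A common way to encode socially normative navigation, and a reasonable sketch of the flavour of costs used in such a stack (ours, not the actual TERESA implementation), is an asymmetric proxemics cost around each detected person that the planner adds to its obstacle costmap:

    # Sketch: asymmetric Gaussian "social cost" around a person, larger in
    # front, so planned paths keep a comfortable distance and avoid
    # cutting across the person's heading.
    import numpy as np

    def social_cost(cell, person_xy, person_heading, a_front=2.0, a_side=1.0):
        d = np.asarray(cell, float) - np.asarray(person_xy, float)
        # Rotate into the person's frame (x axis = heading direction).
        c, s = np.cos(person_heading), np.sin(person_heading)
        fx, fy = c * d[0] + s * d[1], -s * d[0] + c * d[1]
        sigma_x = a_front if fx > 0 else a_side   # wider lobe in front
        return np.exp(-(fx**2 / (2 * sigma_x**2) + fy**2 / (2 * a_side**2)))

    # A grid planner would add this to its obstacle cost for every person:
    print(social_cost((1.0, 0.0), (0.0, 0.0), 0.0))   # in front: higher cost
    print(social_cost((-1.0, 0.0), (0.0, 0.0), 0.0))  # behind: lower cost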

  • Running period: 2013-2016
  • Funding agency: European Commission, FP7

OCELLIMAV: New sensing and navigation systems based on Drosophila's OCELLI for Micro Aerial Vehicles

Micro unmanned Aerial Vehicles (MAVs) may open up a new plethora of applications for aerial robotics, both in indoor and outdoor scenarios. However, the limited payload of these vehicles restricts the sensors and processing power that MAVs can carry, and thus the level of autonomy they can achieve without relying on external sensing and processing. Flying insects like Drosophila, on the other hand, can carry out impressive maneuvers with a relatively small neural system. This project will explore the biological fundamentals of Drosophila's ocelli sensory-motor system, one of the mechanisms most likely related to fly stabilization, and the possibility of deriving new sensing and navigation systems for MAVs from it.

The project will do so by a complete reverse engineering of the ocelli system, estimating the structure and functionality of its neural processing network, and then modeling it through the interaction of biological and engineering research. Novel genetic-based neural tracing methods will be employed to extract the topology of the neural network, and behavioral experiments will be devised to determine the functionalities of specific neurons. The findings will be used to derive a model, and this model will be used to characterize the relevant aspects from the point of view of estimation and control. The model will also serve to determine the adaptation of this sensory-motor system to current MAV platforms, and to design a proof-of-concept sensing and navigation system.

The expected results of the project are two-fold: to corroborate or challenge current assumptions on the use of ocelli in flies, and to create a proof-of-concept device for a MAV system based on the findings of the project.
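As a heavily simplified engineering illustration of the ocelli principle (our toy model under invented assumptions, not a project result), intensity differences between ocelli-like photodiodes can serve as a fast roll cue:

    # Toy sketch: two ocelli-like photodiodes looking up-left and up-right
    # see different sky/ground brightness as the body rolls, so their
    # intensity difference gives a fast attitude cue. The linear model and
    # gain k are illustrative assumptions only.
    def roll_from_ocelli(left, right, k=0.5):
        """Rough roll cue (rad) from left/right ocellus intensities."""
        return k * (left - right) / (left + right + 1e-9)

    # Level flight: equal intensities -> near-zero roll cue.
    print(roll_from_ocelli(0.8, 0.8))
    # Rolled right: right ocellus sees the darker ground -> positive cue.
    print(roll_from_ocelli(0.9, 0.6))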

  • Running period: 2015-2018
  • Funding agency: Spanish Ministry of Economy

PAIS-MultiRobot: Perception and Action under Uncertainties in Multi-Robot Systems

The project's main objective is the development of efficient methods for dealing with uncertainties in robotic systems and, in particular, in teams of multiple robots. One of the objectives is the development of a scalable cooperative perception system able to reason about the uncertainties associated with the sensing system in the multi-robot platform. The efficient application of online Partially Observable Markov Decision Processes (POMDPs) to this problem will be analyzed.

In the project, we aim to demonstrate the developed methods in real robotic systems for applications like surveillance.
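For reference, the core POMDP machinery mentioned above reduces to a discrete Bayes belief update over hidden states; the following sketch uses toy placeholder models rather than anything from the project:

    # Sketch: belief update over hidden states after action a and
    # observation o, the basic step inside any online POMDP solver.
    import numpy as np

    def belief_update(b, T, O, a, o):
        """b: belief over states; T[a]: transition matrix; O[a]: P(o|s')."""
        predicted = T[a].T @ b            # prediction step
        updated = O[a][:, o] * predicted  # correction with observation o
        return updated / updated.sum()

    # Two states (target in room 0 or 1), one 'sense' action, noisy sensor.
    T = {'sense': np.eye(2)}
    O = {'sense': np.array([[0.8, 0.2],    # P(o | s' = 0)
                            [0.3, 0.7]])}  # P(o | s' = 1)
    b = np.array([0.5, 0.5])
    b = belief_update(b, T, O, 'sense', o=0)
    print(b)   # belief shifts towards state 0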

  • Running period: 2013-2016
  • Funding agency: Andalusian Regional Government

EC-SAFEMOBIL: Estimation and control for safe wireless high mobility cooperative industrial systems

Autonomous systems and unmanned aerial vehicles (UAVs) can play an important role in many applications, including disaster management and the monitoring and measurement of events such as the volcano ash cloud of April 2010. Currently, many missions cannot be accomplished or involve a high level of risk for the people involved (pilots and drivers), as unmanned vehicles are not available or not permitted. This also applies to search and rescue missions, particularly in stormy conditions, where pilots need to risk their lives. These missions could be performed or facilitated by using autonomous helicopters with accurate positioning and the ability to land on mobile platforms such as ship decks. These applications strongly depend on the UAV's reliability in reacting in a predictable and controllable manner in spite of perturbations such as wind gusts. On the other hand, the cooperation, coordination and traffic control of many mobile entities are relevant issues for applications such as the automation of industrial warehousing, surveillance by aerial and ground vehicles, and transportation systems. EC-SAFEMOBIL is devoted to the development of sufficiently accurate common motion estimation and control methods and technologies to reach the levels of reliability and safety that will facilitate unmanned vehicle deployment in a broad range of applications. It also includes the development of a secure architecture and the middleware to support the implementation. Two different kinds of applications are included in the project:

  • Very accurate coupled motion control of two mobile entities. The technologies will be demonstrated in two challenging applications dealing with the landing on mobile platforms and launching of unmanned aerial vehicles from a manned vehicle.
  • Distributed, safe and reliable cooperation and coordination of many high-mobility entities. The aim is to precisely control hundreds of entities efficiently and reliably, and to certify the developed techniques to support the exploitation of unmanned platforms in non-restricted areas. This development will be validated in two scenarios: industrial warehousing involving a large number of autonomous vehicles, and surveillance also involving many mobile entities.

UPO’s team is involved through a subcontract, dealing with the development of multi-vehicle planning under uncertainties and decentralized data fusion algorithms, led by Luis Merino. In addition, we developed novel strategies for the automatic control of rotary-wing helicopters in landing operations, work led by Manuel Béjar.
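One ingredient of landing on a mobile platform is predicting where the deck will be at touchdown. The following sketch (an illustrative assumption on our part, not project code) propagates a constant-velocity model of the deck so a controller can aim at its future position:

    # Sketch: predict the deck's future position with a constant-velocity
    # model; state is [x, y, vx, vy] and dt is the time step.
    import numpy as np

    dt = 0.1
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)

    def predict_deck(state, horizon_steps):
        """Propagate the deck state 'horizon_steps' time steps ahead."""
        for _ in range(horizon_steps):
            state = F @ state
        return state[:2]   # predicted (x, y) of the deck

    deck = np.array([0.0, 0.0, 1.0, 0.2])   # moving at ~1 m/s forward
    print(predict_deck(deck, 30))           # touchdown point ~3 s ahead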

  • Running period: 2011-2015
  • Funding agency: European Commission, FP7

FROG: Fun Robotic Outdoor Guide

FROG proposes to develop a guide robot with a winning personality and behaviours that will engage tourists in a fun exploration of outdoor attractions. The work encompasses innovation in the areas of vision-based detection, robotics design and navigation, human-robot interaction, affective computing, intelligent agent architecture, and dependable autonomous outdoor robot operation.

The FROG robot's fun personality and social visitor-guide behaviours aim to enhance the user experience. FROG's behaviours will be designed based on the findings of systematic social behavioural studies of human interaction with robots. FROG adapts its behaviour to the users through vision-based detection of human engagement and interest. Interactive augmented reality overlay capabilities in the body of the robot will enhance the visitors' experience and increase knowledge transfer, as information is offered through multi-sensory interaction. Gesture detection capabilities further allow the visitors to manipulate the augmented reality interface to explore specific interests.

This is a unique project with respect to Human-Robot Interaction in that it considers the development of a robot's personality and behaviours to engage the users and optimise the user experience. We will design and develop those robot behaviours that complement a guide robot's personality, so that users experience and engage with the robot truly as a guide.

We lead WP2 within FROG, dealing with robot localization and social navigation. We are developing algorithms for robust localization and navigation in outdoor scenarios. Furthermore, we are working on models and tools for social navigation.

  • Running period: 2011-2014
  • Funding agency: European Commission, FP7

CONET: Cooperating Objects Network of Excellence

The vision of Cooperating Objects is relatively new and needs to be understood in more detail and extended with inputs from the relevant individual communities that compose it. This will enable us to better understand the impact on the research landscape and to steer the available resources in a meaningful way.

The main goal of CONET is to build a strong community in the area of Cooperating Objects, capable of conducting the research needed to achieve the vision of Mark Weiser.

Therefore, the CONET Project Objectives are the following:

  • Create a visible and integrated community of researchers on the topics related to Cooperating Objects capable of driving the domain in the coming years.
  • Identify, raise awareness of, and steer academic research efforts towards industry-relevant issues without forgetting fundamental scientific issues; make the community more reactive to novel issues and approaches, and coordinate its efforts; establish tight relationships with European industry, leveraging interactions with leading US institutions in the field.
  • Stimulate cooperation between researchers in order to achieve a lasting and sustainable architecture that is able to cope with the vision of Cooperating Objects.

Within this project, we are working on the research cluster on Mobility of Cooperating Objects, mainly in the tasks of multi-robot planning under uncertainty and aerial object coordination and control.

  • Running period: 2008-2012
  • Funding agency: European Commission, FP7

URUS: Ubiquitous networking Robotics in Urban Settings

The URUS project focused on designing a network of robots that cooperatively interact with human beings and the environment for tasks of assistance, transportation of goods, and surveillance in urban areas. Specifically, the objective of the project was to design and develop a cognitive network robot architecture that integrates cooperating urban robots, intelligent sensors, intelligent devices and communications.

Among the specific technologies developed in the project are: navigation coordination; cooperative perception; cooperative map building; task negotiation; human-robot interaction; and wireless communication strategies between users (mobile phones), the environment (cameras), and the robots. Moreover, to facilitate the tasks in the urban environment, commercial platforms specifically designed to navigate and assist humans in such urban settings were given autonomous mobility capabilities.

Proof-of-concept tests of the developed systems took place on the UPC campus, a car-free area of Barcelona.

The participation of UPO in the project was two-fold. On the one hand, we developed the navigation algorithms for our robot Romeo, allowing it to navigate safely in a pedestrian environment.

On the other hand, we participated in the development of a decentralized fusion system for collaborative person tracking and estimation using all the elements of the system (robots, camera network and sensor network).
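The essence of such decentralized fusion can be sketched in information form, where fusing two estimates amounts to adding their information vectors and matrices; this toy example assumes independent estimates, whereas a real system must also account for common information between nodes:

    # Sketch: two nodes (e.g. a robot and the camera network) exchange
    # estimates of a tracked person in information form; fusion is a sum.
    import numpy as np

    def to_info(x, P):
        """Convert mean/covariance to information vector/matrix."""
        Y = np.linalg.inv(P)
        return Y @ x, Y

    def fuse(y1, Y1, y2, Y2):
        """Fuse two independent estimates; returns mean and covariance."""
        Y = Y1 + Y2
        P = np.linalg.inv(Y)
        return P @ (y1 + y2), P

    robot = to_info(np.array([2.0, 3.1]), np.eye(2) * 0.5)
    camera = to_info(np.array([2.2, 2.9]), np.eye(2) * 0.2)
    x, P = fuse(*robot, *camera)
    print(x, np.diag(P))   # fused track, tighter than either source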

  • Running period: 2006-2009
  • Funding agency: European Commission, FP6

COMETS: Real-Time Coordination and Control of Multiple Heterogeneous Unmanned Aerial Vehicles

The main objective of COMETS is to design and implement a distributed control system for cooperative activities using heterogeneous Unmanned Aerial Vehicles (UAVs); in particular, both helicopters and airships are included. Technologies involved in the COMETS project:

  • Architecture and techniques for real-time coordination and control.
  • Helicopter and airship autonomous control.
  • Cooperative environment perception: detection and monitoring perception tools, cooperative terrain mapping, etc.

A key aspect in this project is the experimentation: local UAV experiments and general multi-UAV demonstrations.

Fernando Caballero and Luis Merino were directly involved in the project.

  • Running period: 2002-2005
  • Funding agency: European Commission, FP5