You are here
You and your family are standing in the middle of a crowded foreign airport. You are late for your connecting flight and are staring at the endless rows of gate numbers on a monitor, when a tall robot with an E.T.-like head approaches and invites you to scan your boarding card on its chest. The robot then asks you to follow it as it guides you and your family through the airport halls to the correct gate.
You have just met SPENCER, a prototype that is the result of three years of research by an EU-funded consortium1 including the CNRS and several partners from Germany, Sweden, Switzerland, and the Netherlands.2 The main objective of the “Social situation-aware perception and action for cognitive robots” (SPENCER) project is just that: to develop a robot that helps passengers in transit make their connecting flights on time. Two companies are part of the project: BlueBotics, a guide robot developer, and the Dutch airline KLM, which tested SPENCER with real passengers last month at Amsterdam’s Schiphol airport, Europe’s fifth busiest airport with more than 58 million travelers in 2015.
Partnering up
Each research partner was assigned specific technologies to develop for the project. BlueBotics had to create a robot that met several constraints, including autonomy, since the distance between gates can exceed 800 meters. Researchers from the University of Freiburg, who coordinated the entire project, focused on how the robot could perceive human beings, and even detect groups, using lasers and several Kinect cameras. The team from Örebro University focused on mapping software, which is particularly challenging in a dynamic environment where many people are moving about, constantly reconfiguring the space around the robot.
The CNRS was responsible for the decisional process (the so-called “supervisor” software that controls a robot) and for movement. “And this decisional process is essential for avoiding obstacles,” explains CNRS project coordinator Rachid Alami from the LAAS,3 who also worked with Isir4 researchers Raja Chatila and Mohamed Chetouani. “There was a real and new challenge in dealing with human beings. You don’t avoid a person the same way you avoid an inanimate object like a garbage can.” For this, the team developed a technology that detects people and determines in which direction they are moving and where they are looking, to help the robot stay at a given distance. “This is what we called Human Aware Motion Planning. When we approach someone, we want to make sure that person can see us, otherwise we may scare them,” explains Alami. SPENCER therefore tries to keep its distance, but if it needs to approach someone, it slows down and tries to place itself in their field of vision. The researchers also took this into account when planning trajectories, optimizing them not for distance or energy savings, but to cause as little discomfort as possible to the people around the robot.
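To make the idea more concrete, here is a deliberately simplified sketch of what such a human-aware path cost could look like: a candidate trajectory is scored not only by its length, but also by how much it intrudes on people’s personal space or approaches them from outside their field of view. The cost model, comfort radius, and weights below are illustrative assumptions for this article, not the actual SPENCER planner.

```python
import math

def discomfort(point, person):
    """Discomfort caused at one path point by one person (x, y, heading in rad)."""
    px, py, heading = person
    dx, dy = point[0] - px, point[1] - py
    dist = math.hypot(dx, dy)
    # Proxemics: cost grows sharply inside an assumed ~1.2 m comfort radius.
    proximity = max(0.0, 1.2 - dist) ** 2
    # Visibility: being close while outside the person's field of view costs extra.
    angle_to_point = math.atan2(dy, dx)
    delta = abs((angle_to_point - heading + math.pi) % (2 * math.pi) - math.pi)
    visibility = 0.5 if delta > math.pi / 2 and dist < 3.0 else 0.0
    return proximity + visibility

def path_cost(path, people, w_length=1.0, w_human=5.0):
    """Score a candidate path: its length plus the accumulated human discomfort."""
    length = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    human = sum(discomfort(pt, person) for pt in path for person in people)
    return w_length * length + w_human * human

# Example: a person stands at (2, 0) facing along +x; the detour that keeps
# its distance scores better than the path that skims past the person.
person = (2.0, 0.0, 0.0)
direct = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.5), (3.0, 0.0)]
detour = [(0.0, 0.0), (1.0, 1.5), (2.0, 1.5), (3.0, 0.0)]
print(path_cost(direct, [person]), path_cost(detour, [person]))
```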
Details in the signals
“What is also novel is that we were able to create a robot that adapts its speed to the people being guided, slowing down or speeding up,” says Alami, stressing that if SPENCER knows that it is running late for the gate, it will slightly increase its speed to nudge the passenger to do the same. Yet one of the most surprising aspects has to do with SPENCER’s head. “In the same way humans do, we used the robot’s head movements to transmit relevant information.” Not only does SPENCER turn its head when anticipating a turn in a specific direction, much like a car’s turn signal, but it also looks straight at people who are close by or walking towards it, to communicate that they have been seen. “The robot has a lot of on-board cameras, so it doesn’t have to turn its head at all. This is really for us humans, to let us know that it has seen us and will avoid us, or stop...” adds Alami.
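The speed-adaptation behavior can be pictured as a simple trade-off between the flight schedule and how far behind the guided passengers have fallen. The sketch below is a hypothetical illustration of that trade-off; the thresholds, gains, and speed limits are assumptions made for this example, not SPENCER’s actual controller.

```python
def guide_speed(follower_gap_m, dist_to_gate_m, time_left_s,
                v_nominal=1.0, v_max=1.4, v_min=0.5):
    """Return a target guiding speed in m/s (all thresholds are illustrative)."""
    # Minimum average speed needed to reach the gate on time.
    required = dist_to_gate_m / max(time_left_s, 1.0)
    v = max(v_nominal, required)          # nudge the pace if running late
    if follower_gap_m > 3.0:              # passengers are dropping far behind
        v *= 0.6                          # slow down so they can catch up
    elif follower_gap_m > 2.0:
        v *= 0.85
    return min(max(v, v_min), v_max)      # keep the speed within safe bounds

# Example: 400 m to go, 5 minutes left, passengers trailing 2.5 m behind.
print(round(guide_speed(2.5, 400.0, 300.0), 2))
```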
Learning by example
The last aspect of the team’s research drew on know-how from the Isir to teach SPENCER how to approach a group of people. “An individual is easy, but a group is quite complex. Where does a group start, where does it end in the midst of a crowd? How can I interact with it, how can I physically enter that group and become part of it?” explains Mohamed Chetouani, who helped collect data and model human behavior using motion capture, in order to teach the robot this crucial aspect of human interaction. “This is too complex to be written as rules, and this is why robots need to be able to learn it,” he adds.
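The group model itself was learned from motion-capture data rather than hand-written, so no short snippet can reproduce it. As a very rough stand-in, the sketch below only shows one naive way to even pose the “where does a group start?” question: cluster tracked people who stand within an assumed 1.5-meter gap of one another. The threshold and the greedy clustering rule are illustrative assumptions, not the project’s learned model.

```python
import math

def group_people(positions, max_gap=1.5):
    """Greedy clustering: a person joins any group with a member closer than max_gap."""
    groups = []
    for p in positions:
        joined = None
        for g in groups:
            if any(math.dist(p, q) <= max_gap for q in g):
                if joined is None:
                    g.append(p)
                    joined = g
                else:
                    joined.extend(g)   # p bridges two groups: merge them
                    g.clear()
        if joined is None:
            groups.append([p])
    return [g for g in groups if g]

# Example: two people chatting near the gate, one person standing apart.
print(group_people([(0.0, 0.0), (1.0, 0.2), (5.0, 5.0)]))
```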
One of the greatest challenges of a project of this scale was integrating the specific technological ‘bricks’ developed by each partner, something that was done at the Toulouse-based LAAS. An experimental room was set up at the laboratory to welcome all the partners from their respective countries, on several occasions, giving them the opportunity to interact and see how their modules worked together. “You eventually need global intelligence of the entire system, and at the LAAS we had the means and the expertise to complete the software integration for the robot,” Alami concludes.
Boarding now
As the travel industry warms up to using robots (from hotel reception desks to information guides on boat cruises), SPENCER will not be alone when—following a final green light from the project’s managers—it is eventually commercialized and finds its way to an airport, train station, or mall near you.
- 1. This FP7 project is funded to the tune of €4 million. http://www.spencer.eu
- 2. Albert-Ludwigs-Universität Freiburg, Technische Universität München, Rheinisch-Westfälische Technische Hochschule Aachen (Germany); Örebro University (Sweden); CNRS (France); KLM Royal Dutch Airlines, Universiteit Twente Enschede (Netherlands); BlueBotics SA (Switzerland).
- 3. Laboratoire d'analyse et d'architecture des systèmes (CNRS).
- 4. Institut des systèmes intelligents et de robotique (UPMC / CNRS).