Jumping

For years, ETH Zurich has been researching solutions in the field of robotics and intelligent systems capable of autonomous action in enclosed spaces. The university even has a specialized Autonomous Systems Lab for this purpose. An assortment of solutions has already been presented in several Master's and Bachelor's theses, one example being the snake-like "traloc" search and rescue robot. The key to this research, however, is not to rest on past achievements, but to keep exploring different approaches, which in turn can lead to solutions that solve the same problem better. So while the concept, defined as the "development of a robot with maximum mobility and maneuverability in indoor areas," might have had a familiar ring, it was also very new. The task also required that the robot be capable of, first, traversing steps, second, turning on the spot and, third, accessing tight spaces thanks to a compact design.

Nine students came together and founded a team, which they called Ascento, a reference to climbing stairs. As the technological basis, the team decided on a system that balances on two wheeled legs. This was intended to allow for a compact design and turning in tight spaces. One problem still remained: How can something climb stairs on two wheels? By jumping! That was easier said than done. In combination with the balancing system, jumping proved to be the greatest challenge, as there are hardly any robots with a comparable configuration. As a result, the team was venturing into entirely new territory in terms of jumping dynamics.
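
To give a feel for the balancing principle, the following minimal sketch stabilizes a strongly simplified wheeled inverted pendulum with a PD feedback law. It is not the Ascento team's actual controller; the dynamics are reduced to a single pitch angle, and all masses, lengths and gains are illustrative assumptions.

```python
# Minimal sketch (not the Ascento controller): PD stabilization of a
# simplified inverted-pendulum-like body on wheels.
# All parameters (mass, length, gains) are illustrative assumptions.
import math

g, L = 9.81, 0.4          # gravity [m/s^2], assumed body length [m]
m = 10.0                  # assumed body mass [kg]
Kp, Kd = 60.0, 12.0       # assumed PD gains
dt = 0.005                # integration step [s]

theta, omega = 0.2, 0.0   # initial pitch angle [rad] and pitch rate [rad/s]

for step in range(2000):
    # Corrective wheel torque proportional to pitch error and pitch rate
    u = Kp * theta + Kd * omega
    # Simplified pitch dynamics: gravity tips the body, the torque counteracts it
    alpha = (g / L) * math.sin(theta) - u / (m * L * L)
    omega += alpha * dt
    theta += omega * dt

print(f"final pitch: {theta:.4f} rad")   # settles close to 0, i.e. upright
```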

Jumping alone was also not enough. In order to move autonomously over unfamiliar terrain, the robot needs information about its environment, including the objects present in it. The team's plan called for a computer with connected sensors and high-end cameras. More specifically, two industrial cameras act as a stereo camera pair and record the surroundings in three dimensions. Since no maps are available for buildings destroyed in a disaster, the robot has to be capable of determining its position and creating a map at the same time, a classic chicken-and-egg problem.
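
The sketch below illustrates the basic idea behind three-dimensional recording with a stereo pair: matching points between the left and right image yields a disparity, from which depth follows via Z = f · B / d. The file names, focal length and baseline are placeholder assumptions, not values from the project.

```python
# Sketch only: depth from a rectified stereo pair using OpenCV's block matcher.
# Image files, focal length and baseline are placeholder assumptions.
import cv2
import numpy as np

left  = cv2.imread("left.png",  cv2.IMREAD_GRAYSCALE)   # hypothetical left frame
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)   # hypothetical right frame

# Block matcher computes a per-pixel disparity between the two views
stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point to pixels

# Depth from similar triangles: Z = f * B / d
focal_px   = 480.0    # assumed focal length in pixels
baseline_m = 0.12     # assumed distance between the two cameras in metres
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = focal_px * baseline_m / disparity[valid]
```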

This problem occurs more frequently in everyday life than one might think: robot mowers, robot vacuums, aerial surveillance with unmanned drones, rovers in space exploration, reef monitoring, mine exploration, etc. As such, many scientists have worked on this problem and developed SLAM (Simultaneous Localization and Mapping) algorithms as a way of determining position and creating a map at the same time. SLAM, in combination with the cameras and an inertial measurement unit (IMU) for detecting motion, made it possible for the ETH Zurich students to pinpoint the robot in space based on visual data. In addition, the cameras plot the robot's surroundings in the form of sparse maps, digital maps that store only isolated, distinctive reference points. This in turn makes it possible to recognize routes that have already been traversed.
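
The following sketch shows, in a very reduced form, how such distinctive reference points can be stored and re-matched to recognize a previously traversed route. It is an illustration of the general idea rather than the team's pipeline; the feature type (ORB), file names and thresholds are assumptions.

```python
# Illustrative sketch (not the Ascento pipeline): sparse keypoints as map
# landmarks, re-matched against the current frame to recognise a known place.
import cv2

orb = cv2.ORB_create(nfeatures=500)
bf  = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def extract(image_path):
    """Detect distinctive reference points and their descriptors in one frame."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    return orb.detectAndCompute(img, None)

# Sparse "map": descriptors of a keyframe collected while driving
_, map_descriptors     = extract("keyframe_earlier.png")  # hypothetical keyframe
_, current_descriptors = extract("frame_now.png")         # hypothetical current frame

matches = bf.match(map_descriptors, current_descriptors)
good = [m for m in matches if m.distance < 40]            # assumed Hamming threshold

# Many consistent matches suggest the robot is revisiting a known route
if len(good) > 30:
    print("Place recognised: this route has been traversed before.")
```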

For the cameras, the students are using two mvBlueFOX-MLC200wG industrial cameras with a USB 2.0 interface from MATRIX VISION. The compact cameras take up hardly any space and were therefore easy to integrate. Thanks to the digital interface, simultaneous (stereo) recording from both cameras could be ensured. A resolution of 752 x 480 pixels and a frame rate of 93 frames per second, together with 8 Mpixels of image memory, ensured a sufficiently large field of view and reliable image acquisition without image loss.
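
As a rough illustration of paired (stereo) acquisition, the sketch below grabs one frame from each of two cameras as closely together as possible. OpenCV's generic VideoCapture is used here only as a stand-in for the camera vendor's SDK, and the device indices are assumptions.

```python
# Generic sketch of grabbing a stereo pair; VideoCapture stands in for the
# vendor SDK, and the device indices below are assumptions.
import cv2

cam_left  = cv2.VideoCapture(0)   # hypothetical device index of the left camera
cam_right = cv2.VideoCapture(1)   # hypothetical device index of the right camera

# Trigger both grabs back to back first, then retrieve the images, to keep the
# pair as close to simultaneous as this generic API allows.
cam_left.grab()
cam_right.grab()
ok_l, frame_left  = cam_left.retrieve()
ok_r, frame_right = cam_right.retrieve()

if ok_l and ok_r:
    print(frame_left.shape, frame_right.shape)

cam_left.release()
cam_right.release()
```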