Engineers from Italy and the UK have developed an algorithm that lets four-legged robots plan their steps from visual sensor data. Because the algorithm runs in real time, the robot can react to pushes and other disturbances that arise after a leg has already been raised, report the authors of the work, which will be presented at the IROS 2018 conference.
Many four-legged robots compute only an approximate trajectory while walking: they lower each leg toward the surface blindly and stop it upon detecting contact. In many cases this is sufficient, but if the surface contains deep holes, the robot will simply step into one while lowering its leg in anticipation of contact. More sophisticated robots that rely on visual data avoid such situations, but almost all of them share another drawback: the leg trajectory is computed before each step, so the movement succeeds only if the robot encounters no external disturbances, such as a push to the side, during the step.
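The "blind" strategy described above can be sketched as a simple control loop. This is a hypothetical illustration, not the control code of any actual robot: the function names, thresholds, and step sizes are assumptions chosen for clarity. The key point is that contact is detected only by force feedback, so a deep hole is discovered only after the leg has run out of safe travel.

```python
# Hypothetical sketch of blind foot placement: descend along the planned
# trajectory until a contact force is sensed, or give up after a safe depth.

def lower_leg_blind(read_contact_force, move_foot_down,
                    contact_threshold=5.0, step_mm=1.0, max_depth_mm=150.0):
    """Lower a foot until the contact force exceeds a threshold.

    Returns the depth (mm) at which contact was made, or None if the foot
    ran out of safe travel without touching anything (e.g. over a hole).
    """
    depth = 0.0
    while depth < max_depth_mm:
        if read_contact_force() >= contact_threshold:
            return depth  # surface found: plant the foot here
        move_foot_down(step_mm)
        depth += step_mm
    return None  # no contact within the safe range: terrain hazard
```

A robot using only this loop cannot avoid the hole in advance; it can only notice, too late, that no surface arrived where one was expected, which is exactly the failure mode the vision-based planner addresses.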
Claudio Semini and his colleagues from the Italian Institute of Technology and the University of Oxford have developed a system that lets the robot track both the locations of obstacles in its path and the disturbances acting on it during a step, using only onboard sensors and a computer.
The developers adapted the algorithm for the four-legged robot HyQ, created several years ago. Its legs each have three degrees of freedom and are driven hydraulically. The robot is equipped with a large number of sensors: a gyroscope, an accelerometer, and leg-segment position sensors for collecting data about itself, as well as a depth camera and a lidar for tracking the environment.
Data from these sensors is fed to the motion planner, which includes a convolutional neural network that, in real time, segments the terrain ahead into zones the robot can step into and zones that are dangerous for stepping. Since the latency of collecting data from the sensors is about a millisecond, and the neural network processes it in 0.1 milliseconds, the robot can not only plan steps in a normal environment but also change a foot's trajectory if it is pushed or pulled, even while the foot is already descending toward the surface.

Recently, another group of engineers created a motion-planning algorithm for a bipedal robot that lets it walk over obstacles of different heights, using fast but more complex and less stable dynamic movements similar to those used by humans.
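The idea of marking terrain into steppable and dangerous zones can be illustrated with a toy heuristic. This is not the authors' convolutional network, just a minimal stand-in: it takes a heightmap grid (assumed input format) and marks any cell whose height differs too sharply from a neighbour, such as the rim of a hole or a step edge, as unsafe for foot placement.

```python
# Toy traversability marking over a heightmap (stand-in for the CNN in the
# article): a cell is unsafe if any 4-neighbour differs by more than max_step.

def traversability_map(heights, max_step=0.05):
    """Return a grid of booleans: True where a foothold looks safe."""
    rows, cols = len(heights), len(heights[0])
    safe = [[True] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    if abs(heights[r][c] - heights[nr][nc]) > max_step:
                        safe[r][c] = False  # sharp edge or hole nearby
    return safe
```

Because the real system's sensing-plus-inference latency is on the order of a millisecond, such a map can be refreshed many times within a single step, which is what allows a foot trajectory to be retargeted mid-descent after a push.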