Engineers from Purdue University’s Bio-Robotics Lab have designed a robotic hummingbird that approximates the flight of the real thing with surprising accuracy. The tiny flying robot was ‘trained’ using machine learning techniques that allow it to hover and change direction in mid-flight. The AI, coupled with the robot’s wings, enables it to navigate not by seeing its surroundings but by touch: contact with a surface alters an electric current, which the robot can detect.
“The robot can essentially create a map without seeing its surroundings. This could be helpful in a situation when the robot might be searching for victims in a dark place — and it means one less sensor to add when we do give the robot the ability to see.” — Xinyan Deng, associate professor of mechanical engineering
The hummingbird robot was designed with a 3D-printed body and two direct-drive motors, one controlling each wing, which can change their flapping motion more than 30 times per second. The wings themselves are made of carbon fiber and laser-cut membranes, allowing them to double as navigation sensors: the motors detect changes in wing kinematics, so the robot maps its surroundings by making contact with surfaces.
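To make the touch-sensing idea concrete, here is a minimal sketch of how contact might be inferred from motor-current deviations. This is an illustrative assumption, not the Purdue team’s actual implementation: the threshold, signal names, and the simple ratio test are all hypothetical stand-ins for whatever model the real controller uses.

```python
def detect_contact(expected_current, measured_current, threshold=0.15):
    """Hypothetical contact test: flag contact when the measured motor
    current deviates from the free-flight expectation by more than a
    fixed fraction (threshold is an illustrative value)."""
    deviation = abs(measured_current - expected_current) / expected_current
    return deviation > threshold

def map_contacts(samples, threshold=0.15):
    """Return the indices of samples where contact is inferred.
    Each sample is a (expected_current, measured_current) pair,
    standing in for one wingbeat's worth of telemetry."""
    return [i for i, (exp, meas) in enumerate(samples)
            if detect_contact(exp, meas, threshold)]

# Example: three free-flight wingbeats and one where the wing
# brushes a surface, spiking the motor current.
samples = [(1.0, 1.02), (1.0, 1.05), (1.0, 1.40), (1.0, 0.98)]
print(map_contacts(samples))  # -> [2]
```

In a real system the expected current would come from a learned model of free-flight wing kinematics rather than a constant, but the principle is the same: a surface contact shows up as a deviation the motors can measure, so no camera is needed.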
For now, the robotic hummingbird is tethered to a power source as it flies, but the engineers plan to outfit it with a rechargeable battery, GPS, a camera and additional sensors. Beyond search-and-rescue operations, the engineers speculate the hummingbird could be used for covert operations, since the robot makes about the same amount of noise as the real bird. It may also help scientists study hummingbirds through the senses of a robotic counterpart. You can see even more on their GitHub page.