A bio-inspired robot with insect-like vision is under development at the University of Adelaide. In hopes of improving robot visual systems, researchers have modelled the way insects see and track their prey. Insects have a remarkable ability to detect and follow small objects against complex backgrounds, which is no easy task.
In a new paper published in the Journal of The Royal Society Interface, researchers describe how the abilities of both insects and humans can be applied in a virtual reality simulation, allowing an artificial intelligence to ‘pursue’ an object. Lead author of the study, mechanical engineering PhD student Zahra Bagheri, explains the human connection.
“Consider a cricket or baseball player trying to take a match-winning catch in the outfield. They have seconds or less to spot the ball, track it and predict its path as it comes down against the brightly coloured backdrop of excited fans in the crowd – all while running or even diving towards the point where they predict it will fall… Robotics engineers still dream of providing robots with the combination of sharp eyes, quick reflexes and flexible muscles that allow a budding champion to master this skill,” she said.
Dragonflies have excellent vision, making them the key insect for this project. Despite their low visual acuity and tiny brains, they can chase mates or prey in the presence of distractions, such as swarms of other insects. According to Bagheri, a dragonfly chases prey at speeds of up to 60 km/h and captures it with a success rate of over 97%.
How do you convince a robot to view the world like a dragonfly?
A team of neuroscientists and engineers has developed a unique algorithm to emulate the visual tracking system found in flying insects. Instead of trying to center the target in the robot’s field of view, this “active vision” system locks onto the background and waits for the target to move against it. This prevents the background from becoming a distraction and gives the robot time to adjust its gaze, rotating towards the target to keep it front and center.
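The idea can be sketched in a few lines of code. The snippet below is an illustrative simplification, not the team’s published algorithm: the function names, threshold, and pursuit gain are all assumptions. With the gaze locked onto the background, static clutter cancels out when consecutive frames are subtracted, so the target reveals itself as the strongest residual motion; the gaze then rotates a fraction of the way toward it each step.

```python
import numpy as np

def detect_moving_target(prev_frame, frame, threshold=0.5):
    """Find the strongest residual motion between two gaze-stabilised
    frames. Because the gaze is locked onto the background, static
    clutter cancels in the difference image and only the moving
    target remains. (Hypothetical sketch, not the published method.)"""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    if diff.max() < threshold:
        return None  # nothing moved against the background
    return np.unravel_index(np.argmax(diff), diff.shape)  # (row, col)

def pursue(gaze, target, gain=0.3):
    """Rotate the gaze a fraction of the way toward the target each
    step -- a crude smooth-pursuit controller (gain is an assumption)."""
    gaze = np.asarray(gaze, dtype=float)
    target = np.asarray(target, dtype=float)
    return gaze + gain * (target - gaze)

# Toy demo: a cluttered but static background, with one bright spot that moves.
rng = np.random.default_rng(0)
background = rng.random((32, 32))
prev_frame = background.copy()
frame = background.copy()
prev_frame[10, 10] += 2.0   # target at (10, 10) in the first frame
frame[12, 14] += 2.5        # target has moved to (12, 14)

target = detect_moving_target(prev_frame, frame)
gaze = pursue([16.0, 16.0], target)
print(target, gaze)
```

Because the comparison happens against a stabilised background, the detector needs no model of the scene itself, which is one plausible reason such a strategy suits a small brain (or a small processor).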
Dr Steven Wiederman, who is leading the project, is now transferring the algorithm to a hardware platform: a bio-inspired, autonomous robot.