Insect vision influences robotic detection and monitoring systems.
Most computer engineers are a bit scared of biology – but Flinders University robotics expert Associate Professor Russell Brinkworth embraces it. He has reverse engineered insect vision to create new capabilities in robotic detection and monitoring systems for security, defence and energy applications.
Associate Professor Brinkworth says the problem with robots right now is they can’t make sandwiches. Of course, lunch-making robots aren’t a priority for his Autonomous Systems research team at Flinders University. But the analogy of a robot constructing a meal is a useful one. Putting a sandwich together requires a way to see the world (vision), the ability to handle soft materials that come in irregular shapes, the dexterity to perform complex physical manoeuvres, and the judgement to decide on the spot how to proceed.
This is what makes constructing sandwiches more difficult than manufacturing cars, whose parts are rigid and consistent. “I want to bring robots out of the lab and into the real world,” Associate Professor Brinkworth says. “This means creating robotic systems that can operate in a dynamic environment, that can be flexible and that can respond to what’s going on around them.”
For defence applications, for surveillance, for self-driving cars and for tracking real-world natural phenomena, Associate Professor Brinkworth and his colleagues are creating new technologies and robotic systems that are autonomous and adaptable.
Their recent work has focused on creating systems that can detect changes in the environment, such as scanning for unauthorised drones at airports and military sites. Current surveillance systems typically consist of high-grade cameras that collect visual information. Images are analysed by artificial intelligence trained to classify objects based on their appearance – distinguishing a drone from a bird, for example.
“But it’s really resource intensive and relatively slow to scan the entire environment at high resolution all the time,” Associate Professor Brinkworth says. “We’ve developed a system that operates much faster.” The key is an approach based on the biological reality of how eyes work. Low-grade vision is applied as a first-pass scanning tool (equivalent to the peripheral vision that animals and insects use), and the system then shifts to higher-resolution vision (known as foveal vision by biologists) once something new is detected. “It’s a much more efficient way to do surveillance,” Associate Professor Brinkworth says.
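The article does not describe the team's implementation, but the peripheral-then-foveal idea can be illustrated with a minimal sketch: a coarse, downsampled view of each frame is compared against the previous frame, and only the regions that change are extracted at full resolution for closer analysis. The function names, block size and threshold below are illustrative assumptions, not details from the research.

```python
import numpy as np

def downsample(frame, factor=16):
    """Coarse 'peripheral' view: block-average the frame to reduce resolution."""
    h, w = frame.shape[:2]
    h, w = h - h % factor, w - w % factor
    blocks = frame[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

def detect_change(prev_lowres, curr_lowres, threshold=0.1):
    """First pass: flag coarse cells whose brightness changed noticeably."""
    diff = np.abs(curr_lowres - prev_lowres)
    return diff > threshold * 255  # boolean mask of changed cells

def foveate(frame, change_mask, factor=16):
    """Second pass: return full-resolution crops ('foveal' views) of changed cells."""
    crops = []
    for cell_y, cell_x in zip(*np.nonzero(change_mask)):
        y, x = cell_y * factor, cell_x * factor
        crops.append(frame[y:y + factor, x:x + factor])
    return crops

# Hypothetical usage with two synthetic greyscale frames.
prev = np.random.randint(0, 256, (480, 640)).astype(float)
curr = prev.copy()
curr[100:140, 300:340] = 255.0  # simulate a new object entering the scene

mask = detect_change(downsample(prev), downsample(curr))
regions = foveate(curr, mask)
print(f"{int(mask.sum())} coarse cells flagged; "
      f"{len(regions)} high-resolution crops handed to the classifier")
```

In a sketch like this, the expensive classifier (for instance, the bird-versus-drone model mentioned above) only ever sees the small high-resolution crops, which is what makes the two-tier approach cheaper than scanning the whole scene at full resolution.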