Dr Richard Musgrove is a freelance science and business writer based in Far North Queensland. He has been a Senior Researcher at the South Australian Research and Development Institute, and Operations Manager at Northern Gulf Resource Management Group, in Far North Queensland. His career has taken him to deep-sea trawlers in the Great Australian Bight, Tokyo’s Tsukiji Fish Market, and the Memorial University of Newfoundland, Canada, as well as cattle stations in FNQ’s Dry Tropics and the stage at South Australian Science Communicators.
Flies collect information 4 times faster than we do, our vain fly-swatting efforts appearing in slow motion to that fly on the salad, albeit at lower resolution.
“Although they operate faster, and at lower resolution, they’re still able to do a huge variety of tasks much better than anything that our current systems can,” says Professor Russell Brinkworth of Flinders University, Adelaide.
“Understanding how they do it can help us to build better machines that use insect features to sense, and more efficiently navigate, natural and built environments,” he says.
We’ve all seen insect eyes portrayed in movies (The Fly, Monsters Vs Aliens). Bizarrely beautiful, almost other-worldly, such compound eyes are the oldest and most dominant vision system on Earth, used by 75% of all animals, including 10 million species of insects. And Australian robotics researchers are tapping this well of visual expertise to make smarter machines. 
We can read black writing on white paper in the sun and in the shade because our eyes adapt — something we take for granted. So can insects — not read, as far as we know — but they can certainly perceive contrast under different lighting conditions. “All biological eyes can do this,” says Brinkworth, “but to a camera, writing disappears in the glare or into the dark.”
Even now, small insects outshine most cameras at these tasks, which is why Brinkworth and his students are using insect-eye models to make camera systems that recognise subtle differences and small contrast changes, allowing us to decipher our complex environments. That’s the key — not trying to capture the perfect image, says Brinkworth.
Hoverfly eyes, and what they do with them, focus Brinkworth’s attention. Adult hoverflies are nectar and pollen feeders, and pollinators. Small and often colourful, you’ll see them floating through gardens around the world. Their eyes may be more than 20% of their body’s mass — imagine having eyes the size of watermelons!
Consummate hoverers and acrobats, these little flies understand their world through optic flow — the picture of the hoverfly’s world moving across its eye as it passes through it. This is not detection per se, but the flow of information past the eye, says Brinkworth. Relative velocity and position — and time to impact — are critical.
“Optic flow is effectively the ratio of speed to distance,” says Professor Matt Garrett of UNSW Canberra. The closer the object, the faster it appears to move, relative to you. Imagine you’re travelling to Byron Bay in your car. Look out the side window — that kangaroo grazing beside the road is ‘approaching’ faster than the semi-trailer raising dust on a side-road in the distance. Hoverflies estimate where objects are through optic flow — relatively fast or getting faster means less time to impact.
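The speed-to-distance ratio can be sketched in a few lines of code. This is an illustrative toy, not the researchers’ model; the function names and the example speeds and distances are invented.

```python
# Toy sketch of optic flow as the ratio of speed to distance.
# All names and numbers here are illustrative, not from the research.

def angular_velocity(lateral_speed_mps: float, distance_m: float) -> float:
    """Apparent angular speed (rad/s) of an object moving past the observer.

    Optic flow is effectively speed divided by distance: a nearby object
    sweeps across the eye faster than a distant one moving at the same speed.
    """
    return lateral_speed_mps / distance_m

def time_to_impact(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact for an object approaching head-on."""
    return distance_m / closing_speed_mps

# A kangaroo 10 m away and a semi-trailer 500 m away, both passed at 25 m/s:
print(angular_velocity(25, 10))   # 2.5 rad/s -- streaks across the view
print(angular_velocity(25, 500))  # 0.05 rad/s -- barely seems to move
```

The same ratio gives a time-to-impact estimate: an object 100 m away closing at 25 m/s hits in 4 seconds, with no need to know its absolute size or position.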
Brinkworth studies hoverfly vision to make better sensors for detecting fine-scale changes in the environment, such as unauthorised drones at airports and military sites. Trials at Woomera in South Australia showed prototypes could “spot incoming objects on a direct collision course coming immediately over the horizon directly towards the camera, when they’re smaller than a single pixel,” says Brinkworth. “From the ground or from a drone,” he adds.
By reverse-engineering hoverfly abilities, they built cameras able to detect objects camouflaged against messy backgrounds — “slight, subtle lighting and contrast differences against different backgrounds and combinations.”
Small contrast differences were amplified, and movement and lighting changes rapidly detected. Essentially, they were able to separate the signal they wanted, from the noise they didn’t.
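One simple way to pull a small change out of a steady background is to subtract a slowly adapting baseline from each sample, so the unchanging background fades while fast, subtle changes stand out. This is only a minimal sketch of that general idea — the insect-inspired models in the article are far more sophisticated, and the smoothing factor here is arbitrary.

```python
# Minimal sketch: amplify small changes by subtracting an adapting baseline.
# The insect-eye models described in the article are far more sophisticated.

def adaptive_contrast(samples, alpha=0.05):
    """Return each sample minus an exponentially adapting baseline."""
    baseline = samples[0]
    out = []
    for s in samples:
        out.append(s - baseline)            # fast change vs adapted background
        baseline += alpha * (s - baseline)  # baseline slowly tracks the input
    return out

# A steady bright background with one small dip: the dip dominates the output.
signal = [100.0] * 5 + [98.0] + [100.0] * 4
print(adaptive_contrast(signal))
```

The steady 100-level background maps to outputs near zero, while the brief dip to 98 produces a clear -2.0 spike — the signal separated from the background it was hiding in.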
Your peripheral vision is unfocussed, like an insect’s, but it’s enough to save your life: you start to cross a road, suddenly sense a car coming out of the ‘corner’ of your eye, and step back without thinking or focussing. Brinkworth’s technology switches from this low-resolution but really fast, hoverfly-like vision to a focussed (‘foveal’ in biology) mode, to get as clear a picture of the object as possible — using acoustic or optical (i.e. camera) sensors (including infra-red) or both. Camera frame rates are 50-100 frames per second.
Similarly, collaborators Professor Matt Garrett, Dr Sridhar Ravi of UNSW Canberra and Professor Mandyam Srinivasan of UQ are using the honey bee’s ability to visually navigate complex environments to develop autonomous miniature drones for use in precision agriculture, search and rescue, wildlife monitoring and war zones.
“Honey bees are excellent long-distance flyers and can be trained for experiments”, says Ravi. “Studying their responses to environmental manipulations allows us to better understand their vision systems.”
Applying honey bee skills to miniature drones involves working out “how bees solve the problem of navigating in completely new environments,” said Ravi, then: “What does that look like from a sensor standpoint and how do the algorithms work?” “This could be applied to a whole suite of other platforms, not just miniature drones,” he said.
The project follows more than 20 years of research, says Garrett: from getting drones to take off, hover and land using visual sensors alone — no lasers or GPS — to now moving forward through an obstacle course. Current hypotheses are tested using a 2 kg eight-rotor drone, paired with honey bee experiments. Each step builds on the last and brings the team closer to its goal. “It’s two-way communication — we’re using the animals to inform robotics and using the robotics to better understand animals. So, we’re hoping for that symbiotic transfer of knowledge,” says Ravi.
The miniature drone is fitted with a panoramic (360°) vision system to provide the broad field of view so important to insects’ flight control. Pan-tilt capability provides stability and a second source of optic flow, enabling the drone to move in a straight line, up and down and left and right — providing the versatility and stability of movement needed to take on the results of the honey bee trials.
The combination of miniaturisation and desired applications provides many challenges, including navigation. GPS is ubiquitous but the interest is in environments where GPS doesn’t work well — indoors or in a forest — or in war zones, where it can be jammed, says Garrett. Lasers can also be detected, are heavy and emit radiation. So, the miniature drone’s navigation must rely on a passive, non-GPS, radiation-free system which leaves optic flow. Interestingly, NASA’s Mars helicopter ‘Ingenuity’ uses such vision sensors for stabilisation, says Ravi. 
Once the test drone flies as it should — relying entirely on vision sensors — the miniaturisation challenge will include the panoramic imaging and pan-tilt systems, with electronics possibly taking the place of the latter physical system.
Originally published by Cosmos as Flies and robot eyes