Most of us don’t think much of fruit flies other than as noisy nuisances with their sights set on spoiled food. 

However, according to Jonathan Schneider and Joel Levine, researchers in UTM’s Department of Biology, fruit flies, or Drosophila melanogaster, have a higher capacity for visual comprehension than previously believed.

Schneider, a postdoctoral fellow, and his supervisor, Levine, Chair of UTM’s Biology Department and a senior fellow at the Canadian Institute for Advanced Research (CIFAR) Child & Brain Development program, detailed their research in a paper published in PLOS ONE in October. 

The research was funded by a CIFAR Catalyst grant and conducted in collaboration with Nihal Murali, a machine learning researcher at the University of Guelph’s School of Engineering, and Graham Taylor, a Canada Research Chair in Machine Learning.

Though fruit flies see the world at low resolution, they possess an incredibly layered and organized visual system, including hyperacute photoreceptors. 

Schneider and Levine wanted to determine whether fruit flies, despite this low-resolution visual input, could distinguish individual flies.

To do so, the researchers equipped a machine with 25,000 artificial neurons to mimic the eye of a fruit fly. They then recorded 20 individual flies (10 male, 10 female) for 15 minutes a day over three days using a machine vision camera. From these recordings, they produced standardized images, which they down-sampled to imitate what a fly would actually perceive. 
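For readers curious what that preprocessing step might look like in code, the sketch below shows one way to shrink cropped video frames down to a coarse, fly-scale resolution. It is illustrative only: the folder names, the greyscale conversion, and the 32-by-32 target size are assumptions for the example, not details taken from the paper.

```python
# Illustrative sketch, not the study's actual pipeline.
# Assumes cropped frames of single flies stored as PNGs in ./frames;
# the 32x32 target size is a placeholder for a fly-scale resolution.
from pathlib import Path

from PIL import Image

FLY_EYE_SIZE = (32, 32)  # hypothetical coarse resolution standing in for a fly's eye

def downsample_frame(path: Path, size: tuple = FLY_EYE_SIZE) -> Image.Image:
    """Convert one cropped frame to greyscale and shrink it to fly-eye scale."""
    frame = Image.open(path).convert("L")      # single luminance channel
    return frame.resize(size, Image.BILINEAR)  # coarse image a fly might receive

if __name__ == "__main__":
    out_dir = Path("fly_eye_frames")
    out_dir.mkdir(exist_ok=True)
    for frame_path in sorted(Path("frames").glob("*.png")):
        downsample_frame(frame_path).save(out_dir / frame_path.name)
```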

They showed the images to three observers: ResNet18, a computer vision algorithm without the constraints of the ‘fly eye’ technology; their ‘fly eye’ machine; and human participants. All three were tasked with re-identifying the fly whose images they had been shown.  
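To give a sense of how the unconstrained comparison network might be set up, here is a minimal sketch that repurposes ResNet18 as a 20-way fly-identity classifier in PyTorch. The learning rate, the batch of random tensors, and the single training step are stand-ins; none of these hyperparameters come from the study itself.

```python
# Minimal sketch, assuming PyTorch and torchvision; not the authors' actual code.
import torch
import torch.nn as nn
from torchvision.models import resnet18

NUM_FLIES = 20  # 10 male and 10 female flies, as in the study

# Start from an untrained ResNet18 and give it one output per fly identity.
model = resnet18()
model.fc = nn.Linear(model.fc.in_features, NUM_FLIES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # assumed settings

# Random tensors standing in for standardized video frames and their labels.
frames = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_FLIES, (8,))

logits = model(frames)    # identity scores for each frame
loss = criterion(logits, labels)
loss.backward()           # one illustrative training step
optimizer.step()

print("predicted fly IDs:", logits.argmax(dim=1).tolist())
```

The ‘fly eye’ network described in the paper would differ mainly in its far coarser input and its biologically inspired constraints, which is exactly the comparison the study was designed to make.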

The results indicated that fruit flies can extract meaning from their visual surroundings and even recognize individual fruit flies, something that experienced fly biologists struggle to do. 

“So, when one [fruit fly] lands next to another,” Schneider told Science Daily, “it’s ‘Hi Bob, Hey Alice.’”

The extent of fruit flies’ visual comprehension has implications for their social behaviour, and this study could help researchers learn how the insects communicate. 

These findings are also significant because most programs designed to mimic a human capacity, such as the virtual assistants Siri, Alexa, and Google Assistant, come close to that capacity but rarely go beyond it, as the ‘fly eye’ model does.

Machines like these can bridge the gap between engineers and neurobiologists. The former can use the latter’s findings to make their machines as biologically realistic as possible. 

The latter can use that biological accuracy to hypothesize how visual systems process information and, as Schneider and his colleagues put it, “uncover not just how [fruit flies], but all of us, see the world.”