In December 2018 thousands of holiday travelers were stranded at London’s Gatwick Airport because of reports of drones flying nearby. The airport—one of Europe’s busiest—was shut down for two days, which caused major delays and cost airlines millions of dollars.
Unauthorized drones in commercial airspace have caused similar incidents in the U.S. and around the world. To stop them, researchers are now developing a detection system inspired by a different type of airborne object: a living fly. This work could have applications far beyond drone detection, researchers write in a new paper published in the Journal of the Acoustical Society of America.
“It’s quite awesome,” says Frank Ruffier, a researcher at the Etienne-Jules Marey Institute of Movement Sciences at Aix-Marseille University in France and the French National Center for Scientific Research, who was not involved with the new study. “This basic research on the fly is solving a real problem in computer science.”
That solution has implications for, among other things, overcoming the inherent difficulty of detecting drones. As these remotely piloted flying machines become ever cheaper and more accessible, many experts worry they will become increasingly disruptive. Their prevalence raises a variety of issues, says Brian Bothwell, co-director of the Science, Technology Assessment and Analytics team at the U.S. Government Accountability Office. “Drones can be operated by both the careless and the criminal,” he notes. Careless drone pilots can inadvertently cause accidents; criminal ones can use these devices to smuggle drugs across national borders or drop contraband into prison yards, for example. “It’s important to detect them,” Bothwell says.
But such detection is far from simple. Current systems rely on visual, auditory or infrared sensors, but these technologies often struggle in low-visibility, noisy or signal-cluttered conditions. Solving the problem requires what computer programmers call “salience detection,” which essentially means distinguishing signal from noise.
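A toy example makes the idea concrete. In the sketch below (an illustration only, not the method used in the study), a faint 440-hertz tone is buried in much louder random noise, yet it becomes obvious once each frequency is compared with the typical background level:

```python
# Toy illustration of salience detection: picking a weak tone out of noise.
# A minimal sketch for intuition only, not the study's method.
import numpy as np

rng = np.random.default_rng(0)
fs = 8000                                    # sample rate (Hz), arbitrary for this demo
t = np.arange(fs) / fs                       # one second of audio
tone = 0.1 * np.sin(2 * np.pi * 440 * t)     # faint 440 Hz "signal"
noise = rng.normal(0.0, 1.0, fs)             # much louder background "noise"
x = tone + noise

# A salient frequency is one that stands out against the spectral background.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(fs, 1 / fs)
salience = spectrum / np.median(spectrum)    # ratio to typical background level

peak = freqs[np.argmax(salience)]
print(f"most salient frequency: {peak:.0f} Hz")  # ~440 Hz despite the noise
```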
Now, with some help from nature, a team of scientists and engineers at the University of South Australia, the defense company Midspar Systems and Flinders University in Australia may have found a solution. In their new paper, they demonstrate an algorithm that was designed by reverse engineering the visual system of the hoverfly—a family of mainly black-and-yellow-striped insects known for their habit of hovering around flowers. As anyone who has tried to swat a fly can attest, many of these buzzing pests have incredibly keen vision and fast reaction times. Such abilities stem from their compound eyes, which take in a lot of information simultaneously, and from the neurons that process that information—which turn out to be extremely good at separating relevant signals from meaningless noise. A vast range of animals have visual systems that effectively tune out noise, but the simple brains of flies—and the resulting ease of researching them—make the insects a particularly useful model for computer scientists.
For this study, the researchers examined the hoverfly’s visual system to develop a tool that uses similar mechanisms to clean up noisy data. The filtered information can then be fed into an artificial intelligence algorithm for drone detection. The scientists report that this combination can detect drones up to 50 percent farther away than conventional AI alone. The paper is only a proof of concept for the fly-vision algorithm’s filtering ability, but the team members have built a prototype and are working toward commercialization. Their efforts demonstrate how bio-inspired design can improve passive detection systems.
“This paper is a great example of how much we potentially can learn from nature about information processing,” says Ted Pavlic, associate director of research at Arizona State University’s Biomimicry Center, who was not involved in the new study.
To glean insights from the hoverfly, the team spent more than a decade carefully studying the neuronal pathways of its eyes and measuring their electrical responses to light. Starting with the photosensors in the insects’ large, compound eyes, the engineers traced the circuits through the various layers of neurons and into the brain. They then used that information to construct an algorithm that can detect and enhance the important parts of the data.
But instead of simply feeding visual data into the algorithm, the researchers fed it spectrograms—visual representations of sound—created from acoustic data recorded in an outdoor environment as drones flew by. The algorithm was able to read these squiggly graphs and enhance the important “signal” peaks that corresponded to frequencies emitted by drones. At the same time, it suppressed the surrounding background noise.
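The published model is fit to recordings from hoverfly neurons, but the general shape of such a clean-up stage can be sketched. The version below is a stand-in built on divisive adaptation, a common motif in models of insect photoreceptors: each frequency band is divided by a running estimate of its own background, so steady noise is flattened while tones that rise above the floor stand out. The function name and all parameters here are illustrative assumptions, not the authors’ code:

```python
# Sketch of spectrogram "clean-up" in the spirit of the paper: boost
# time-frequency bins that stand out from their slowly varying background.
# The real model is fit to hoverfly neural data; this divisive-gain stand-in
# and all of its parameters are illustrative assumptions.
import numpy as np
from scipy.signal import spectrogram

def enhance(audio, fs, tau=0.5, eps=1e-8):
    """Return frequencies, times and an enhanced spectrogram of `audio` (sampled at `fs` Hz)."""
    f, t, S = spectrogram(audio, fs=fs, nperseg=1024, noverlap=512)
    dt = t[1] - t[0]
    alpha = dt / tau                      # smoothing factor: tau sets how fast the background adapts
    background = np.empty_like(S)
    bg = S[:, 0].copy()
    for i in range(S.shape[1]):           # running estimate of each band's noise floor
        bg = (1 - alpha) * bg + alpha * S[:, i]
        background[:, i] = bg
    return f, t, S / (background + eps)   # divisive gain: peaks above the floor stand out
```

The design choice mirrors the biology described above: because the background estimate adapts over time, the filter recalibrates itself as conditions change instead of relying on fixed thresholds.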
“It’s really nice because it’s a cleaning-up step, and you can basically add it to any machine-learning pipeline and expect to get a benefit from it,” says Emma Alexander, a computer scientist at Northwestern University, who was not involved in the study.
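In code terms, that modularity might look like the following, where the filter is just a preprocessing stage and the downstream model is untouched. Here `detect_drone`, the crude featurization and `drone_classifier` are all hypothetical:

```python
# How a clean-up stage slots into a detection pipeline: filter first, then
# hand the enhanced features to whatever classifier you already use.
# `enhance` is the sketch above; `drone_classifier` is a hypothetical model.
def detect_drone(audio, fs, drone_classifier):
    _, _, enhanced = enhance(audio, fs)          # bio-inspired preprocessing
    features = enhanced.mean(axis=1)             # toy featurization: per-band average
    return drone_classifier.predict([features])  # unchanged downstream model
```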
In fact, the researchers say they do want to use their bio-inspired algorithm in a variety of applications where artificial intelligence must process information from the real world while dealing with complicated, messy conditions. “We have built a system that can automatically adapt to different environments and enhance the things that are of interest,” says study co-author Russell Brinkworth, a biological engineer at Flinders University.
For example, one of the major challenges that comes with building any AI-based sensing system is getting it to work in a constantly changing environment. “In traditional AI, you can’t just show it a picture of a car. You have to show it a car in every possible situation in which you could see a car,” he explains. “But if the lighting changes or there is a shadow, the AI will say it has never seen it before.” This is one of the big hurdles in designing autonomous vehicles that reliably adjust to changing light and other shifting conditions. With the fly-inspired system, however, this filtering happens automatically.
“Artificial intelligence works best when it’s in a confined environment and it’s controlled,” Brinkworth says. “But biology, on the other hand, works everywhere. If it doesn’t work everywhere, it dies.”