If you’re a drone pilot, you likely know there are certain risks you take on when flying through fog. In some circumstances, the fog may trigger the drone’s obstacle avoidance sensors or prevent it from descending because the aircraft mistakes the fog beneath it for a landing surface. So why on earth are NASA engineers flying drones in fog so thick you can’t see three feet in front of you?
The drones are being flown at a special facility located in New Mexico. There, the fog is produced on demand in a chamber that stretches for 180 feet. Plastic sheets line the walls to trap the fog, whose density is controlled with scientific precision.
The drone, meanwhile, is a test target for the collection of sensors installed at the opposite end of the chamber. These are the same sensors that will be used in Advanced Air Mobility (AAM) vehicles such as urban air taxis in the not-so-distant future.
You see, there won’t be a human pilot onboard these air taxis. Instead, instruments such as optical and infrared cameras, radar, and LiDAR scanners will act as high-tech “eyes,” helping the aircraft to take off, fly, and land safely. This is why it’s imperative the designers of unpiloted passenger aircraft know how their sensors would be impacted by fog.
For perception technology, fog is an extreme environment. But for commuters, it’s calm and common enough weather that they’ll expect to fly in it. As Nick Cramer, a research engineer at NASA’s Ames Research Center, explains:
“Each sensor has its strengths and weaknesses, and they’re affected by fog to different degrees. We don’t know which will end up on these vehicles, so we are testing a suite of sensors in the chamber to quantify their pros and cons.”
Take LiDAR scanners, for example. The laser pulses the device emits can reflect off the water droplets in fog instead of the objects they’re meant to detect. Other sensors degrade in different ways.
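To get a rough feel for how badly droplet scattering hurts, here’s a minimal sketch using a simplified lidar equation with Beer-Lambert extinction. The extinction coefficients and target reflectivity are illustrative assumptions, not measurements from the chamber:

```python
import math

def lidar_return_power(range_m: float, extinction: float,
                       reflectivity: float = 0.3, p0: float = 1.0) -> float:
    """Relative LiDAR return power via a simplified lidar equation.

    The pulse is attenuated by exp(-extinction * range) on the way out
    and again on the way back, and the return falls off as 1/range^2.
    `extinction` (1/m) stands in for fog density; values are illustrative.
    """
    two_way_loss = math.exp(-2.0 * extinction * range_m)
    return p0 * reflectivity * two_way_loss / (range_m ** 2)

# Compare clear air to a dense fog for a target 30 m away.
clear = lidar_return_power(30.0, extinction=0.001)  # near-clear air (assumed)
foggy = lidar_return_power(30.0, extinction=0.15)   # dense fog (assumed)
print(f"return power drops by a factor of {clear / foggy:,.0f} in fog")
```

With these assumed numbers, the return from the same target is thousands of times weaker in dense fog. The droplets also backscatter light of their own, which can show up as false returns close to the sensor, one more reason to test across a range of fog densities and distances.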
In one of the tests conducted at the fog chamber at Sandia National Laboratories in Albuquerque, researchers measured how well sensors could detect the drone – or its warm motors, in the case of an infrared camera – from different distances through different levels of fog.
Creating these different levels of fog is no mean feat either. To produce fog with larger droplets, for example, sprinklers fitted on the ceiling release a mixture of water and salt. “It’s actually difficult to get just right,” says Jeremy Wright, an optical engineer at Sandia. “Depending on the conditions, the water droplets want to either condense more water out of the air and grow or give water back to the air as humidity.”
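For a sense of the physics Wright is describing, here’s a minimal sketch based on textbook Köhler theory, which gives the saturation ratio at which a salt-solution droplet is in equilibrium with the surrounding air. The temperature, droplet radius, and salt mass below are illustrative assumptions, not values from the Sandia chamber:

```python
import math

# Physical constants (SI); droplet and salt-mass values are illustrative.
SIGMA = 0.072    # surface tension of water, N/m
M_W   = 0.018    # molar mass of water, kg/mol
M_S   = 0.0585   # molar mass of NaCl, kg/mol
RHO_W = 1000.0   # density of water, kg/m^3
R_GAS = 8.314    # gas constant, J/(mol*K)

def equilibrium_saturation(r: float, t_kelvin: float, salt_mass: float) -> float:
    """Köhler curve: equilibrium saturation ratio over a salt-solution droplet.

    The curvature (Kelvin) term A/r raises the equilibrium value; the
    dissolved-salt (Raoult) term B/r^3 lowers it. A droplet grows when
    the ambient saturation exceeds this value and evaporates otherwise.
    """
    a = 2.0 * SIGMA * M_W / (R_GAS * t_kelvin * RHO_W)               # Kelvin term
    b = 3.0 * 2.0 * salt_mass * M_W / (4.0 * math.pi * RHO_W * M_S)  # Raoult term, van't Hoff ~2 for NaCl
    return math.exp(a / r - b / r ** 3)

# A 2-micron droplet on an assumed 1e-16 kg salt nucleus at 20 degrees C:
s_eq = equilibrium_saturation(2e-6, 293.15, 1e-16)
print(f"equilibrium saturation: {s_eq:.5f}")  # droplet grows if ambient S exceeds this
```

The salt pulls the equilibrium slightly below 100% relative humidity, which is why a salty droplet “wants” to keep condensing water even in air that isn’t quite saturated, the balancing act Wright’s team has to get just right.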
Data for the next generation of air taxi sensors
But the tests are essential. Studying how far and how well today’s technology can see in foggy weather will help answer how safe an aircraft relying on these sensors would be.
NASA says it will release the data for use by companies and researchers working to develop information processing techniques and improve sensors for AAM vehicles. The data, it’s hoped, will help to build accurate computer simulations, discover new challenges, and validate the technology for flight.
In the meantime, NASA will continue to research how best to use aircraft sensors. With different strengths and weaknesses, optical, radar, LiDAR, and other systems are complementary. Fusing their outputs in the smartest ways will help make the market opened by AAM a safe, productive reality.
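As a toy illustration of why fusion helps, here’s a minimal inverse-variance weighting sketch, one classic fusion rule rather than anything NASA has announced. The idea is that a sensor degraded by fog reports a larger uncertainty and automatically counts for less in the combined estimate:

```python
from typing import List, Tuple

def fuse_estimates(readings: List[Tuple[float, float]]) -> Tuple[float, float]:
    """Inverse-variance fusion of independent range estimates.

    Each reading is (range_m, variance). Noisier sensors get smaller
    weights, so a fog-degraded LiDAR contributes less than a radar
    that barely notices the droplets.
    """
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    fused = sum(w * r for (r, _), w in zip(readings, weights)) / total
    return fused, 1.0 / total  # fused estimate and its variance

# Illustrative readings for one target in dense fog (assumed values):
readings = [
    (31.0, 9.0),   # LiDAR: droplet scatter inflates its variance
    (29.8, 0.5),   # radar: longer wavelengths pass through fog
    (30.5, 4.0),   # infrared camera: partially attenuated
]
estimate, variance = fuse_estimates(readings)
print(f"fused range: {estimate:.1f} m (variance {variance:.2f})")
```

The fused estimate sits close to the radar’s reading here because the radar is the most trustworthy sensor in fog, yet the other sensors still tighten the overall uncertainty, exactly the kind of trade-off the chamber data should help quantify.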