This Artificial Eye Could Give Drones Superhuman Awareness

Researchers have developed an insect-scale artificial compound eye that combines ultra-fast motion detection with chemical sensing, an advance that could significantly improve navigation safety and environmental awareness in next-generation drones and autonomous systems.

Scutiphora pedicellata, a species of insect in the jewel bug family. Study: An insect-scale artificial visual-olfactory bionic compound eye. Image Credit: j_fredz/Shutterstock.com

In a study published in Nature Communications, the team introduced a miniature artificial compound eye system that integrates both vision and smell.

By pairing micro-lenses mounted on flexible photodetectors with a printed chemical sensor array, the device achieves wide-field imaging while detecting hazardous gases. Roughly the size of an insect’s eye, it can avoid obstacles and monitor environmental targets at the same time, offering a more perceptive sensing platform for robots and unmanned vehicles.

Background

Artificial compound eyes draw direct inspiration from arthropods, whose visual systems are optimized for wide-angle, low-distortion perception. These qualities are particularly valuable in robotics, where panoramic awareness and rapid motion detection often matter more than fine image detail.

To date, however, efforts to replicate this capability have followed two largely separate paths. One approach integrates microlens arrays with conventional silicon sensors, but this setup typically demands complex optical alignment and can suffer from uneven image quality. The other relies on flexible photodetectors, which better match curved geometries yet have remained too bulky to achieve true insect-scale miniaturization.

As a result, a clear gap has persisted: no system has successfully combined dense, arthropod-scale visual units with additional sensory modalities in a compact format. Addressing this challenge requires not only shrinking the hardware but also integrating multiple sensing mechanisms into a unified architecture.

This study closes that gap.

The researchers fabricated a miniature bionic compound eye (bio-CE) directly onto a curved, flexible surface using advanced laser polymerization. The resulting structure contains 1027 naturally isolated visual units (ommatidia) arranged to mimic biological compound eyes. Crucially, the team also incorporated an olfactory sensor array through inkjet printing, creating a single insect-scale platform capable of fusing visual and chemical information for navigation and environmental monitoring.

Materials and Methods

To realize this design, the team began with a layered microfabrication process. A sacrificial aluminum layer was first deposited onto a silicon wafer, followed by spin-coating a flexible polyimide substrate. Gold electrodes and interconnects were patterned using photolithography and electron beam evaporation, with SU-8 serving as an electrical isolation layer. Once released from the silicon base, the structure was ready for optical integration.

At this stage, a photosensitive layer composed of poly(3-hexylthiophene-2,5-diyl) (P3HT), phenyl C61 butyric acid methyl ester (PCBM), and lead(II) sulfide (PbS) quantum dots was spin-coated, annealed, and encapsulated in polydimethylsiloxane (PDMS). This combination enabled broad spectral sensitivity while preserving flexibility.

With the photodetector array in place, the researchers printed a microlens array directly onto the PDMS surface using femtosecond laser two-photon polymerization. Because each microlens was precisely aligned with an underlying pixel, the design ensured effective optical isolation between neighboring units. The finished device was then mounted onto a 3D-printed cylindrical fixture, completing the visual subsystem.

Parallel to this process, the olfactory component was fabricated using inkjet printing. The colorimetric sensor array incorporated six chemoresponsive indicators, including metalloporphyrins, phthalocyanines, and pH dyes, dissolved in an ethanol–water mixture with polyvinylpyrrolidone.

Upon exposure to target gases, these materials produced measurable color shifts within 30 to 120 seconds. The changes were captured using a flatbed scanner and analyzed through red–green–blue (RGB) signal processing.
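The RGB processing step can be sketched as a simple per-indicator color difference: subtract each spot's pre-exposure RGB values from its post-exposure values and concatenate the results into a response fingerprint. The function names and sample values below are illustrative, not taken from the paper:

```python
def rgb_shift(before, after):
    """Per-channel color change (dR, dG, dB) for one indicator spot."""
    return tuple(a - b for b, a in zip(before, after))

def response_vector(before_spots, after_spots):
    """Concatenate the RGB shifts of all indicator spots into a single
    fingerprint vector characterizing the gas exposure."""
    vec = []
    for b, a in zip(before_spots, after_spots):
        vec.extend(rgb_shift(b, a))
    return vec

# Example: two indicator spots, scanned before and after exposure
before = [(120, 80, 60), (200, 190, 40)]
after = [(100, 95, 60), (180, 190, 70)]
print(response_vector(before, after))  # [-20, 15, 0, -20, 0, 30]
```

With six indicators, the full fingerprint would be an 18-element vector, one (dR, dG, dB) triple per spot, which is what a downstream classifier or regression model would consume.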

To evaluate integrated performance, the bio-CE was connected to an Artix-7 field-programmable gate array (FPGA) for real-time data acquisition, signal processing, and motor control. This configuration enabled closed-loop obstacle avoidance in an unmanned ground vehicle.
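The paper does not spell out the navigation algorithm, but a classic insect-inspired rule balances apparent motion (optic flow) between the left and right halves of the visual field and turns away from the side with stronger flow, since nearer obstacles generate faster image motion. A minimal, hypothetical sketch of that steering rule:

```python
def steer_from_flow(left_flow, right_flow, gain=0.5):
    """Insect-inspired flow-balance steering: turn away from the side
    with stronger optic flow (i.e. the nearer obstacle).
    Returns a turn command in [-1, 1]; positive means turn right."""
    total = left_flow + right_flow
    if total == 0:
        return 0.0  # no motion signal, hold course
    # Normalized imbalance: stronger flow on the left pushes the command right
    turn = gain * (left_flow - right_flow) / total
    return max(-1.0, min(1.0, turn))

# An obstacle close on the left produces strong left-field flow
print(steer_from_flow(left_flow=8.0, right_flow=2.0))  # 0.3 (steer right)
```

In the real system this loop would run on the FPGA at the sensor's native rate, with the turn command driving the vehicle's motor controller directly.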

In aerial tests, a drone hovered at fixed points for 60 seconds while logging global positioning system (GPS) coordinates and sensor outputs. A pretrained machine learning model translated color variations into gas concentration estimates, allowing spatial chemical mapping after flight.
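The pretrained model itself is not published, but the post-flight mapping step can be illustrated with a toy linear calibration that converts the magnitude of a color shift into a concentration estimate and tags it to the logged GPS position. The slope and intercept below are made-up demonstration values, not calibration data from the study:

```python
def color_shift_magnitude(delta_rgb):
    """Euclidean norm of one indicator's (dR, dG, dB) response."""
    return sum(d * d for d in delta_rgb) ** 0.5

def estimate_ppm(delta_rgb, slope=0.25, intercept=0.0):
    """Toy linear calibration: concentration (ppm) ~ slope * |dRGB| + intercept.
    A real model would be fit on labelled exposures across all six indicators."""
    return slope * color_shift_magnitude(delta_rgb) + intercept

# Tag each logged GPS waypoint with an estimated concentration after flight
waypoints = [((31.23, 121.47), (-3, 4, 0)),   # (lat, lon), dRGB at that point
             ((31.24, 121.48), (-6, 8, 0))]
gas_map = [(pos, estimate_ppm(d)) for pos, d in waypoints]
print(gas_map)  # [((31.23, 121.47), 1.25), ((31.24, 121.48), 2.5)]
```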

Results and Performance Evaluation

The completed bio-CE demonstrated how tightly integrated microfabrication and biological inspiration could yield a compact yet capable sensing platform. The curved device housed 1027 microscopic lenses, printed directly onto a custom organic photodetector array. This architecture reproduced several defining features of a fly’s eye, including natural optical isolation between units, a 180-degree field of view, and a 1 kilohertz (kHz) flicker fusion frequency that supported extremely rapid motion detection.
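A 1 kHz flicker fusion frequency means the sensor can resolve intensity changes one millisecond apart. The simplest way to exploit that temporal resolution is frame differencing: flag any 1 ms interval where pixel values change by more than a threshold. This hedged sketch uses invented frame data purely to show the principle:

```python
def detect_motion(frames, threshold=10):
    """Return the indices of frames where the mean absolute pixel change
    relative to the previous 1 ms frame exceeds the threshold."""
    events = []
    for i in range(1, len(frames)):
        prev, cur = frames[i - 1], frames[i]
        diff = sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur)
        if diff > threshold:
            events.append(i)
    return events

# Three 4-pixel frames sampled at 1 kHz; motion appears in frame 2
frames = [[50, 50, 50, 50],
          [50, 50, 50, 50],
          [50, 120, 120, 50]]
print(detect_motion(frames))  # [2]
```

At 1,000 frames per second, even a fast-moving obstacle registers as a change within a millisecond or two, which is what makes this style of sensor attractive for collision avoidance.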

Importantly, visual performance was only part of the system’s capability. The integrated olfactory array expanded its perceptual range by detecting hazardous gases through distinct color-change reactions. Quantum dot–enhanced photodetectors provided sensitivity from ultraviolet to infrared wavelengths and responded in just 0.1 milliseconds (ms). Anti-fogging microstructures further helped maintain stable operation in humid environments, reinforcing the device’s practical robustness.

These capabilities translated effectively into real-world demonstrations. When mounted on an omnidirectional unmanned vehicle, the bio-CE enabled real-time obstacle avoidance using insect-inspired navigation algorithms.

In drone-based experiments, the system simultaneously tracked a moving light source in three-dimensional space and classified volatile toxic chemicals with 93% accuracy. Together, these tests illustrated the value of combining high-speed panoramic vision with chemical awareness in a single compact unit.

That integration, however, involved trade-offs. Because the system prioritized wide-field motion sensitivity and high temporal resolution, its spatial resolution was lower than that of conventional high-density CMOS cameras. While this limited fine-detail reconstruction, it also reduced computational load and supported energy-efficient processing, which is an advantage for small, power-constrained platforms.

Conclusion

Overall, this work presents a cohesive vision–olfactory sensing system at true insect scale, overcoming key fabrication and integration challenges along the way.

By uniting precision laser microprinting, flexible electronics, and inkjet-printed chemical sensors, the bio-CE delivers wide-field motion detection, rapid gas identification, and demonstrated navigation performance within a single lightweight device.

As research continues, refining multimodal data fusion algorithms and pushing miniaturization even further could broaden its applications in autonomous drones, micro-robots, and distributed environmental monitoring systems.

Journal Reference

Wang, J., Wei, S., Qin, N., & Tao, T. H. (2026). An insect-scale artificial visual-olfactory bionic compound eye. Nature Communications. DOI:10.1038/s41467-026-68940-0. https://www.nature.com/articles/s41467-026-68940-0

Citation

Nandi, Soham. (2026, February 12). This Artificial Eye Could Give Drones Superhuman Awareness. AZoRobotics. https://www.azorobotics.com/News.aspx?newsID=16331
