Fiddler crab-inspired AI vision system could find use in autonomous aircraft
The fiddler crab’s eye structure has inspired researchers at the Gwangju Institute of Science and Technology to develop a novel artificial vision system that has potential for use in autonomous aircraft.

Artificial vision systems are used in motion sensing, object detection, and self-driving vehicles, and they are often inspired by the vision of a biological organism. Human and insect vision have inspired terrestrial artificial vision, for example, while fish eyes have led to aquatic artificial vision.

However, current artificial vision systems cannot image both land and underwater environments, and they are also limited to a hemispherical (180-degree) field of view (FOV). The fiddler crab has a 360-degree FOV, giving it omnidirectional imaging ability, and its eyes work in both aquatic and terrestrial environments.

Professor Young Min Song, Gwangju Institute, said: “Research in bio-inspired vision often results in a novel development that did not exist before. This, in turn, enables a deeper understanding of nature and ensures that the developed imaging device is both structurally and functionally effective.”

The fiddler crab, Latin name Uca arcuata, is a semi-terrestrial crab species. Its remarkable vision stems from two features of its compound eyes: ellipsoidal eye stalks, which enable panoramic imaging, and flat corneas with a graded refractive index profile, which allow amphibious imaging.

The researchers developed a system consisting of an array of flat micro-lenses with a graded refractive index profile, integrated into a flexible comb-shaped silicon photodiode array and then mounted onto a spherical structure. The graded refractive index and the flat surface of the micro-lenses were optimised to offset the defocusing effects caused by changes in the external medium.

The system’s capabilities were tested through a number of optical simulations and imaging demonstrations in air and water. Amphibious imaging was performed by immersing the device halfway in water, which produced clear, distortion-free images, the researchers reported. The team also showed that the system had a panoramic visual field, spanning 300 degrees horizontally and 160 degrees vertically, in both air and water.

Professor Song added: “Our vision system could pave the way for 360-degree omnidirectional cameras with applications in virtual or augmented reality or an all-weather vision for autonomous vehicles.”