Machine vision systems find a wide range of applications, including self-driving cars, object detection, crop monitoring and smart cameras. Such systems are often inspired by the vision of biological organisms. For example, human and insect vision have inspired terrestrial artificial vision, while fish eyes have led to aquatic artificial vision. While progress has been remarkable, current artificial vision systems suffer from certain limitations: they are not suitable for imaging both terrestrial and underwater environments, and they are limited to a hemispherical field of view (FOV) of 180°.
To overcome these problems, a group of Korean and American researchers, including Professor Young Min Song from the Gwangju Institute of Science and Technology in Korea, have now designed a new machine vision system with omnidirectional imaging capability, which can work in both aquatic and terrestrial environments. Their study was published in Nature Electronics on July 11, 2022, and posted online on July 12, 2022.
“Bio-inspired vision research often results in novel developments that did not exist before. This, in turn, allows for a deeper understanding of nature and ensures that the imaging device developed is both structurally and functionally efficient,” says Professor Song, explaining the motivation behind the study.
The inspiration for the system came from the fiddler crab (Uca arcuata), a species of semi-terrestrial crab with amphibious imaging capability and a 360° field of view. These remarkable features result from the ellipsoidal eye stalks of the fiddler crab’s compound eyes, which allow panoramic imaging, and from flat corneas with a graded refractive index profile, which allow amphibious imaging.
Accordingly, the researchers developed a vision system consisting of a flat micro-lens array with a graded refractive index profile, integrated into a comb-shaped flexible silicon photodiode array and then mounted on a spherical structure. The graded refractive index and flat surface of the micro-lenses were optimized to compensate for defocus effects caused by changes in the external environment. Simply put, light rays traveling through different media (with different refractive indices) were made to focus at the same point.
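To see why defocus compensation is needed at all, consider a conventional curved lens surface: its focusing power depends on the refractive index of the medium outside it, so the focal point shifts when the same lens is moved from air into water. The sketch below illustrates this with the textbook formula for a single spherical refracting surface; the numbers and function are purely illustrative and are not taken from the study.

```python
# Illustrative sketch (not the authors' model): the focal length of a single
# convex refracting surface depends on the refractive index of the outside
# medium, which is why an ordinary lens defocuses when moved from air to water.
# A graded-index flat lens, as used in the study, is designed so the focus
# stays at the same point in both media.

def surface_focal_length(radius_mm: float, n_lens: float, n_outside: float) -> float:
    """Focal length (inside the lens material) of one spherical surface.

    Uses the single-surface power formula P = (n_lens - n_outside) / R,
    with f = n_lens / P. All values are hypothetical example numbers.
    """
    return n_lens * radius_mm / (n_lens - n_outside)

f_air = surface_focal_length(radius_mm=2.0, n_lens=1.5, n_outside=1.000)    # in air
f_water = surface_focal_length(radius_mm=2.0, n_lens=1.5, n_outside=1.333)  # under water

print(f"Focal length in air:   {f_air:.1f} mm")
print(f"Focal length in water: {f_water:.1f} mm")
```

The large shift in focal length between air and water is the defocus effect the flat graded-index micro-lenses are engineered to eliminate.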
To test the capabilities of their system, the team performed optical simulations and imaging demonstrations in air and in water. Amphibious imaging was tested by submerging the device halfway in water. To their delight, the images produced by the system were clear and free of distortion. The team further showed that the system had a panoramic visual field of 300° horizontally and 160° vertically, both in air and in water. Additionally, the spherical mount was only 2 cm in diameter, making the system compact and portable.
“Our vision system could pave the way for 360° omnidirectional cameras with applications in virtual or augmented reality or all-weather vision for autonomous vehicles,” Professor Song enthusiastically speculates.
Story Source:
Material provided by GIST (Gwangju Institute of Science and Technology). Note: Content may be edited for style and length.