Image sensors measure light intensity, but the angle, spectrum, and other aspects of light must also be extracted to significantly advance machine vision.
In Applied Physics Letters, published by AIP Publishing, researchers from the University of Wisconsin-Madison, Washington University in St. Louis, and OmniVision Technologies highlight the latest nanostructured components embedded on image sensor chips that are most likely to have the greatest impact on multimodal imaging.
These developments could allow autonomous vehicles to see around corners instead of only along a straight line, biomedical imaging to detect anomalies at different tissue depths, and telescopes to see through interstellar dust.
“Image sensors will gradually undergo a transition to become the ideal artificial eyes of machines,” said co-author Yurui Qu, of the University of Wisconsin-Madison. “An evolution leveraging the remarkable achievements of existing imaging sensors is likely to generate more immediate impacts.”
Image sensors, which convert light into electrical signals, are made up of millions of pixels on a single chip. The challenge is how to combine and miniaturize multifunctional components within the sensor frame.
In their own work, the researchers detailed a promising approach to detecting multiband spectra by fabricating an on-chip spectrometer. They deposited photonic crystal filters made of silicon directly above the pixels to create complex interactions between the incident light and the sensor.
The pixels beneath the filters record the distribution of light energy, from which spectral information can be inferred. The device – less than one hundredth of a square inch in size – is programmable to respond to various dynamic ranges, resolution levels and almost any spectral regime, from visible to infrared.
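Inferring a spectrum from filtered pixel readings can be framed as a linear inverse problem: if each filter's transmission spectrum is known from calibration, the pixel readings are a matrix–vector product of transmissions and the incident spectrum, and the spectrum can be recovered by inverting that system. The sketch below is only an illustration of this idea with synthetic, noise-free data; the matrix `T` and the Gaussian test spectrum are hypothetical stand-ins, not the device's measured responses.

```python
import numpy as np

rng = np.random.default_rng(0)

n_bands = 40      # spectral bands to recover
n_filters = 64    # photonic-crystal filters / pixel groups

# Hypothetical transmission matrix: each row is one filter's
# transmission spectrum (in practice, measured during calibration).
T = rng.random((n_filters, n_bands))

# Synthetic incident spectrum: two Gaussian peaks as a stand-in.
bands = np.arange(n_bands)
true_spectrum = np.exp(-(bands - 10) ** 2 / 8) + 0.5 * np.exp(-(bands - 28) ** 2 / 18)

# Each pixel records the total energy passed by its filter.
readings = T @ true_spectrum

# Invert the overdetermined linear system by least squares.
recovered, *_ = np.linalg.lstsq(T, readings, rcond=None)
```

With more filters than bands and distinct filter responses, the least-squares solution recovers the spectrum; real devices must additionally contend with noise and detector nonidealities.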
The researchers also built a component that senses angular information to measure depth and construct 3D shapes at subcellular scales. Their work was inspired by directional hearing in animals such as geckos, whose heads are too small to localize sound the way humans and other large animals do. Instead, they use coupled eardrums to measure the direction of sound at a size an order of magnitude smaller than the corresponding acoustic wavelength.
Similarly, pairs of silicon nanowires were constructed as coupled optical resonators. The optical energy stored in the two resonators is sensitive to the angle of incidence: the wire closest to the light generates the stronger current. By comparing the stronger and weaker currents of the two wires, the angle of the incoming light waves can be determined.
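The comparison step can be sketched numerically: a differential contrast between the two photocurrents maps monotonically to the incidence angle, so a calibrated model of that mapping can be inverted. The response model below (`sin`-shaped, with an arbitrary `coupling` factor) is purely illustrative, not the actual device physics.

```python
import math

def wire_currents(angle_deg, coupling=0.5):
    """Hypothetical photocurrents of a coupled nanowire pair.

    One wire's current rises and the other's falls with the angle
    of incidence; an illustrative monotonic model only.
    """
    a = math.radians(angle_deg)
    i_left = 1.0 + coupling * math.sin(a)
    i_right = 1.0 - coupling * math.sin(a)
    return i_left, i_right

def angle_from_currents(i_left, i_right, coupling=0.5):
    """Recover the incidence angle by comparing the two currents."""
    contrast = (i_left - i_right) / (i_left + i_right)
    return math.degrees(math.asin(contrast / coupling))

# Light arriving at 15 degrees produces unequal currents,
# from which the angle is recovered.
i1, i2 = wire_currents(15.0)
estimated = angle_from_currents(i1, i2)
```

Using the normalized contrast rather than a raw current difference makes the readout insensitive to overall light intensity, which is why comparing the two wires, rather than reading either alone, yields the angle.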
Millions of these nanowires can be placed on a one square millimeter chip. The research could support advances in lensless cameras, augmented reality and robotic vision.