I often wonder if there is something beyond our grasp and perhaps “other-worldly” about elemental Silicon (Si), the primary material for most IC and discrete-semiconductor work. Yes, there are all-Germanium (Ge) devices still in use, as well as SiC, GaN, SiGe, GaAs and other mixed-material substrates available (I mean no insult or diminishment to those other materials – they all have their role to play), but Silicon is “it” for most work. Even more amazing is that raw Silicon starts as “just” sand, although it requires a significant amount of advanced material-science expertise and processing to become an ingot of 6″, 8″, or 12″ (300 mm) diameter with near-perfect crystalline structure, of course.
In addition to being the basis for electronic circuits, it’s also used for MEMS devices sensing pressure, acceleration, orientation, and other physical variables. Biological sensors use silicon etched with microchannels to direct a single-file line of blood cells past targeted sensors, where an LED/photosensor arrangement counts the cells and scans them for various attributes; silicon coated with reagents can also sense specific gases and pollutants. There is also its use in advanced optoelectronics: although Silicon itself is not viable as an LED material, it can be used to form microchannels that act as optical waveguides, leading to highly integrated electro-optical devices.
That’s where silicon’s characteristics are now being applied to create phased-array beam steering for dynamic lensless imaging. The implementation of beam-steering principles is fairly common across the electromagnetic spectrum (both RF and optical) as well as the audio spectrum. In RF, it’s been used for dynamic beamforming in cellular systems and radar antennas for years (and in 5G MIMO, which is a form of beam steering); in optics, it is used for special telescope arrays for interferometry-based research. But optical phased arrays have not been used at the “micro” scale, as these arrays require numerous discrete emitters or sensors.
Silicon-based optical technology may be able to change that, even for mass-market consumer products, and even enable a super-thin, lensless camera. Researchers at the California Institute of Technology (Caltech) have created such a device, although it is still crude. The full details are in their brief Optical Society of America Technical Digest paper “An 8×8 Heterodyne Lens-less OPA Camera.” If I were planning to create a lensless array for image capture, I would try to place an array of photosensors on the die, and then tightly control when I sampled their output to create the phased-array effect. That seems logical, at first…except that it isn’t feasible.
What they did instead, in this 64-element design, is use a thin silicon-photonics SOI substrate topped with an array of optical-grating couplers to capture the incident light, Figure 1. Each grating guides the light into a waveguide which, in turn, is routed to a directional optical coupler and then compared to a reference light in a heterodyne scheme. The output of the coupler goes to a pair of balanced photodiodes, and the signal is mixed down to an electrical IF in the MHz range (hmmm…sounds like a standard RF-receiver architecture at this point).
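To see why that looks like a standard receiver, here is a toy numerical sketch of heterodyne detection (illustrative scaled-down frequencies of my own choosing, not the paper’s actual optical or IF values): a square-law photodetector responding to the sum of the signal and a frequency-offset reference produces a beat term at the difference frequency, the IF.

```python
import numpy as np

# Toy heterodyne-detection sketch (frequencies are illustrative, scaled
# far below real optical values). The balanced photodiodes respond to
# |E_sig + E_ref|^2; the cross term beats at the intermediate frequency.
fs = 409_600              # sample rate, Hz (chosen so tones land on FFT bins)
f_sig = 100_000.0         # stand-in for the "optical" signal tone
f_if = 3_000.0            # intermediate frequency
f_ref = f_sig - f_if      # reference (local oscillator), offset by the IF
t = np.arange(4096) / fs

e_sig = np.cos(2 * np.pi * f_sig * t)
e_ref = np.cos(2 * np.pi * f_ref * t)

# Square-law detection: photocurrent contains DC, high-frequency terms
# near 2f, and the desired beat at f_if.
i_pd = (e_sig + e_ref) ** 2
spectrum = np.abs(np.fft.rfft(i_pd - i_pd.mean()))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Below ~50 kHz, the only component left is the beat at the IF.
low = freqs < 50_000
peak = freqs[low][np.argmax(spectrum[low])]
print(int(peak))  # 3000
```

The same structure (signal, local oscillator, mixer, IF) is exactly what a superheterodyne RF receiver does, just with light instead of radio waves.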
(a) Functional schematic of the lensless camera based on an integrated, all-optical phased-array topology; (b) photograph of the complete IC device. (Image courtesy of Caltech paper “An 8×8 Heterodyne Lens-less OPA Camera.”)
Next, to quote from their paper, “The output current of all the photodiodes associated with the receiving elements are summed up by placing them in parallel electrically producing the output signal of the receiver. The reference light is coupled in through a grating coupler and split into 64 paths. Each path goes through a PIN diode phase shifter and feeds a directional coupler. A receive beam is formed by adjusting the phase shifts of each path so that the amplitude of the signals arriving from a certain direction add constructively, while rejecting the intensity of incident light from other directions.” This per-path phase shifting produces the constructive interference and thus the steerable array. (Their paper, of course, gives specifics on the laser wavelength used and the frequencies of the heterodyning process as well.)
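The receive-beamforming idea in that quote can be sketched numerically. This is my own toy model, not the Caltech implementation (the pitch-in-wavelengths value is an assumption): each element of an 8×8 grid sees an incident plane wave with a position-dependent phase, and programming each path’s phase shifter to the conjugate of that phase makes all 64 contributions add in phase for one chosen direction while largely cancelling for others.

```python
import numpy as np

# Toy 8x8 receive-beamforming model (not the Caltech implementation).
n = 8
d = 5.0   # element pitch in wavelengths (assumed value for illustration)
ix, iy = np.meshgrid(np.arange(n), np.arange(n))

def element_phases(theta_x, theta_y):
    """Phase (radians) of an incident plane wave at each element,
    for steering angles theta_x, theta_y in radians."""
    return 2 * np.pi * d * (ix * np.sin(theta_x) + iy * np.sin(theta_y))

# "Steer" the receive beam to 1 degree off-axis: program each path's
# phase shifter to cancel the incident phase from that direction.
steer = np.deg2rad(1.0)
shifters = -element_phases(steer, 0.0)

def array_response(theta_x, theta_y):
    """Normalized magnitude of the summed output for light arriving
    from (theta_x, theta_y); 1.0 means fully constructive."""
    total = np.sum(np.exp(1j * (element_phases(theta_x, theta_y) + shifters)))
    return abs(total) / (n * n)

print(round(array_response(steer, 0.0), 3))          # 1.0: all 64 paths add
print(array_response(np.deg2rad(3.0), 0.0) < 0.3)    # True: off-axis rejected
```

Changing the 64 shifter settings re-points the beam electronically, with no moving parts, which is precisely what makes the array steerable.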
“Wow” is all I can say. Yes, it’s just 8×8 resolution at this point, but recall that Intel’s first DRAM, the legendary 1103, introduced in 1970, was just 1024 bits (1 kbit) – and look where DRAM capacity is now. This lensless array has a beam width of 0.75° and its steerable range (field of view) is just 8°, but hey, it’s a start, Figure 2.
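Those two numbers actually tell you quite a bit about the array geometry. As a rough diffraction-limited back-of-envelope (my own estimate, not the paper’s analysis): beamwidth scales as roughly λ/aperture, and the unambiguous steering range, before grating lobes repeat the beam, scales as roughly λ/pitch.

```python
import math

# Back-of-envelope diffraction estimates (my own rough check, not the
# paper's analysis): beamwidth ~ lambda/aperture, and the unambiguous
# field of view ~ lambda/element_pitch.
beamwidth_rad = math.radians(0.75)
fov_rad = math.radians(8.0)

aperture_in_wavelengths = 1.0 / beamwidth_rad   # total aperture, in lambdas
pitch_in_wavelengths = 1.0 / fov_rad            # element pitch, in lambdas
elements_per_axis = aperture_in_wavelengths / pitch_in_wavelengths

print(round(aperture_in_wavelengths))  # ~76 wavelengths across
print(round(pitch_in_wavelengths))     # ~7 wavelengths between elements
print(round(elements_per_axis))        # ~11: same order as the 8-per-axis array
```

The element count implied by these crude formulas lands in the same ballpark as the actual 8-per-axis array, so the reported beamwidth and field of view hang together.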
The silicon optical phased-array chip with its 8×8 array measures 1.4 × 1.8 mm. (Image courtesy of Caltech paper “An 8×8 Heterodyne Lens-less OPA Camera.”)
Will this become a viable alternative or even competitor to lens-based optics? Or will it be limited to niches that need ultrathin, highly integrated imaging? Or will it, perhaps in a decade or two, become the “lens” of choice? Obviously, none of us knows that future. After all, it wasn’t that long ago that the pundits assured us that CMOS imagers would never be as good as charge-coupled devices (CCDs) for the foreseeable future – and look where we are now: CCDs are still used in highly specialized instrumentation, of course, but CMOS sensors dominate the vast majority of applications and do so with extremely high resolution and color fidelity.
What do you see for silicon-based optics in a few years, and even decades? Your assessment could be as correct as anyone else’s forecast!