Jul 12, 2017

Heterodyne Lensless OPA (Optical Phased Array) Camera

 

Traditional cameras — even those on the thinnest of cell phones — cannot be truly flat due to their optics: lenses that require a certain shape and size in order to function. At Caltech (Pasadena, CA), engineers have developed a new camera design that replaces the lenses with an ultra-thin OPA (Optical Phased Array). The OPA does computationally what lenses do using large pieces of glass: it manipulates incoming light to capture an image. 1) 2) 3)

Lenses have a curve that bends the path of incoming light and focuses it onto a piece of film or, in the case of digital cameras, an image sensor. The OPA has a large array of light receivers, each of which can individually add a tightly controlled time delay (or phase shift) to the light it receives, enabling the camera to selectively look in different directions and focus on different things.

"Here, like most other things in life, timing is everything. With our new system, you can selectively look in a desired direction and at a very small part of the picture in front of you at any given time, by controlling the timing with femtosecond (10⁻¹⁵ s) — quadrillionth of a second — precision," says Ali Hajimiri, Bren Professor of Electrical Engineering and Medical Engineering in the Division of Engineering and Applied Science at Caltech, and the principal investigator of a paper describing the new camera. The paper was presented at the Optical Society of America's (OSA) Conference on Lasers and Electro-Optics (CLEO) and published online by the OSA in the OSA Technical Digest in March 2017. 4)

"We've created a single thin layer of integrated silicon photonics that emulates the lens and sensor of a digital camera, reducing the thickness and cost of digital cameras. It can mimic a regular lens, but can switch from a fish-eye to a telephoto lens instantaneously — with just a simple adjustment in the way the array receives light," Hajimiri says.

Phased arrays, which are used in wireless communication and radar, are collections of individual transmitters, all sending out the same signal as waves. These waves interfere with each other constructively and destructively, amplifying the signal in one direction while canceling it out elsewhere. Thus, an array can create a tightly focused beam of signal, which can be steered in different directions by staggering the timing of transmissions made at various points across the array.

A similar principle is used in reverse in an optical phased array receiver, which is the basis for the new camera. Light waves that are received by each element across the array cancel each other from all directions, except for one. In that direction, the waves amplify each other to create a focused "gaze" that can be electronically controlled.
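The steering principle described above can be sketched numerically. The snippet below is an illustrative model, not the authors' code; the element count, element pitch, and angles are assumed values. It shows how one row of receivers sums to a strong signal only when the applied phase shifts match the arrival direction:

```python
# Sketch of phased-array receive beam steering (illustrative model;
# element pitch and angles are assumptions, not the chip's values).
import cmath
import math

WAVELENGTH = 1.55e-6          # 1550 nm laser, as in the article
D = 2.0 * WAVELENGTH          # assumed element pitch
N = 8                         # 8 elements per row of the 8 x 8 array
K = 2.0 * math.pi / WAVELENGTH

def array_factor(theta, steer_theta):
    """Magnitude of the summed element signals for a plane wave arriving
    from `theta` when the phase shifters are set for `steer_theta`."""
    total = 0j
    for n in range(N):
        geometric = K * n * D * math.sin(theta)        # path-length phase
        applied   = K * n * D * math.sin(steer_theta)  # phase-shifter setting
        total += cmath.exp(1j * (geometric - applied))
    return abs(total)

# In the steered direction all 8 contributions add in phase
# (magnitude 8, since all phasors align exactly) ...
peak = array_factor(math.radians(5.0), math.radians(5.0))
# ... while off-axis light largely cancels.
off = array_factor(math.radians(0.0), math.radians(5.0))
```

Steering is then just a matter of rewriting the `applied` phases electronically, with no moving parts.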

"What the camera does is similar to looking through a thin straw and scanning it across the field of view. We can form an image at an incredibly fast speed by manipulating the light instead of moving a mechanical object," says graduate student Reza Fatemi (MS '16), lead author of the OSA paper.

In 2016, Hajimiri's team rolled out a one-dimensional version of the camera that was capable of detecting images in a line, such that it acted like a lensless barcode reader but with no mechanically moving parts. This year's advance was to build the first two-dimensional array capable of creating a full image. This first 2D lensless camera has an array composed of just 64 light receivers in an 8 by 8 grid. The resulting image has low resolution. But this system represents a proof of concept for a fundamental rethinking of camera technology, Hajimiri and his colleagues say.

"The applications are endless," says graduate student Behrooz Abiri (MS '12), coauthor of the OSA paper. "Even in today's smartphones, the camera is the component that limits how thin your phone can get. Once scaled up, this technology can make lenses and thick cameras obsolete. It may even have implications for astronomy by enabling ultra-light, ultra-thin enormous flat telescopes on the ground or in space."

"The ability to control all the optical properties of a camera electronically using a paper-thin layer of low-cost silicon photonics without any mechanical movement, lenses, or mirrors, opens a new world of imagers that could look like wallpaper, blinds, or even wearable fabric," says Hajimiri. Next, the team will work on scaling up the camera by designing chips that enable much larger receivers with higher resolution and sensitivity.

 

Description

The integration of photonic components on silicon chips has enabled new applications and novel system designs. The optical phased array transmitter is one example of such an integrated system that has attracted interest in recent years (Ref. 4). Proper adjustment of the phase of the light fed to each array element forms a beam of light: the output waves of the elements add constructively in a certain direction and destructively in all others. The beam direction is electronically steerable by controlling the phases. The reciprocity of electromagnetic waves makes it possible to use the same principle in reverse, receiving the light coming from a certain direction while rejecting all others. This enables a lensless, ultra-thin, synthetic-aperture OPA camera: the beam is swept across all directions, and the measured intensities of the light arriving from each direction are assembled into an image.
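The sweep-and-assemble idea can be sketched with a toy scene containing a single point source. The 8 x 8 direction grid and the per-pixel phase step below are illustrative choices, not the chip's parameters:

```python
# Sketch of image formation by beam sweeping: steer the receive beam
# over an 8 x 8 grid of directions and record the power in each one.
# The scene is a single simulated point source; all values are
# illustrative assumptions.
import cmath
import math

N = 8
PITCH_PHASE = math.pi / 4          # assumed phase step per element per pixel
SOURCE = (2, 5)                    # grid position of the simulated source

def received_power(steer_x, steer_y):
    """Power summed over all 64 elements for one steering direction."""
    total = 0j
    for ix in range(N):
        for iy in range(N):
            incoming = PITCH_PHASE * (ix * SOURCE[0] + iy * SOURCE[1])
            applied  = PITCH_PHASE * (ix * steer_x + iy * steer_y)
            total += cmath.exp(1j * (incoming - applied))
    return abs(total) ** 2

image = [[received_power(x, y) for x in range(N)] for y in range(N)]
# The brightest pixel coincides with the source direction.
```

With these (deliberately orthogonal) phase steps, all power lands in the source pixel; a real scene produces one intensity sample per steered direction.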

Design and implementation: The schematic diagram of the phased array receiver camera is shown in Figure 1(a) and the fabricated chip in Figure 1(b). An 8 x 8 array of grating couplers captures the light incident on the chip surface, and each grating coupler guides the light into a waveguide. In this heterodyne scheme, each waveguide is routed to a directional coupler, where the received light is combined with the reference light. The output of the directional coupler is fed to a pair of balanced photodiodes, where the signal is mixed down to an electrical intermediate frequency (IF) in the MHz range. The output currents of all the photodiodes associated with the receiving elements are summed by placing them electrically in parallel, producing the output signal of the receiver. The reference light is coupled in through a grating coupler and split into 64 paths. Each path goes through a PIN diode phase shifter and feeds a directional coupler. A receive beam is formed by adjusting the phase shift of each path so that the signals arriving from a certain direction add constructively, while light incident from other directions is rejected. This is tantamount to looking in a certain direction.
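The role of the balanced photodiode pair can be illustrated with a small numerical check (the field values here are arbitrary): an ideal 2 x 2 directional coupler places the signal/reference cross term with opposite signs on its two outputs, so subtracting the two photocurrents cancels the direct-detection terms and leaves only the heterodyne mixing product.

```python
# Numerical check of balanced detection after an ideal 2x2 coupler.
# Field amplitudes are arbitrary illustrative numbers.
import cmath
import math

e_sig = 0.3 * cmath.exp(1j * 0.7)   # received element field (assumed)
e_ref = 1.0 * cmath.exp(1j * 0.1)   # reference field (assumed)

# Ideal 2x2 coupler outputs: (s + j*r)/sqrt(2) and (j*s + r)/sqrt(2)
out_a = (e_sig + 1j * e_ref) / math.sqrt(2)
out_b = (1j * e_sig + e_ref) / math.sqrt(2)

i_a = abs(out_a) ** 2                # photodiode currents ~ optical power
i_b = abs(out_b) ** 2
balanced = i_a - i_b                 # balanced-pair output

# The difference equals twice the imaginary part of e_sig * conj(e_ref):
# the direct-detection terms |e_sig|^2 and |e_ref|^2 cancel exactly.
cross = 2.0 * (e_sig * e_ref.conjugate()).imag
```

When the signal field carries a small frequency offset relative to the reference, this cross term oscillates at the IF, which is what the chip's summed photocurrent measures.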

Figure 1: (a) Schematic of the design (b) Photograph of the chip (image credit: Caltech)

Experimental setup and measurement results: The chip is mounted on a PCB (Printed Circuit Board) that includes the electronic circuitry for controlling the phase shifters on the chip and for processing the chip's output signal. A microcontroller controls the electronics and communicates with a computer. A 1550 nm laser is used as the source; the corresponding optical frequency is f0 = 194 THz. The output light of the laser is split into two paths, reference and illumination, and each path is fed to a single-sideband (SSB) modulator. The two SSB modulators shift the optical frequency in the reference and illumination paths by f1 = 1.15 MHz and f2 = 1.75 MHz, respectively. Due to the heterodyne mixing process, the output current of the photodiodes on the chip has a carrier frequency of fIF = f1 + f2 = 2.9 MHz. One advantage of this shifted-frequency scheme is that random delay fluctuations in the reference and illumination paths (e.g., thermal) manifest themselves as phase noise at the output; they do not affect the measured amplitude and therefore do not degrade the SNR of the detected signal amplitude.
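The quoted 2.9 MHz IF implies that the two SSB shifts are applied with opposite signs (one path shifted up, the other down); under that assumption, the arithmetic behind fIF is simply:

```python
# Arithmetic behind the stated IF, assuming (as the 2.9 MHz value
# implies) that the two SSB modulators shift in opposite directions.
f0 = 194e12          # 1550 nm carrier, ~194 THz
f1 = 1.15e6          # reference SSB shift (downward, by assumption)
f2 = 1.75e6          # illumination SSB shift (upward)

f_ref = f0 - f1
f_ill = f0 + f2
f_if = f_ill - f_ref          # beat frequency seen by the photodiodes
# f_if == f1 + f2 == 2.9 MHz
```

Note that a delay fluctuation common to one path shifts only the phase of this beat, not its amplitude, which is the SNR advantage described above.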

Moreover, the 1/f noise of the electronics needed to amplify and process the output current falls out of band when a sufficiently high intermediate frequency (IF) is used. A polarization-maintaining fiber, which carries the reference light, is fixed to the chip with transparent adhesive; this allows the receiver to be moved and rotated for characterization and calibration. In practice, the optical path lengths of the waveguides vary because of fabrication mismatches. An optimization algorithm with a random jump search is implemented to calibrate out these mismatches before the imaging phase. The result of the optimization is a lookup table of phase-shifter settings that form receiving beams for different azimuth (φ) and elevation (θ) angles. Figure 2 shows the measured receiving pattern after calibration for the 0º direction. The beam width is 0.75º and the grating lobes are 8º apart, enabling an 8-pixel by 8-pixel image.
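The calibration step can be sketched as follows. This is a generic random-jump search (the paper does not spell out the algorithm's details), with made-up phase errors and iteration counts, that maximizes the summed power from a single calibration direction:

```python
# Sketch (not the authors' exact algorithm) of a random-jump search that
# finds phase-shifter settings compensating unknown per-waveguide
# path-length errors, by maximizing the summed power from a calibration
# source.  Error values, seed, and iteration count are made up.
import cmath
import math
import random

random.seed(0)
N = 64
errors = [random.uniform(0, 2 * math.pi) for _ in range(N)]  # unknown mismatches

def power(settings):
    """Summed-field power when `settings` are applied on top of the
    unknown fabrication phase errors."""
    total = sum(cmath.exp(1j * (errors[i] - settings[i])) for i in range(N))
    return abs(total) ** 2

settings = [0.0] * N
best = power(settings)
for _ in range(20000):
    i = random.randrange(N)
    trial = settings.copy()
    trial[i] = random.uniform(0, 2 * math.pi)     # random jump on one element
    p = power(trial)
    if p > best:                                  # keep only improvements
        settings, best = trial, p
# `best` approaches the fully coherent value N**2 = 4096.
```

Repeating this for each steering direction yields the lookup table of phase settings used during imaging.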

Figure 2: (a) Normalized receiving pattern cross section of yz-plane, (b) Normalized receiving pattern cross section of xy-plane, (c) Normalized 3D pattern (image credit: Caltech)

To illustrate the imaging function of the receiver, copper tape with a hole in it is pasted on a piece of plexiglass to block the light everywhere except at the hole. The surface of the plexiglass is sanded to enhance scattering, Figure 3 (a) and (b). The image of the object is captured by loading the phase settings for the different angles and measuring the optical intensity at each pixel. Figure 3(c) shows the gray-scale rendering of the measurement result. The bright white spot clearly shows the position of the hole.
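The capture loop described above might look like the following sketch. `LOOKUP`, `apply_phases`, and `read_intensity` are hypothetical stand-ins for the calibration table and the PCB/microcontroller interface; here they are stubbed so the loop runs end to end:

```python
# Sketch of the capture loop: step through the calibration lookup table,
# apply each (azimuth, elevation) phase setting, and record one
# intensity per pixel.  All names below are hypothetical stand-ins.
N = 8
LOOKUP = {(az, el): [0.0] * 64 for az in range(N) for el in range(N)}

def apply_phases(settings):        # stand-in for driving the PIN phase shifters
    apply_phases.current = settings

def read_intensity():              # stand-in for the IF amplitude readout
    return 1.0

def capture():
    image = [[0.0] * N for _ in range(N)]
    for (az, el), settings in LOOKUP.items():
        apply_phases(settings)     # steer the receive beam to one pixel
        image[el][az] = read_intensity()
    # Normalize to 8-bit gray scale for display.
    peak = max(max(row) for row in image) or 1.0
    return [[round(255 * v / peak) for v in row] for row in image]

gray = capture()
```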

Figure 3: (a) Sanded plexiglass, (b) Copper tape covered imaging target, (c) Captured image in gray scale (image credit: Caltech)

In summary, a 2D lensless ultra-thin camera based on an optical phased array receiver has been demonstrated. The camera has a beam width of 0.75º and a field of view of 8º, and the beam is electronically steerable for imaging.

Figure 4: The OPA chip placed on a penny for scale. This chip could be the lensless camera of the future (image credit: Caltech / Hajimiri Lab)


References

1) "Ultra-thin camera creates images without lenses," Science Daily, June 22, 2017, URL:
https://www.sciencedaily.com/releases/2017/06/170622104049.htm

2) Billy Hurley, "Ultra-Thin Camera Says Good-Bye to the Lens," NASA Tech Briefs, 28 June 2017, URL:
http://www.techbriefs.com/component/content/article/1198-ntb/news/news/27133-ultra-thin-camera-says-good-bye-to-the-lens?eid=376620570&bid=1807691

3) "Ultra-Thin Camera Creates Images Without Lenses," Caltech, June 21, 2017, URL: http://www.caltech.edu/news/ultra-thin-camera-creates-images-without-lenses-78731

4) Reza Fatemi, Behrooz Abiri, Ali Hajimiri, "An 8x8 Heterodyne Lens-less OPA Camera," Conference on Lasers and Electro-Optics OSA Technical Digest (online) (Optical Society of America, 2017), DOI: 10.1364/CLEO_AT.2017.JW2A.9, URL:
http://chic.caltech.edu/wp-content/uploads/2017/03/Cleo_2017_2D_OPA_V7.pdf
 


The information compiled and edited in this article was provided by Herbert J. Kramer from his documentation of: "Observation of the Earth and Its Environment: Survey of Missions and Sensors" (Springer Verlag) as well as many other sources after the publication of the 4th edition in 2002. - Comments and corrections to this article are always welcome for further updates (eoportal@symbios.space).