Mems optical phased array forms beams suitable for automotive lidar

Author: EIS | Release Date: Aug 21, 2019


Micromachining can be used to create an infrared beam-forming phased array, according to engineers at the University of California, Berkeley, who have made a 160 x 160 array on a 3.1 x 3.2mm die that can be switched in 5.7μs (mechanical resonance is at 55kHz).

“It is capable of providing about 25,600 rapidly steerable spots within its field-of-view. The grating phase shifters are optimised for the near-infrared telecom wavelength bands from 1,200 to 1,700nm with 85% optical efficiency,” according to ‘2D broadband beamsteering with large-scale MEMS optical phased array’, the paper in Optica that describes the work.

Phase-shifting is not achieved with ‘piston’-style micromirrors, which would move back and forth to vary path lengths and so produce the required phase shifts.

Instead, each pixel is a diffraction grating (955nm period) sitting on its own mems comb-drive actuator, which slides the grating laterally by up to the 955nm (1μm maximum travel) needed to cover the full 2π phase-shift range.
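The displacement-to-phase mapping this implies can be sketched numerically. The linear relationship below (phase = 2π x displacement / period, wrapping every full period) is the standard behaviour of a laterally-shifted periodic grating, not a formula taken from the paper:

```python
import math

GRATING_PERIOD_NM = 955  # grating period quoted in the article

def phase_shift_rad(displacement_nm):
    """Phase imparted by sliding the grating laterally.

    Sliding a periodic grating sideways by one full period (955nm here)
    cycles the diffracted phase through 2*pi, so the mapping is linear
    in the displacement and wraps modulo one period.
    """
    return 2 * math.pi * (displacement_nm % GRATING_PERIOD_NM) / GRATING_PERIOD_NM

print(round(phase_shift_rad(477.5), 4))  # half a period -> pi (3.1416)
print(phase_shift_rad(955.0))            # a full period -> back to 0.0
```

This is why roughly 1μm of actuator travel is enough: one grating period of motion already spans the whole 2π range.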

Light hits the grating at a glancing angle (65° from the surface normal) and, from the beam’s point of view, the diffraction lines lie across its path while the grating slides towards and away from the beam.

Beam enters from top right, gratings shift top right to bottom left

In detail, each grating is designed to minimise specular reflection (which would exit at 65° on the other side of the normal) and to maximise first-order diffraction, which leaves almost at right angles to the array surface.

This form of phase shifting means that every element need only move along one axis, towards and away from the beam’s glancing approach, to form a wavefront of 160 x 160 individually phase-shifted exiting beam elements. There is no need for two-dimensional movement, nor for long-throw piston-style movement away from the substrate.
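With every element independently phase-shifted, a linear phase ramp across the array steers the output beam. A rough sketch of that relationship follows, using the standard phased-array steering equation; the 1.55μm wavelength and the element pitch (estimated as die width divided by element count) are assumptions, not figures from the paper:

```python
import math

# Assumptions (not stated in the article): a 1.55um operating wavelength
# and an element pitch estimated as die width / element count.
WAVELENGTH_UM = 1.55
PITCH_UM = 3.1e3 / 160  # ~19.4um

def steer_angle_deg(phase_step_rad):
    """Steering angle for a linear phase ramp across the array.

    Standard phased-array relation: sin(theta) = lambda * dphi / (2*pi*d),
    where dphi is the phase step between adjacent elements of pitch d.
    """
    return math.degrees(
        math.asin(WAVELENGTH_UM * phase_step_rad / (2 * math.pi * PITCH_UM))
    )

# The largest unambiguous ramp is +/- pi per element:
print(round(steer_angle_deg(math.pi), 2))  # ~2.29 degrees each way
```

Under these assumed numbers the full steering range comes out near ±2.3°, i.e. roughly 4.6° end to end, which is broadly consistent with the field-of-view figures the team reports.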

Beam divergence (with the array forming a single beam) has been measured at 0.042° x 0.031°, and the field of view is 6.6° x 4.4°.

The team is looking at automotive lidar as a possible application, to replace macro-scale spinning mechanical beam steering.

Liquid-crystal phase-shifting arrays have not been fast enough for this (Berkeley claims they are 1,000x slower than its device), according to team member Youmin Wang (right in photo). “The biggest challenge for the current optical phased array used in lidar is its relatively slow beam-steering speed for point-by-point scanning and its limited overall aperture, which dominates the system’s optical resolution,” he said.

An earlier proof-of-concept 320 x 320 fixed array (with no actuators) was designed with the phase-shifts necessary to form a hologram of the Berkeley logo, which it did, closely matching theoretical predictions. Also, the researchers predict that CMOS could be put on the same chip as the moving array.

“Being able to program these chips allows us to go beyond scanning; we can program our arrays to be more like human eyes,” said Professor Ming Wu (left in photo). “This allows us to generate and perceive arbitrary patterns like our eyes do; we can track individual objects instead of just rotating scanning.”