THz imaging with fewer sensors?

MIT has developed a technique that could reduce the number of sensors required for THz or mm-wave imaging by a factor of 10, or even 100. The work could have implications for high-resolution radar and sonar.

In a phased array, an incoming electromagnetic or acoustic wave strikes every sensor in the array, and the system determines the wave’s origin and intensity by comparing the phases at which it arrives at each sensor.

According to MIT, as long as the distance between sensors is no more than half the wavelength of the incoming wave, that calculation is fairly straightforward – a matter of inverting the sensor measurements. But if the sensors are farther than half a wavelength apart, spatial aliasing occurs – inversion yields multiple solutions spaced at regular angles around the sensor array.
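The aliasing effect can be sketched numerically. The minimal delay-and-sum model below is an illustration, not MIT’s actual reconstruction method: a uniform linear array with half-wavelength spacing resolves a single direction of arrival, while the same array with full-wavelength spacing responds identically at several angles.

```python
import numpy as np

def array_response(d_over_lambda, n_sensors, theta_true, thetas):
    """Delay-and-sum response of a uniform linear array to a plane wave
    arriving from angle theta_true, evaluated at candidate angles thetas.
    Sensor spacing is given in wavelengths (d_over_lambda)."""
    positions = np.arange(n_sensors) * d_over_lambda  # in wavelengths
    # Phase of the incoming wave at each sensor.
    signal = np.exp(2j * np.pi * positions * np.sin(theta_true))
    # Steer the array toward each candidate angle and sum coherently.
    steering = np.exp(-2j * np.pi * positions[:, None] * np.sin(thetas)[None, :])
    return np.abs(steering.conj().T @ signal) / n_sensors

thetas = np.linspace(-np.pi / 2, np.pi / 2, 181)
# Half-wavelength spacing: a single peak at the true direction (0 degrees).
r_half = array_response(0.5, 8, 0.0, thetas)
# Full-wavelength spacing: grating lobes also appear at +/-90 degrees.
r_full = array_response(1.0, 8, 0.0, thetas)
```

With `d_over_lambda = 0.5` the response peaks only at the true angle; with `d_over_lambda = 1.0` it also peaks at ±90°, so inversion cannot distinguish those directions.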

In most applications of lower-frequency imaging, however, any given circumference around the detector is usually sparsely populated – which is the phenomenon the new system exploits.

“Think about a range around you, like five feet [1.5m],” said MIT professor Gregory Wornell. “There’s actually not that much at five feet around you. Or at 10 feet. Different parts of the scene are occupied at those different ranges, but at any given range, it’s pretty sparse. Roughly speaking, the theory goes like this: If, say, 10% of the scene at a given range is occupied with objects, then you need only 10% of the full array to still be able to achieve full resolution.”

The trick is to determine which 10% of the array to keep.

Retaining every tenth sensor won’t work, as it is the regularity of the distances between sensors that leads to aliasing, said MIT. Arbitrarily varying the distances between sensors would solve that problem, but it would also make calculating the wave’s source and intensity prohibitively complicated.

For a one-dimensional (linear) detector, Wornell instead prescribes a detector along which the sensors are distributed in pairs. The regular spacing between pairs of sensors ensures that the scene reconstruction can be calculated efficiently, but the distance from each sensor to the next remains irregular.
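One way to picture such a layout is sketched below. This is an illustration of the description above, not the paper’s exact construction; `paired_array`, its units, and the offset values are all assumptions.

```python
import numpy as np

def paired_array(period, offsets):
    """Place one pair of sensors per period: pair k anchors at k*period,
    and its partner sits at an irregular offset within that period.
    Units and values here are purely illustrative."""
    positions = []
    for k, off in enumerate(offsets):
        positions += [k * period, k * period + off]
    return np.array(positions)

# Four pairs on a regular period of 10, with irregular intra-pair offsets.
layout = paired_array(10, [1, 3, 2, 4])  # -> [0, 1, 10, 13, 20, 22, 30, 34]
# The pair anchors repeat every 10 units, so reconstruction stays efficient,
# yet the gaps between consecutive sensors are all different.
gaps = np.diff(layout)
```

The anchors give the regular structure needed for efficient computation, while the varying intra-pair offsets break the regularity that causes aliasing.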

To go with the concept, the researchers have developed an algorithm that determines the optimal sensor distribution – in essence, the algorithm maximises the number of different distances between arbitrary pairs of sensors.
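The stated objective can be illustrated with a simple greedy heuristic. This is a sketch of the goal (maximising distinct pairwise distances), not MIT’s published algorithm; `greedy_sparse_array` and the candidate grid are hypothetical.

```python
from itertools import combinations

def distinct_differences(positions):
    """Number of distinct pairwise distances in a sensor layout."""
    return len({abs(a - b) for a, b in combinations(positions, 2)})

def greedy_sparse_array(grid, n_keep):
    """Greedy sketch: starting from the two end sensors, repeatedly keep
    the grid position that maximises the number of distinct pairwise
    distances in the retained set."""
    chosen = [grid[0], grid[-1]]
    candidates = set(grid[1:-1])
    while len(chosen) < n_keep:
        best = max(candidates, key=lambda p: distinct_differences(chosen + [p]))
        chosen.append(best)
        candidates.remove(best)
    return sorted(chosen)

# Keep 6 of 20 candidate positions.
sparse = greedy_sparse_array(list(range(20)), 6)
```

A uniformly thinned layout such as `[0, 3, 6, 9, 12, 15]` yields only 5 distinct distances; the greedy selection retains many more, which is what suppresses the aliasing ambiguity.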

Radar-frequency experiments in a car park have verified the predictions of the theory.

With a two-dimensional array, savings could reach 100x, as the 10x reduction can be obtained in each dimension.

A paper, ‘Multi-coset sparse imaging arrays’, appears in the latest issue of IEEE Transactions on Antennas and Propagation.
