
Project

High-frequency LIDAR using beat notes

Time of Flight 3D cameras like the Microsoft Kinect are prevalent in computer vision and computer graphics. In such devices, the power of an integrated laser is amplitude modulated at megahertz (MHz) frequencies and demodulated using a specialized imaging sensor to obtain sub-cm range precision. To use a similar architecture while obtaining micron-scale range precision, this paper incorporates beat notes. To bring telecommunications ideas to correlation ToF imaging, we study a form of "cascaded Time of Flight" that uses a Hertz-scale intermediate frequency to encode high-frequency pathlength information. We show synthetically and experimentally that a bulk implementation of opto-electronic mixers offers: (a) robustness to environmental vibrations; (b) programmability; and (c) stability in frequency tones. A fiber-optic prototype is constructed, which demonstrates three-micron range precision over a range of two meters. A key contribution of this paper is to study and evaluate the proposed architecture for use in machine vision.
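
To make the idea concrete, below is a minimal numerical sketch (in Python/NumPy) of the heterodyne principle behind cascaded Time of Flight. The specific numbers - a 1 GHz modulation tone, a 10 Hz beat frequency, a 1 kHz detector sampling rate, and the test depth - are illustrative assumptions for this sketch, not the parameters of the actual prototype:

# Minimal sketch of cascaded time-of-flight heterodyne detection.
# Assumptions (illustrative, not from the paper): a 1 GHz amplitude-modulation
# tone on the illumination, a reference tone offset by 10 Hz, and an ideal
# opto-electronic mixer, so the slow detector only sees the 10 Hz beat note
# whose phase carries the GHz-scale time-of-flight information.
import numpy as np

c = 3.0e8              # speed of light, m/s
f_mod = 1.0e9          # GHz-scale modulation frequency (assumed)
delta_f = 10.0         # Hz-scale intermediate (beat) frequency (assumed)
true_depth = 0.0731    # one-way target distance in meters (made-up test value)

# The round-trip delay imprints a phase on the GHz modulation envelope.
tau = 2.0 * true_depth / c
phi = 2.0 * np.pi * f_mod * tau          # phase we want to recover

# The mixer output is a Hz-scale beat note carrying that phase, so a slow
# detector (sampled here at 1 kHz for 2 s) is enough to digitize it.
fs, T = 1000.0, 2.0
t = np.arange(0.0, T, 1.0 / fs)
rng = np.random.default_rng(0)
beat = np.cos(2.0 * np.pi * delta_f * t - phi) + 0.01 * rng.standard_normal(t.size)

# IQ demodulation at the known beat frequency recovers the phase.
i = np.mean(beat * np.cos(2.0 * np.pi * delta_f * t))
q = np.mean(beat * np.sin(2.0 * np.pi * delta_f * t))
phi_est = np.arctan2(q, i)

# Convert phase back to depth (valid within one unambiguous range c / (2 f_mod)).
depth_est = (phi_est % (2.0 * np.pi)) * c / (4.0 * np.pi * f_mod)
print(f"true depth      : {true_depth * 1e3:.3f} mm")
print(f"estimated depth : {depth_est * 1e3:.3f} mm")

The point of the sketch is that the low-bandwidth detector only ever digitizes the 10 Hz beat note, yet the recovered phase still encodes the full GHz-scale pathlength information.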

Frequently Asked Questions (FAQ)

What is this project about?
This is a new form of LIDAR technology that enables slow cameras (like video cameras) to image high-frequency information (GHz-bandwidth signals). Beat notes are lower in frequency, so they can be detected using low-bandwidth electronics.

What exactly are beat notes?
It is perhaps more natural to think about beat notes in the context of sound. Imagine you are at a concert. If two singers are slightly out of tune - one producing a pitch at 440 Hz and the other at 420 Hz - the beat note is the 20 Hz difference frequency. The same is true with modulated light beams, where we can interfere two GHz-modulated light beams to get a beat note at a Hz-scale frequency.
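
The same arithmetic can be checked numerically. The short Python snippet below (using illustrative audio-rate sampling parameters) sums a 440 Hz tone and a 420 Hz tone and shows that the envelope of the mixture pulses at the 20 Hz difference frequency:

# Numerical illustration of the 440 Hz / 420 Hz example above: summing the two
# tones produces an envelope that pulses at the 20 Hz difference frequency,
# which is easy to detect even with a slow detector.
import numpy as np
from scipy.signal import hilbert

fs, T = 8000.0, 2.0                       # audio-rate sampling (assumed values)
t = np.arange(0.0, T, 1.0 / fs)
mix = np.sin(2 * np.pi * 440.0 * t) + np.sin(2 * np.pi * 420.0 * t)

envelope = np.abs(hilbert(mix))           # slowly varying amplitude of the mixture
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, d=1.0 / fs)
print(f"dominant envelope frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")  # ~20 Hz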

How does our technique compare with existing 3D scanning systems?
Our lab and others have been studying 3D scanning for some time, using, for example, the polarization of light. In contrast to existing systems, this new IEEE Access paper uses time-of-flight information alone, with low-framerate sensors, to obtain micron-scale pathlength resolution in a vibration-free setting. By using only time-of-flight information, we are less dependent on material variation than polarimetric or shading-based approaches.
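
As a rough back-of-the-envelope illustration of why higher modulation frequencies help, range precision for a fixed fractional phase precision scales inversely with the modulation frequency. The phase-precision figure below is an assumed number purely for illustration, not a measurement from the paper:

# For a fixed fraction of a modulation cycle that we can resolve, range
# precision improves in proportion to the modulation frequency. The 1/1000
# phase precision is an assumed illustrative value.
c = 3.0e8                       # speed of light, m/s
phase_precision_cycles = 1e-3   # assumed fraction of a modulation cycle resolvable

for f_mod in (30e6, 100e6, 10e9):          # MHz-scale ToF cameras vs a GHz-scale tone
    unambiguous_range = c / (2 * f_mod)    # one full cycle of round-trip phase
    range_precision = phase_precision_cycles * unambiguous_range
    print(f"{f_mod / 1e6:8.0f} MHz -> range precision ~ {range_precision * 1e6:9.2f} um "
          f"(unambiguous range {unambiguous_range:6.3f} m)")

Under this assumption, MHz-scale modulation lands at millimeter-to-centimeter precision, while GHz-scale tones push toward microns, at the cost of a much shorter unambiguous range (hence the phase-unwrapping work mentioned below).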

What will it take to deploy this onto self-driving cars?
Right now the unit is a bit cumbersome and scans only a single pixel. We will need to address phase unwrapping and wide-field imaging to fully port this to robotic systems.

What are some other applications of this work?
We have demonstrated pathlength precision of around 2-3 micrometers, about one tenth the width of a human hair. Such high pathlength accuracy could potentially enable inversion of scattering, allowing doctors to see deeper through tissue using visible light. Robots could navigate through an orchard and not just map out its topology, but perhaps also assess which fruits are ripe.

Does the technique work in real-time?
At a single pixel, the results in the paper are collected in real time (faster than 30 Hz). A potential wide-field implementation would therefore be real-time. 

Research Topics
#imaging