Frequently Asked Questions (FAQ)
What is this project about?
This is a new form of LIDAR technology that enables slow cameras (like video cameras) to image high-frequency information (GHz-bandwidth signals). Beat notes are lower in frequency, so they can be detected using low-bandwidth electronics.
What exactly are beat notes?
It is perhaps more natural to think about beat notes in the context of sound. Imagine you are at a concert. If two singers are slightly out of tune - one producing a pitch at 440 Hz and the other at 420 Hz - the beat note is the 20 Hz difference frequency. The same is true with modulated light beams: interfering two GHz-modulated beams produces a beat note at a much lower (Hz-scale) difference frequency.
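For readers who like to see the numbers, here is a minimal numerical sketch of the same idea (illustrative only, not code from the paper); the sample rate and tone frequencies are arbitrary choices:

```python
# Interfering two tones at 440 Hz and 420 Hz produces an envelope that
# oscillates at the 20 Hz difference frequency - slow enough for a
# low-bandwidth detector to follow.
import numpy as np

fs = 44_100                      # sample rate in Hz (arbitrary choice)
t = np.arange(0, 0.5, 1 / fs)    # half a second of samples
f1, f2 = 440.0, 420.0            # the two "singers" from the example above

signal = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# The sum equals 2*cos(2*pi*(f1-f2)/2*t) * cos(2*pi*(f1+f2)/2*t):
# a fast carrier near 430 Hz whose amplitude beats at |f1 - f2| = 20 Hz.
envelope = np.abs(signal)
print("expected beat frequency:", abs(f1 - f2), "Hz")
```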
How does our technique compare with existing 3D scanning systems?
Our lab and others have been studying 3D scanning for some time, using, for example, the polarization of light. In contrast to existing systems, this new IEEE Access paper uses time-of-flight information alone, captured with low-framerate sensors, to obtain micron-scale pathlength resolution in a vibration-free setting. By relying only on time-of-flight information, we are less dependent on material variation than polarimetric or shading-based approaches.
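As a rough illustration of why this is demanding, here is a back-of-the-envelope sketch (the modulation frequency below is an assumed example, not a number from the paper): in phase-based time of flight, modulation phase maps to pathlength, so micron-scale resolution means resolving a tiny fraction of one modulation cycle.

```python
# Phase-based time of flight: pathlength L relates to modulation phase phi via
# L = (phi / (2*pi)) * (c / f_mod). A micron of path is therefore a very small
# phase shift at GHz modulation frequencies.
import math

c = 299_792_458.0          # speed of light (m/s)
f_mod = 10e9               # hypothetical GHz-scale modulation frequency (Hz)

wavelength_mod = c / f_mod                     # one modulation cycle ~ 3 cm of path
delta_L = 2e-6                                 # 2 micrometers of pathlength
delta_phase = 2 * math.pi * delta_L / wavelength_mod
print(f"phase shift for {delta_L * 1e6:.0f} um of path: {delta_phase:.2e} rad")
```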
What will it take to deploy this onto self-driving cars?
Right now the unit is a bit cumbersome and scans only a single pixel at a time. We will need to address phase unwrapping and wide-field imaging to fully port this to robotic systems.
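To see why phase unwrapping matters, here is a minimal illustration (a generic wrapped-phase example, not the paper's algorithm): a phase reading is only known modulo 2π, so pathlengths separated by a full modulation cycle look identical until the phase is unwrapped.

```python
# A phase measurement is wrapped into (-pi, pi]; unwrapping recovers the
# continuous phase, and hence the unambiguous pathlength, from many samples.
import numpy as np

true_phase = np.linspace(0, 6 * np.pi, 200)    # phase growing over three cycles
wrapped = np.angle(np.exp(1j * true_phase))    # what a single reading sees
unwrapped = np.unwrap(wrapped)                 # recover the continuous phase

print("max error after unwrapping:", np.max(np.abs(unwrapped - true_phase)))
```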
What are some other applications of this work?
We have demonstrated pathlength control down to around 2-3 micrometers. This is about one tenth the width of a human hair. Such high pathlength accuracy could potentially enable inversion of scattering, allowing doctors to see deeper through tissue using visible light. A robot could potentially navigate through an orchard and not just map out the topology, but perhaps also assess which fruits are ripe.
Does the technique work in real-time?
At a single pixel, the results in the paper are collected in real time (faster than 30 Hz). A wide-field implementation would therefore also be expected to operate in real time.