Non-Line-of-Sight Imaging using Phasor Field Virtual Waves

Many existing imaging modalities are based on the common underlying theory of wave propagation. Our recent work shows that it is possible to apply existing methods from diffraction and wave imaging to Non-Line-of-Sight imaging problems.

Imaging can be performed using a variety of underlying physical phenomena including light, other electromagnetic radiation, sound, electrons, and gravitational waves. While the underlying physical phenomena in all these cases are very different, the principle describing how the image is created is essentially the same. This principle is the theory of wave propagation and diffraction. A shared underlying theory allows for a direct and effortless transfer of knowledge from one field of imaging to another. It allows researchers who have spent decades becoming experts in one branch of imaging to quickly become experts in another. Many of the recent advances in optics, such as imaging in turbid media and the study of metamaterials, were first implemented using microwaves and sound before being transferred to light waves (often by the same researchers). Similarly, the new field of gravitational wave imaging benefits immensely from access to centuries' worth of knowledge about how to extract information from a wave.

The field of Non-Line-of-Sight (NLOS) imaging, on the other hand, has thus far relied on an entirely separate set of models to create images. An NLOS imaging system illuminates points on a relay surface with short light pulses. From there the light travels into the scene, reflects off scene surfaces, and returns to the relay surface. Points on the relay surface are imaged by one or multiple detector pixels fast enough to resolve the time of flight of the light through the scene. The result of this measurement is, in general, a five-dimensional dataset containing a time response for each pair of illuminated and imaged points on the 2D relay surface. Current reconstruction methods use linear algebra and optimization, geometric optics, and concepts borrowed from computed tomography to obtain a three-dimensional image of the scene from this data.
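
To make the shape of this measurement concrete, here is a minimal NumPy sketch of how such a dataset might be laid out. All dimensions are hypothetical placeholders; the real grid sizes and time resolution depend on the hardware and the scene.

```python
import numpy as np

# Hypothetical (small) dimensions, chosen only for illustration.
n_lx, n_ly = 8, 8        # 2D grid of illuminated points on the relay wall
n_cx, n_cy = 8, 8        # 2D grid of imaged points on the relay wall
n_t = 4096               # time-of-flight bins (a few picoseconds each)

# H[il_x, il_y, ic_x, ic_y, :] is the time-resolved impulse response
# measured at imaged point (ic_x, ic_y) while illuminating point (il_x, il_y).
H = np.zeros((n_lx, n_ly, n_cx, n_cy, n_t), dtype=np.float32)

print(H.shape)           # (8, 8, 8, 8, 4096) -- the 5D measurement
```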

Virtual Wave imaging concept: We illuminate the relay wall with short, focused light pulses at different locations and record the returned impulse response with a fast detector. From this impulse response, we compute the response of the scene to a virtual illumination wave. From that computed response, we reconstruct the hidden scene using wave diffraction methods.
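
A minimal Python sketch of this idea follows, under heavy simplifications: the 2D grids of relay-wall points are flattened into plain point lists, the distance falloff and obliquity terms of the full Rayleigh-Sommerfeld propagator are omitted, and all names and parameter values (virtual_wavepacket, scene_response_to_virtual_wave, virtual_field_at, the 4 cm virtual wavelength, the 4 ps bin width) are our illustrative choices, not the paper's implementation. It is meant only to convey the two steps of the pipeline: convolving the measured impulse responses with a virtual illumination wavepacket, and back-propagating the result to a hidden-scene voxel.

```python
import numpy as np

C = 3e8        # speed of light [m/s]
DT = 4e-12     # assumed width of one time bin [s]

def virtual_wavepacket(lambda_v, sigma_cycles=2.0, dt=DT):
    """Complex Gaussian wavepacket acting as the virtual illumination phasor."""
    omega = 2.0 * np.pi * C / lambda_v        # virtual (angular) frequency
    sigma = sigma_cycles * lambda_v / C       # envelope width in seconds
    t = np.arange(-3.0 * sigma, 3.0 * sigma, dt)
    return np.exp(1j * omega * t) * np.exp(-t**2 / (2.0 * sigma**2))

def scene_response_to_virtual_wave(H, lambda_v=0.04, dt=DT):
    """Convolve each measured impulse response (time on the last axis) with
    the virtual wavepacket, giving the scene's response to the virtual wave."""
    p = virtual_wavepacket(lambda_v, dt=dt)
    return np.apply_along_axis(lambda h: np.convolve(h, p, mode="same"), -1, H)

def virtual_field_at(Hp, laser_pts, cam_pts, x_v, dt=DT):
    """Back-propagate the virtual response to a single hidden-scene voxel x_v.

    Hp        : (N_l, N_c, N_t) output of scene_response_to_virtual_wave
    laser_pts : (N_l, 3) illuminated points on the relay wall
    cam_pts   : (N_c, 3) imaged points on the relay wall
    """
    n_t = Hp.shape[-1]
    field = 0.0 + 0.0j
    for i, x_l in enumerate(laser_pts):
        for j, x_c in enumerate(cam_pts):
            # Round-trip path: relay wall -> voxel -> relay wall.
            d = np.linalg.norm(x_v - x_l) + np.linalg.norm(x_v - x_c)
            k = int(round(d / (C * dt)))
            if 0 <= k < n_t:
                field += Hp[i, j, k]
    return np.abs(field)   # |virtual field| is large where a hidden surface sits
```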

The key contribution of our work lies in realizing that NLOS imaging can be understood in terms of wave propagation, and that in doing so we can turn any diffuse wall into a virtual camera. We lay out a general Phasor Field Virtual Wave Imaging framework that allows us to apply existing line-of-sight wave imaging techniques to obtain images, videos, and other information from NLOS data. In our paper we use this framework to demonstrate three new algorithms, each with intriguing new capabilities:
  • A virtual photography camera that, like a line-of-sight photography camera, can capture 2D images of a scene without knowledge of the illumination source.
  • A transient camera that can create videos of light transport through a scene and, for the first time, reveals 4th and 5th bounce reflections in light transport.
  • A time-gated confocal camera that creates 3D reconstructions of the scene that are robust to multiple reflections, complex geometries, and large scene depth and dynamic range, and that work with remarkably noisy data, requiring just a few million photons to reconstruct a room-sized scene (a simplified usage sketch follows this list).
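
As a purely hypothetical end-to-end usage of the sketch above (random stand-in counts instead of a real measurement, and a very coarse voxel grid), sweeping the virtual focus over a volume behind the relay wall produces a 3D image in the spirit of the time-gated confocal reconstruction:

```python
import numpy as np

rng = np.random.default_rng(0)
n_l, n_c, n_t = 16, 16, 8192
H = rng.poisson(0.05, size=(n_l, n_c, n_t)).astype(np.float32)   # stand-in data
laser_pts = np.column_stack([rng.uniform(-1, 1, (n_l, 2)), np.zeros(n_l)])
cam_pts   = np.column_stack([rng.uniform(-1, 1, (n_c, 2)), np.zeros(n_c)])

Hp = scene_response_to_virtual_wave(H)     # convolve once, reuse for every voxel

xs = np.linspace(-0.5, 0.5, 8)
ys = np.linspace(-0.5, 0.5, 8)
zs = np.linspace(0.5, 1.5, 8)              # depth axis, away from the relay wall
volume = np.zeros((len(xs), len(ys), len(zs)))
for ix, x in enumerate(xs):
    for iy, y in enumerate(ys):
        for iz, z in enumerate(zs):
            volume[ix, iy, iz] = virtual_field_at(Hp, laser_pts, cam_pts,
                                                  np.array([x, y, z]))
print(volume.max())   # with real data, bright voxels mark hidden surfaces
```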

We believe, however, that by far the most important contribution is the framework itself, as it provides an intuitive understanding of NLOS imaging and gives this relatively new field access to a large variety of mature image reconstruction and analysis methods. In situations where our method applies, we can now make well-supported statements about which aspects of a scene can be reconstructed, at what resolution, and how. While conducting our experiments and debugging our hardware and algorithms, this knowledge proved particularly valuable, as it allowed us to predict early on, fairly accurately and intuitively, what our final result would look like. We can also now make use of advanced algorithms and methods developed to solve wave diffraction integrals in other fields. This allows us to compute reconstructions quickly and efficiently, as will be demonstrated in forthcoming publications.

We anticipate a large number of NLOS imaging techniques building on our framework and are excited to see it already being applied and extended by other researchers.

