I’ve used small LIDAR sensors on toy (Arduino-based) robots and they work well within 1m or so. Ultrasonic sensors are another alternative, but we found them very susceptible to noise and surface abrasion. Decent LIDAR sensors, like those used in drones and vehicles, work out to 215m or so.
But research in the lab (ScienceDaily article: Want to catch a photon, start by silencing the sun) has created a LIDAR sensor that uses a novel form of analog/optical noise suppression, making these same LIDAR sensors capable of mapping up to 45km of space.
The researchers were able to add a quantum marker to the LIDAR beam’s photons and then filter beam reflections to honor only those reflected photons carrying the quantum marker. The ScienceDaily article was based on a Nature Communications article, Noise-tolerant single photon sensitive three-dimensional imager.
What’s been changed?

They call the methodology implemented by their device Quantum Parametric Mode Sorting, or QPMS. It’s not intended to compete with software or computational approaches to noise filtering but rather to complement those capabilities with a more accurate LIDAR that can eliminate the vast majority of noise using non-linear optics (see the Wikipedia article on Non-linear optics to learn more).
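To give a feel for what mode-selective filtering buys you over plain detection, here’s a toy Python simulation of my own. It’s only a loose analogy using time gating; the real QPMS sorts photons in joint time-frequency modes using nonlinear optics, and all the numbers below (gate width, mode width, photon counts) are made up for illustration.

```python
# Toy sketch (my own illustration, not the paper's method): signal photons
# arrive in a narrow, known temporal mode; background photons arrive
# uniformly across the detection gate. Filtering on the mode rejects most
# of the noise while keeping nearly all of the signal.
import random

random.seed(0)

GATE = 1e-9                         # assumed 1 ns detection window
SIGNAL_MODE = (4.9e-10, 5.1e-10)    # signal confined to a 20 ps slice

def photon_arrivals(n_signal, n_noise):
    """Return arrival times for in-mode signal photons plus uniform noise."""
    sig = [random.uniform(*SIGNAL_MODE) for _ in range(n_signal)]
    noise = [random.uniform(0, GATE) for _ in range(n_noise)]
    return sig + noise

def mode_filter(times):
    """Keep only photons whose arrival matches the expected mode."""
    lo, hi = SIGNAL_MODE
    return [t for t in times if lo <= t <= hi]

arrivals = photon_arrivals(n_signal=10, n_noise=10_000)  # noise >> signal
kept = mode_filter(arrivals)
print(f"before filter: {len(arrivals)} counts, after: {len(kept)} counts")
# Surviving noise scales with the mode's share of the gate:
# 10_000 * (20 ps / 1 ns) = ~200 counts, next to the ~10 signal counts kept.
```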
It turns out the researchers are able to image space with their new augmented LIDAR using a single photon per pixel. They use an FPGA to control the system, along with a programmable ODL (optical delay line, which delays optical signals), an up-conversion single photon detector (USPD, which takes one or more photons at one frequency and converts them to another, higher-frequency photon) and a silicon avalanche photo diode (SI-APD, which detects a single photon and creates an avalanche [of multiple electrons?] electrical signal from it).
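For readers unfamiliar with single-photon LIDAR, the bookkeeping such a system performs is roughly this: detector clicks land in time bins, and depth falls out of the bin with the most counts. Here’s a minimal Python sketch of that time-of-flight step; the bin width and counts are my own illustrative numbers, not the paper’s.

```python
# Hedged sketch of single-photon time-of-flight depth recovery (bin width
# and counts are assumptions for illustration, not from the paper).
C = 299_792_458.0          # speed of light, m/s
BIN_WIDTH = 1e-12          # assumed 1 ps timing bin

def depth_from_histogram(histogram):
    """Pick the peak time bin and convert round-trip time to depth."""
    peak_bin = max(range(len(histogram)), key=lambda i: histogram[i])
    round_trip = peak_bin * BIN_WIDTH
    return C * round_trip / 2   # divide by 2: light goes out and back

# e.g. a pixel whose photon counts peak at bin 334 -> ~5 cm away
hist = [0] * 1000
hist[334] = 7               # seven photon clicks in that bin
print(f"{depth_from_histogram(hist) * 100:.2f} cm")   # -> 5.01 cm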
How well does it work?
To measure the resolution capabilities of the circuit, they constructed a 50x70mm (~2×2 3/4″) CNC-machined aluminum depth resolution calibration device (sort of like an eye chart, only for depth perception; see figures 2c and 2d in the paper) and were able to accurately map the device’s column topologies.
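Depth resolution in a time-of-flight system comes straight from timing resolution via d = c·t/2. A quick back-of-envelope in Python (the 1mm step height is my assumption, not a figure from the paper):

```python
# Back-of-envelope: what timing resolution does ~1 mm depth resolution need?
C = 299_792_458.0                     # speed of light, m/s
step_mm = 1.0                         # assumed step height on the target
t_needed = 2 * (step_mm / 1000) / C   # round trip across one step
print(f"{t_needed * 1e12:.1f} ps")    # -> ~6.7 ps of timing resolution
```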

They were also able to show enhanced perception and noise reduction when obscuring a landscape (Einstein’s head) with an aluminum screen, which would be very hard for conventional solutions to filter out. The device was able to clearly reconstruct the image even through the aluminum screen.
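Conceptually (and this is my guess at the mechanism, not the authors’ reconstruction code), a partially transmissive screen in front of the target just shows up as an extra, earlier return peak in each pixel’s arrival histogram, so taking the later significant peak per pixel recovers the hidden surface:

```python
# Sketch (illustrative only): pick the later return peak per pixel to see
# past a semi-transparent obscurer in front of the real target.
def later_return(histogram, min_counts=3):
    """Return the index of the last time bin with a significant count."""
    peaks = [i for i, c in enumerate(histogram) if c >= min_counts]
    return peaks[-1] if peaks else None

hist = [0] * 100
hist[20] = 9    # bright early return from the aluminum screen
hist[63] = 4    # faint later return from the target behind it
print(later_return(hist))  # -> 63, the target, not the screen
```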

The result of all this is an all-optical fibre noise reduction circuit. I’m sure the FPGA, SI-APD, USPD, MLL, transceiver, ODL and MEM are electronic or electro-mechanical devices, but the guts of the enhanced circuit seem all optical.
What does it mean?
What could QPMS mean for optical fibre communications? It’s possible that optical fibres could use significantly fewer electro-optical amplifiers, if a single photon could travel 45km without noise.
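A quick link-budget sanity check, using the standard telecom figure of ~0.2 dB/km loss for single-mode fibre at 1550nm (my numbers, not the article’s), suggests the win over 45km would be in rejecting noise rather than beating attenuation:

```python
# Rough fibre link budget over the 45 km the article mentions.
LOSS_DB_PER_KM = 0.2                    # typical single-mode fibre at 1550 nm
km = 45
loss_db = LOSS_DB_PER_KM * km           # 9 dB total attenuation
survival = 10 ** (-loss_db / 10)        # fraction of launched photons arriving
print(f"{survival:.3f}")                # -> ~0.126, about 1 in 8 photons
```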
Also, LiFi (light fidelity), or open-air optical transmission of data, could be significantly improved (both in transmission length and noise reduction) using QPMS. And one could conceivably use LiFi outside of office communications, such as for high-bandwidth/low-noise, all-optical cellular data services for devices.
And of course boosting LIDAR range, noise reduction and resolution could be a godsend for all LIDAR mapping today. I read another article (ScienceDaily: Modern technology reveals … secrets of great, white Maya road) about archeologists mapping the (old) Maya road through the jungles of Central America using LIDAR-equipped planes. I imagine a QPMS-equipped LIDAR could map Mayan foot paths.
~~~~
Comments?