Autonomy requires gobs of sensors, and although Elon Musk and Tesla have publicly said “no thanks” to laser-based lidar sensing in favor of cameras, most of the rest of the industry expects to augment cameras with this “light detection and ranging” technology. For the past few years, the race has been on to replace the spendy, spinny laser scanners festooning the roofs of the current crop of robo-taxis and autonomous prototype test vehicles with cheaper, smaller, lighter designs that are easier to integrate into the bodywork. At CES, plenty of options were on display, featuring lasers that still move, mirrors that move (microelectromechanical systems, or MEMS), and completely solid-state flash lidar concepts. One even involves cameras, which might interest Mr. Musk.
This design adds a new dimension: time. Twenty times per second, it flashes the road ahead with an infrared light pulse, generating data at 60 megabits per second. The reflected 2.0-megapixel image is recorded on an optical camera chip in raw form and in an infrared-modulated form. The two images are compared, and the degree to which any pixel in the modulated image is dimmer than its unmodulated counterpart indicates that photon’s “time of flight,” from which distance to that object is calculated. Many competing flash systems integrate computational circuitry on the receiving sensor board, which diminishes resolution, but here every pixel counts. Range is said to be 100-plus meters, the viewing angle is 54 degrees, visible light levels don’t affect it, and rain, fog, and dust merely trim a bit of the system’s ultimate range. The resulting image is detailed enough to consider integrating object detection into this device, simplifying the sensor-fusion task of the host vehicle. Cost in high-volume production is expected to be around $200, and several OEMs have expressed interest.
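The per-pixel math described above can be sketched in a few lines. This is purely illustrative, not the vendor's actual algorithm: it assumes the fractional dimming of a modulated pixel relative to its raw counterpart maps linearly onto round-trip flight time within one modulation period, and the 667-nanosecond period is a hypothetical value chosen so that the unambiguous range works out to roughly the article's quoted 100 meters.

```python
C = 299_792_458.0  # speed of light, m/s


def pixel_distance_m(raw: float, modulated: float,
                     mod_period_s: float = 667e-9) -> float:
    """Estimate distance for one pixel from modulated-frame dimming.

    Assumption (illustrative only): dimming scales linearly with
    round-trip time of flight, up to one modulation period. A period
    of 667 ns gives an unambiguous range of about 100 m (c * t / 2).
    """
    # Fractional dimming of the modulated pixel, clamped to [0, 1].
    dimming = max(0.0, min(1.0, 1.0 - modulated / raw))
    # Map dimming onto round-trip flight time, then halve for one-way range.
    round_trip_s = dimming * mod_period_s
    return C * round_trip_s / 2.0
```

Under this toy model, a pixel whose modulated reading is half its raw reading sits at roughly half the maximum range, about 50 meters.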
...