“It’s been busy, yeah,” he said, as I sat down.

There’s a reason Russell is in high demand. Luminar, a nearly 6-year-old lidar developer that emerged from stealth with a $36 million funding round in 2017, has designed one of the world’s first sensors capable of detecting objects up to 250 meters away. The mechanical sensor, like other lidar devices, measures the distance between itself and objects by calculating the time it takes a laser pulse to scatter off of reflective surfaces. But Luminar’s lidar distinguishes itself by operating at the 1550 nanometer wavelength, which the company claims can deliver 40 times more power and 50 times better resolution than competing devices. More importantly, it has a 120-degree field of view - two laser beams each cover a 60-degree field, steered by tiny mirrors that direct the beams through the field of coverage - and can detect car tires, bicyclists in black sweatshirts, and other objects with reflectivity as low as 5 percent, without sacrificing range.

“We built it from the ground up: our own receivers, scanning mechanisms, processing electronics - all in-house,” Russell told me.

Luminar’s secret sauce is indium gallium arsenide, an alloy that’s amenable to the 1550 nanometer operating wavelength. Traditional photodetectors and application-specific integrated circuit (ASIC) designs, which use silicon, operate at a wavelength of about 900 nanometers.

“[1550 nanometers] is basically the fundamental operating wavelength that you want to be at if you want to be even theoretically capable of high range and resolution performance, because it’s an eye-safe wavelength,” Russell said. “There’s actually a huge challenge with current systems … The reason why they can’t see very far is because if you were to increase the laser power any more, you’d start damaging people’s eyes. With 1550 nanometers, because it’s eye-safe rather than focused to a point on the retina … you could effectively output 1 million times the pulse energy and still stay eye-safe, which is crazy.”

“It’s hard to argue with physics and eye safety,” Russell said. Luminar’s indium phosphide semiconductor wafers also produce a much higher-fidelity image than lower-wavelength lidar systems - as low as 1 frame per second for the highest level of detail or up to 20 frames per second at the cost of some resolution.

During a zip around Madison Square Park in a van with a Luminar lidar strapped to the roof (and a large computer monitor stuck to the back of the passenger seat headrest), Russell showed me the raw sensor data. A colorful, undulating point cloud, rendered with a latency of mere milliseconds, converged around vehicles, pedestrians, street signs, and even lane markings, which Russell zoomed in and around from multiple vantage points with the arrow keys on a keyboard. He pointed to a headrest in a car a few lanes to the right. The lidar had detected it through the rear windshield.

“Once you’re working with data that is orders of magnitude better fidelity, it makes a lot of the problems that were seemingly impossible from a perception standpoint a heck of a lot easier,” Russell said. “Some of our customers have effectively gotten to the point where they’re able to reduce their computer for certain algorithms where … they can run on something as small as a Raspberry Pi.”

Those algorithms can increase or decrease the density of the point cloud dynamically, Russell explained. They might focus on the horizon when the car’s on the freeway and home in on pedestrians during city driving.

We had the good fortune of bright and sunny weather on the day of our joyride, but Russell said that the lidar’s fidelity is such that snow and rain have a minimal impact on range. “You basically have to over-spec your sensor for clear weather conditions, such that when you have inclement weather, you can still see objects really, really well,” he said.
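The time-of-flight principle described above reduces to a one-line calculation: range is the speed of light times the round-trip delay of the pulse, halved. Here is a minimal sketch of that arithmetic; the function name and the microsecond figure are illustrative, not Luminar's.

```python
# Time-of-flight ranging: a lidar times how long a laser pulse takes to
# scatter off a surface and return, then converts that delay to distance.

C = 299_792_458.0  # speed of light in a vacuum, meters per second

def range_from_round_trip(round_trip_seconds: float) -> float:
    """One-way distance to the target.

    The pulse travels out and back, so the range is half
    the total path covered during the measured delay.
    """
    return C * round_trip_seconds / 2.0

# A target at Luminar's claimed 250-meter range returns the pulse
# in roughly 1.67 microseconds:
print(round(range_from_round_trip(1.668e-6), 1))  # ~250.0
```

The factor of two is the part that is easy to forget: the measured delay covers the trip to the object *and* back.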
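The 1-frame-per-second-versus-20 tradeoff Russell mentions, and the dynamic density adjustment, both follow from a fixed budget of laser pulses per second: slower frames leave more points for each frame. A toy illustration of that budget split; the pulse figure is a hypothetical number of mine, not a Luminar specification.

```python
# Illustrative only: assume a fixed number of laser pulses per second.
# Frame rate and per-frame point density then trade off directly.

POINTS_PER_SECOND = 6_000_000  # hypothetical pulse budget

def points_per_frame(frames_per_second: int) -> int:
    """Points available in each frame under a fixed per-second budget."""
    return POINTS_PER_SECOND // frames_per_second

print(points_per_frame(1))   # densest point cloud: 6,000,000 points per frame
print(points_per_frame(20))  # smoother updates, 20x fewer points: 300,000
```

The same budget logic underlies dynamic density: steering more of the per-second pulse budget toward the horizon on a freeway, or toward pedestrians in the city, spends the same points where they matter most.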