How your iPhone can accelerate autonomous driving

Virtual CES 2021 sucked as much as Virtual Everything Else, but there was compelling news regarding lidar and autonomous driving. I’m not schooled enough to separate lidar hype from heroism, so I hooked up with Tom Jellicoe, an optics expert at The Technology Partnership, a UK-based consultancy. Jellicoe was kind enough to put his Cambridge master’s to work explaining these latest “third generation” lidar sensors and their VCSEL array semiconductor lasers to me.

For the newbies: today’s lidar emits pulsed laser light into the environment and times the reflections that bounce back, letting the sensor map its surroundings several times per second. That map is key to any autonomous vehicle that must drive around and adapt to everything near it. (Tesla remains the only notable holdout against integrating lidar into automated driving.)
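
To make the time-of-flight idea concrete, here is a minimal Python sketch; the numbers are illustrative, not any vendor’s specs:

```python
# Pulsed (time-of-flight) lidar in one line of math: the pulse travels
# out and back, so the target sits at half the round-trip path.
C = 299_792_458  # speed of light, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    return C * t_seconds / 2

# A return arriving ~1.33 microseconds after the pulse left the unit
# corresponds to a target roughly 200 meters away.
print(distance_from_round_trip(1.33e-6))  # ~199.4 m
```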

First-generation lasers were super expensive, so lidar units generally spun them around or wagged them back and forth. Second-generation units mounted them on (sometimes tiny) moving mirrors or produced flash images for shorter-range detection.

The iPhone 12 Pro’s lidar now focuses the camera using a new type of laser, the vertical-cavity surface-emitting laser (VCSEL). Here’s why they’re iPhone-cheap:

Most semiconductor lasers are made by cleaving chips of light-emitting material so that the beam exits from the exposed edge. These edge emitters are hard to cleave cleanly, so there is a lot of scrap, and they are time-consuming to assemble on a PCB. A VCSEL instead emits its light perpendicular to the chip surface, from a cavity etched into that surface. That makes it far easier and cheaper to assemble a huge VCSEL array, which is what makes true solid-state lidar (no moving parts) affordable. And because each laser is tiny, an array can emit brighter, farther-reaching light while staying within eye-safety limits.

Jellicoe identified three key players. Opsys Technologies of Israel employs thousands of VCSELs, each addressing a specific point on a single-photon avalanche detector (SPAD) receiver chip, much like the sensor in a digital camera. Opsys flashes its individual VCSELs one after another, some of them at different wavelengths to avoid crosstalk. The unit scans the horizon 1,000 times per second, measuring the time each flash takes to bounce back to the receiver.
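
As a rough illustration of that one-emitter-at-a-time scheme, here is a hedged Python sketch; `fire_vcsel` and `time_of_return` are hypothetical stand-ins for the hardware interface, not Opsys’s actual API:

```python
import random

C = 299_792_458  # m/s

def fire_vcsel(index: int) -> None:
    """Hypothetical stand-in: trigger one emitter in the array."""

def time_of_return(index: int) -> float:
    """Hypothetical stand-in for the SPAD measurement: round-trip time,
    in seconds, for the detector pixel paired with this emitter.
    Here it just fabricates a plausible value."""
    return random.uniform(0.1e-6, 1.3e-6)

def one_scan(num_emitters: int = 4_096) -> list[float]:
    # "Thousands" of emitters; the exact count here is assumed.
    # Firing them one after another keeps each return unambiguous,
    # because only one direction is illuminated at any instant.
    depths = []
    for i in range(num_emitters):
        fire_vcsel(i)
        depths.append(C * time_of_return(i) / 2)
    return depths
```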

Downsampling those 1,000 scans per second to the 30 frames per second that autonomy systems consume boosts resolution enough to identify the make of the vehicle ahead. Maximum range is 200 meters over a relatively narrow 24-degree horizontal field of view, but multiple units can be arranged side by side, with software stitching their images into a wider field, and lens optics can broaden the covered area at the cost of range. Automakers envision using six to eight of these $200 units to fully perceive a vehicle’s surroundings; Opsys expects series production in 2023 or 2024.
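
The resolution gain is easiest to see in numbers: at 1,000 scans per second, roughly 33 successive scans can be folded into each of the 30 frames per second the autonomy stack consumes. A small sketch of that accumulation, reusing the hypothetical `one_scan` from the previous snippet:

```python
def one_frame(scans_per_second: int = 1_000,
              frames_per_second: int = 30) -> list[list[float]]:
    # ~33 raw scans are merged into every delivered frame, so each
    # frame carries far more points than any single scan does.
    scans_per_frame = scans_per_second // frames_per_second  # 33
    return [one_scan() for _ in range(scans_per_frame)]
```

The stitching math is similarly plain: each additional 24-degree unit adds up to 24 degrees of coverage, minus whatever overlap the software needs to align the images.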

Sense Photonics, based in San Francisco, flashes its entire 15,000-VCSEL array simultaneously, illuminating 140,000 pixels on its SPAD collector. Its twist is mounting the VCSELs on a curved surface to widen the field of view, although its maximum range of 200 meters over a 30-degree horizontal field is achieved with a flat chip. Cost is currently estimated at “hundreds” of dollars, with production expected by the end of 2024.
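
In code, the contrast with Opsys’s point-by-point approach is that a flash architecture yields a whole depth frame from a single pulse. This sketch fakes the per-pixel SPAD timings with random data, since the real sensor interface is not public; the 350 x 400 grid is just one way to slice the 140,000-pixel collector:

```python
import numpy as np

C = 299_792_458  # m/s

# One simultaneous flash: every SPAD pixel times its own return, so a
# single pulse produces a complete depth image.
return_times = np.random.uniform(0.1e-6, 1.3e-6, size=(350, 400))
depth_frame = C * return_times / 2  # per-pixel distance in meters
print(depth_frame.shape)  # (350, 400), i.e. 140,000 depth pixels
```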

Hamburg-based Ibeo’s IbeoNEXT uses a 128 x 80 array of VCSELs (10,240 of them) and splits the difference between the two approaches above: it flashes horizontal rows of VCSELs to generate a vertical line scan. Optical lens packages adjust the field of view and the range, with the longest range (up to 250 meters) covering a narrow 12-degree horizontal field of view. The widest field is 60 degrees at a much shorter range, with a 120-degree lens under development.
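
Row-at-a-time scanning sits between those two extremes, as a sketch makes clear. Which axis of the 128 x 80 array holds the rows is an assumption here, and the firing and timing calls are again hypothetical stand-ins:

```python
import random

C = 299_792_458  # m/s
ROWS, COLS = 80, 128  # orientation assumed: 80 rows of 128 emitters

def fire_row(row: int) -> None:
    """Hypothetical stand-in: flash all emitters in one horizontal row."""

def row_return_times(row: int) -> list[float]:
    """Hypothetical stand-in: round-trip times for that row's pixels."""
    return [random.uniform(0.1e-6, 1.6e-6) for _ in range(COLS)]

def one_frame() -> list[list[float]]:
    # Stepping through the rows produces the vertical line scan:
    # 80 flashes per frame, versus 10,240 point-by-point or 1 full flash.
    frame = []
    for row in range(ROWS):
        fire_row(row)
        frame.append([C * t / 2 for t in row_return_times(row)])
    return frame
```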

What’s next for lidar? Aurora and Aeva are developing frequency-modulated continuous-wave (FMCW) lidar, sometimes called 4D lidar. Instead of timing discrete flashes of light, these lasers stay on and sweep their frequency up and down; the frequency shift of the reflected light reveals each pixel’s distance, while its Doppler shift reveals the pixel’s speed, at ranges of 300 meters or more.
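
Here is a hedged sketch of the FMCW arithmetic with a triangular (up-then-down) chirp; every parameter value is an illustrative assumption, not a figure from Aurora or Aeva:

```python
C = 299_792_458          # m/s
WAVELENGTH = 1.55e-6     # m; a common telecom-band choice (assumption)
CHIRP_BANDWIDTH = 1e9    # Hz swept per ramp (assumption)
CHIRP_DURATION = 10e-6   # s per ramp (assumption)

def range_and_velocity(f_beat_up: float, f_beat_down: float):
    # Mixing the return with the outgoing chirp gives a beat frequency.
    # Range delay offsets the up- and down-ramp beats equally; motion
    # (Doppler) shifts them in opposite directions, so the two beats
    # separate cleanly into a range part and a velocity part.
    f_range = (f_beat_up + f_beat_down) / 2
    f_doppler = (f_beat_down - f_beat_up) / 2
    distance = C * CHIRP_DURATION * f_range / (2 * CHIRP_BANDWIDTH)
    velocity = WAVELENGTH * f_doppler / 2  # positive means closing in
    return distance, velocity

# A car 150 m ahead, closing at 20 m/s, beats at roughly 74 MHz on the
# up-ramp and 126 MHz on the down-ramp:
print(range_and_velocity(74.2e6, 125.8e6))  # ~(150 m, 20 m/s)
```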

These lasers and their waveguides are not iPhone-cheap, and many require mechanical beam steering like the first-generation units. Expect FMCW to arrive first in commercial vehicles, which can tolerate higher prices and need longer warning distances to brake or swerve.

Jellicoe’s biggest reveal at CES 2021: news that Israeli computer-vision company Mobileye (the people whose collaboration with Tesla convinced Elon Musk that radar and cameras could power Autopilot on their own) is finally entering the lidar business with an FMCW system-on-chip design. Could this convince Elon himself of lidar’s virtues by the next CES?
