When Apple introduced its latest iPad Pro, it highlighted the integration of a custom-designed LiDAR Scanner that the company claims opens up tremendous possibilities for augmented reality and beyond. Apple also noted that the LiDAR Scanner works with the iPad Pro's pro cameras to measure depth.
Apple further noted that with the new LiDAR scanner, placing AR objects on the iPad Pro display is instant. Realistic object occlusion allows AR objects to pass in front of and behind real-world structures. Improved motion capture and people occlusion are more accurate than ever. And developers will be able to create even more immersive experiences.
Today the US Patent & Trademark Office published a patent application from Apple that relates to opto-electronic devices, and particularly to light detection and ranging (LiDAR) sensors. The invention covers the technology that was released with the latest iPad Pro.
Apple notes that the quality of the measurement of the distance to each point in a target scene (target scene depth) using LiDAR is often compromised in practical implementations by a number of environmental, fundamental, and manufacturing challenges. A typical environmental challenge is the presence of uncorrelated background light, such as ambient sunlight, in both indoor and outdoor applications.
Fundamental challenges relate to the losses that optical signals incur upon reflection from target scene surfaces, particularly with low-reflectivity scenes and a limited optical collection aperture, as well as electronic and photon shot noise.
These limitations often generate inflexible trade-off relationships that typically push the designer to resort to solutions involving large optical apertures, high optical power, narrow field-of-view (FoV), bulky mechanical construction, low frame rate, and the restriction of sensors to operate in controlled environments.
The embodiments of the present invention address the above limitations so as to enable compact, low-cost LiDARs achieving accurate high-resolution depth imaging that can operate in uncontrolled environments. The disclosed embodiments use one or more pulsed laser sources emitting beams to generate high-irradiance illumination spots at the intersections of the axes of the emitted beams with the target scene.
The beams, and hence the illumination spots, are scanned across the target scene. The illumination reflected from the scene is imaged by collection optics onto a time-of-flight, single-photon detector array, yielding a high signal-to-noise ratio, and the distance to each point of the target scene is derived from the time-of-flight data.
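The underlying relation between time of flight and distance is straightforward: light travels out to the target and back, so the one-way distance is half the round-trip time multiplied by the speed of light. The sketch below illustrates that basic relation only; the function name and example values are illustrative and not taken from Apple's patent.

```python
# Basic time-of-flight relation used by LiDAR systems:
#   distance = (speed of light * round-trip time) / 2
# The divide-by-two accounts for the pulse traveling to the target and back.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_time_of_flight(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters for a measured round trip."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# Example: a photon returning after 10 nanoseconds corresponds to ~1.5 m.
print(distance_from_time_of_flight(10e-9))
```

Note the timing precision this demands: resolving depth to a few millimeters requires measuring round-trip times to tens of picoseconds, which is why single-photon avalanche diode (SPAD) arrays are used as the detectors.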
Apple’s patent FIG. 1 below is a schematic illustration of a LiDAR system; FIG. 3 is a block diagram showing components of a sensing element in a single-photon avalanche diode (SPAD) array; and FIG. 13 is a schematic illustration of a LiDAR device using two laser light sources with different emissive powers which adapts itself for measuring distances to both near and far target scene points.
Apple’s patent application 20200158831 that was published today by the U.S. Patent Office was filed back in late Q4 2019. For engineers wanting to dig a little deeper into the details behind Apple’s custom-designed LiDAR Scanner click here. Considering that this is a patent application, the timing of such a product to market is unknown at this time.
It’s always interesting to see the engineering talent that went into inventing something as important as a custom-designed LiDAR Scanner.
Alexander Shpunt: Architect. He was the CTO of PrimeSense, which Apple acquired in late 2013.
Matt Waldon: Director, Depth Hardware
Gennadiy Agranov: Director, Imaging Sensors Technology
Thierry Oggier: Apple Camera & Depth Architecture
Cristiano Niclass: HW Development Manager