Today, the US Patent and Trademark Office officially granted Apple a patent covering the use of optical fibers behind displays to capture biometric data – fingerprint data – with greater speed and accuracy.
Under-display fingerprint technology has been rumored for the iPhone for years, and every year analysts like Ming-Chi Kuo keep revising their expectations.
With component shortages hurting Apple as much as they are, it’s certainly not the time for Apple to introduce a high-end feature whose component issues might hamper iPhone shipments.
However, the patent granted to Apple states that it may be desirable to capture a two-dimensional (2D) or three-dimensional (3D) image of an object or user close to the device. In some cases, that 2D or 3D image may be an image of a fingerprint, a face, or a scene in a field of view (FoV).
Apple’s invention covers systems, devices, and methods for optical sensing using optical fibers or optical fiber bundles, and in particular for optical sensing near the display.
More specifically, Apple notes that electronic devices often include optoelectronic components to provide lighting, sense the proximity of an object, take a picture, and so on. Many solutions place optoelectronic components under the screen and emit or receive electromagnetic radiation through the screen itself, which can result in 95-99% optical transmission losses. By contrast, the technologies described in Apple’s patent use optical fibers or optical fiber bundles to route electromagnetic radiation from an optoelectronic component placed below, partially below, or adjacent to a display, along the edge of the display (for example, through a gap between the display and the frame), to an optically permeable component or surface area.
The optically permeable component or surface area may sit adjacent to the display or near the perimeter of the screen. Electromagnetic radiation can be received and detected in a similar way.
Electromagnetic radiation emitted and detected next to a display, via one or more optical fibers, may therefore be far less attenuated than electromagnetic radiation passing through the display itself.
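To put the patent’s 95-99% through-display loss figure in perspective, it helps to express it in decibels, the usual unit for optical attenuation. The short sketch below (illustrative arithmetic only; the function name `loss_db` is our own, not from the patent) converts a fractional power loss into dB:

```python
import math

def loss_db(fraction_lost: float) -> float:
    """Convert a fractional optical power loss (0..1) to attenuation in dB."""
    transmitted = 1.0 - fraction_lost
    return -10.0 * math.log10(transmitted)

# Through-display losses cited in the filing: 95-99% of the light is lost.
for lost in (0.95, 0.99):
    print(f"{lost:.0%} loss -> {loss_db(lost):.1f} dB attenuation")
# 95% loss -> 13.0 dB attenuation
# 99% loss -> 20.0 dB attenuation
```

A 13-20 dB penalty is substantial for a sensor signal path, which is why routing the light around the display through fibers, rather than through it, is attractive.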
In the various embodiments described, the optical fiber(s) can follow nonlinear paths; enable sensors with different transmit/receive baselines; be used in conjunction with a light emitter, photodetector, or optical transceiver; be multiplexed, split, or aggregated; be coupled with an optical path routing controller or any other component that can route the optical fiber(s); and so on.
Apple’s patent FIG. 1A below depicts the front of an iPhone. In some cases, the front camera #112 and/or other iPhone sensors may be located below, partially below, or next to the display #104, with electromagnetic radiation emitted and/or received by the sensor(s) directed through one or more gaps adjacent to the screen.
An optical transmitter, detector, or transceiver may also be configured under or adjacent to the screen to serve as (or provide) a proximity sensor; a 2D or 3D camera (sometimes in combination with a flood illuminator or regulated light source); a biometric authentication sensor (for example, a facial recognition sensor or fingerprint sensor); an eye/gaze tracker or other optical tracking system; an optical communication system; and so on.
The iPhone may also have various input devices, including a virtual button #118 or other sensors integrated with a display assembly placed under the screen.
In Apple’s patent FIG. 2 we can see an optical fiber #242, or an optical fiber bundle #244 that includes multiple optical fibers.
Apple’s patent FIG. 5A above shows a first example of a fiber optic bundle configuration #500.
See Apple’s granted patent No. 11,327,237 for details.