An Apple patent granted today appears to describe a new approach to under-screen Face ID, which could also allow the company to embed additional sensors into an iPhone display.

The patent seemingly describes an evolution of the Dynamic Island approach, but applied in a more flexible way …

Background

Apple’s initial approach to embedding Face ID into the display was the famous notch, occupying the central area of the status bar at the top of an iPhone.

With the iPhone 14 Pro models, Apple switched instead to two cutouts in the display itself. These are disguised as a single pill-shaped cutout, and the company cleverly turned this into part of the user interface by expanding its apparent width and displaying contextual content within it.

The company’s long-term goal, however, has long been the “single slab of glass,” in which the display occupies the entire front of the iPhone. Achieving this requires finding other ways to hide the under-screen components, like the Face ID modules and front-facing camera.

Apple already holds a number of patents for under-screen Face ID and Touch ID, but the ability to completely hide them remains some way off yet.

However, a new patent granted today describes what appears to be an evolution of the Dynamic Island approach.

Under-screen Face ID patent

Patently Apple notes that part of the patent language simply describes the existing implementation of Dynamic Island. However, it then goes beyond this, in two ways.

First, the patent describes a huge range of sensors which could be embedded into the display, including sensors for detecting gestures made without actually touching the screen.

The sensors under the display that could be hidden include those for Touch ID, sensors for measuring three-dimensional non-contact gestures (“air gestures”), pressure sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), health sensors and more.

The sensors beneath the display could also include optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, gaze tracking sensors, and/or other sensors.

Second, the position of the Dynamic Island could vary: a series of tiny transparent windows could be effectively moved around the display, and resized, by selectively activating and deactivating different pixels.

The transparent windows may be shifted by a random amount in a random direction relative to a grid defining point and/or may be randomly rotated to increase the non-periodicity. A transparency gradient may be formed between the transparent windows and the surrounding opaque portion of the display. The transparent windows may be defined by non-linear edges.
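To make that randomization concrete, here is a minimal Swift sketch of the idea as we read it in the patent language, not Apple’s actual method. All names here (TransparentWindow, placeWindows, maxJitter) are hypothetical. Each window is offset from its nominal grid point by a random distance in a random direction and given a random rotation, increasing non-periodicity and, per the patent, helping keep the windows invisible to the naked eye.

```swift
import CoreGraphics

// Hypothetical model of one of the patent's tiny transparent windows.
struct TransparentWindow {
    var center: CGPoint   // final position after random jitter
    var rotation: CGFloat // random rotation, in radians
}

// Offset each window from its nominal grid-defining point by a random amount
// in a random direction, and apply a random rotation — the two randomizations
// the patent describes for increasing non-periodicity.
func placeWindows<R: RandomNumberGenerator>(
    gridPoints: [CGPoint],
    maxJitter: CGFloat,
    using rng: inout R
) -> [TransparentWindow] {
    gridPoints.map { point in
        let distance = CGFloat.random(in: 0...maxJitter, using: &rng)
        let angle = CGFloat.random(in: 0..<(2 * .pi), using: &rng)
        return TransparentWindow(
            center: CGPoint(x: point.x + distance * cos(angle),
                            y: point.y + distance * sin(angle)),
            rotation: CGFloat.random(in: 0..<(2 * .pi), using: &rng)
        )
    }
}
```

Moving the island around the display would then just mean choosing which of these windows’ overlapping pixels to switch off at any given moment.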

Apple says that a typical display used in its devices has 13 layers, and that these can reduce light transmission by as much as 80% (meaning only around 20% of incoming light would reach an under-display sensor); it might therefore need to reduce the number of layers in areas containing sensors.

But the patent appears to describe a method of distributing these areas so that they are invisible to the naked eye, don’t interfere with performance (e.g. touch sensitivity), and allow neighbouring pixels to be selectively switched off to increase light transmission.
