Thursday, January 20, 2022

Apple granted patent for an ‘enhanced virtual touchpad’ for a Mac

Pictured is a schematic, pictorial illustration of a computer system implementing a non-tactile three-dimensional (3D) user interface.

Apple has been granted a patent (number 11,169,611) for an “enhanced virtual touchpad.” It would work with gaze and gesture input systems for a Mac.

About the patent 

Many different types of user interface devices and methods are currently available, such as computer keyboards, mice, joysticks, and touch screens. Computer interfaces based on three-dimensional (3D) sensing of parts of a user’s body have also been proposed. For example, a gesture recognition system could use depth-perceptive sensors to recognize gestures based on the shape of a body part and its position and orientation over an interval.

Another user input system could be an interactive video display system. It would incorporate a display screen that displays a visual image and a camera that captures 3D information about an object in an interactive area in front of the display screen. A computer system directs the display screen to change the visual image in response to changes in the object.

What’s more, 3D human interface systems may identify not only the user’s hands, but also other parts of the body, including the head, torso, and limbs. Some user interface systems also track the direction of the user’s gaze.

An “enhanced virtual touchpad” could be used with such input systems.

Summary of the patent

Here’s Apple’s abstract of the patent: “A method, including receiving, by a computer, a two-dimensional image (2D) containing at least a physical surface and segmenting the physical surface into one or more physical regions. A functionality is assigned to each of the one or more physical regions, each of the functionalities corresponding to a tactile input device, and a sequence of three-dimensional (3D) maps is received, the sequence of 3D maps containing at least a hand of a user of the computer, the hand positioned on one of the physical regions. The 3D maps are analyzed to detect a gesture performed by the user, and based on the gesture, an input is simulated for the tactile input device corresponding to the one of the physical regions.”

About the author
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.