
Apple patent filing involves ‘extremity and eye tracking’ on ‘Apple Glasses’

Apple has filed for a patent (number 20210216146) with the mouthful of a title “positioning a user-controlled spatial selector based on extremity tracking information and eye tracking information.” It could apply to a variety of devices, including the Mac. However, it most likely involves “Apple Glasses,” the company’s rumored augmented reality/virtual reality head-mounted display.

About the patent filing

The patent filing involves a device with a display that can present a computer-generated reality (CGR) environment. Apple says that current applications don’t provide a mechanism for accurately determining a virtual spatial location of a virtual contact between a virtual object and a user-controlled spatial selector. The tech giant says that current applications also don’t provide a mechanism for accurately determining when the virtual contact occurs. For example, some systems utilize extremity tracking to estimate a position of a user’s extremities relative to the virtual object.

However, Apple says the estimate provided by extremity tracking is inaccurate, and therefore the assessment of whether the user is selecting the virtual object is likewise inaccurate. The tech giant wants its HMD to accurately measure the interaction between virtual and real objects. One way to do this is with eye tracking.

Summary of the patent filing

Here’s Apple’s summary of the patent filing: “A method includes detecting, via a first one of a plurality of input devices, a primary input directed to a first candidate virtual spatial location of a computer-generated reality (CGR) environment. The first candidate virtual spatial location is an output of an extremity tracking function based on the primary input. 

“The method includes detecting, via a second one of the plurality of input devices, a secondary input directed to a second candidate virtual spatial location of the CGR environment. The second candidate virtual spatial location is an output of an eye tracking function based on the secondary input. The method includes positioning a user-controlled spatial selector to a virtual spatial location of the CGR environment as a function of the first and second candidate virtual spatial locations.”
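The patent summary describes combining two candidate positions, one from an extremity (hand) tracker and one from an eye tracker, to place the spatial selector. The filing doesn’t specify how the two estimates are combined; the sketch below shows one plausible approach, a confidence-weighted blend, purely as an illustration. The function name, parameters, and weighting scheme are assumptions, not Apple’s actual method.

```python
# Hypothetical sketch: fuse two candidate virtual spatial locations
# (extremity tracking + eye tracking) into one selector position.
# The confidence-weighted average is an assumption for illustration;
# the patent filing does not disclose Apple's actual fusion function.

def position_spatial_selector(extremity_pos, eye_pos,
                              extremity_conf=0.5, eye_conf=0.5):
    """Blend two 3D candidate positions by their confidence weights."""
    total = extremity_conf + eye_conf
    if total == 0:
        raise ValueError("at least one tracker must report confidence")
    w_ext = extremity_conf / total
    w_eye = eye_conf / total
    # Per-axis weighted average of the two candidate locations
    return tuple(w_ext * e + w_eye * g
                 for e, g in zip(extremity_pos, eye_pos))

# Example: the hand tracker is less certain than the gaze tracker,
# so the result lands closer to the eye-tracking estimate.
selector = position_spatial_selector((0.2, 1.0, -0.5), (0.4, 1.2, -0.5),
                                     extremity_conf=0.3, eye_conf=0.7)
print(selector)
```

In this framing, eye tracking acts as a second, independent estimate that corrects the noisy extremity-tracking output, which matches the problem the filing describes.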

About Apple Glasses

As for Apple Glasses, the device will arrive this year or in 2022, depending on which rumor you believe. The Sellers Research Group (that’s me) thinks Apple will at least preview it before the end of the year.

It will be a head-mounted display. Or it may have a design like “normal” glasses. Or it may eventually be available in both forms. The Apple Glasses may or may not have to be tethered to an iPhone to work. Other rumors say that Apple Glasses could have a custom-built Apple chip and a dedicated operating system dubbed “rOS” for “reality operating system.”

The accompanying mock-up is courtesy of Mac O’Clock.

Dennis Sellers
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.