Apple files for — and is granted — lots of patents by the U.S. Patent & Trademark Office. Many are for inventions that never see the light of day. However, you never can tell which ones will materialize in a real product, so here are this week’s patent highlights:
It’s just a matter of time before iOS and OS X devices have flexible screens. Apple has been granted a patent (number 9,256,250) for “electronic devices with flexible screens having fastened bent edges.”
Apple says electronic devices such as laptops and smartphones usually have rigid displays built from rigid display structures. These structures often include a significant amount of inactive border area around the display to accommodate the circuitry that operates the pixels in the display's active region. This wide inactive region tends to make displays bulky and requires electronic device housings with wide bezels.
Naturally, Apple doesn’t like this bulky type of build and is at least considering the use of flexible display technologies that allow displays to be flexed. In the patent filing, the company says it’s “desirable to be able to minimize the width of the inactive region in a display and to otherwise improve displays for electronic devices.”
Upcoming iterations of watchOS may allow your Apple Watch to control the volume and other features of your iPhone. Apple has filed for a patent (number 20160044151) for “volume control for a mobile device using a wireless device.”
Per the invention, a wearable device (the Apple smartwatch) “can facilitate automatic adjustment of a volume control and/or other settings of a host device” (the iPhone) based on properties of the ambient environment. For example, when an iPhone (or iPad) generates an audible alert, an Apple Watch can detect whether the volume is outside acceptable levels and adjust the alert volume and/or other alert characteristics. Adjustments to host-device settings can also be made based on comparing audio signals collected by the host device and the wearable device.
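The basic idea — sample the ambient noise level on the wearable and nudge the host device's alert volume into an acceptable range — can be sketched roughly as follows. This is purely illustrative; the function name, dB-to-volume mapping, and thresholds are my assumptions, not anything from Apple's filing.

```python
# Illustrative sketch of the behavior described in the filing: a wearable
# measures ambient loudness and adjusts the host device's alert volume so
# the alert stays audible but not jarring. All values are assumptions.

def adjust_alert_volume(ambient_db: float, current_volume: float) -> float:
    """Return a new alert volume (0.0-1.0) for the host device.

    ambient_db: ambient sound level (dB) measured by the wearable's mic.
    current_volume: the host device's current alert volume (0.0-1.0).
    """
    # Map ambient loudness to a target volume: assume a 30 dB (quiet)
    # room maps to 0.2 and a 90 dB (loud) room maps to 1.0.
    target = 0.2 + 0.8 * max(0.0, min(1.0, (ambient_db - 30.0) / 60.0))

    # Only adjust when the current setting is outside an acceptable band
    # around the target; otherwise leave the user's setting alone.
    band = 0.15
    if abs(current_volume - target) <= band:
        return current_volume
    return min(1.0, target)
```

In a loud environment the alert is pushed toward full volume; in a quiet room a setting already near the target is left untouched.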
Is Apple working on an “adaptive projector”? The company has filed for a patent (number 20160041625) for just that. It involves interactive reality augmentation, including a 2-dimensional camera and a 3-dimensional camera, an associated depth projector and content projector, and a processor linked to both cameras.
In the patent filing, Apple says that natural user interfaces are gaining momentum in the entertainment and computer industries. Gesture controls are supplementing or replacing more conventional and less natural interfaces such as the keyboard and mouse, game controller, and remote control.
User interactions, however, continue to relate largely to the computer monitor, limiting the applicability and ease of use of such interfaces, according to Apple. The company is looking into gesture controls that rely on optical 3-dimensional mapping. A content projector would project a content image onto a 3-dimensional object in response to instructions from the processor, which can be mediated by automatic recognition of user gestures.
Finally, Apple has been granted a patent (number 9,256,322) for “multi-touch discrimination.” It would allow touch devices to differentiate between “contact types” such as fingertips, thumbs, palms, and cheeks (though why you’d be touching your iPad or iPhone to your cheek is beyond me).
By way of example, thumb contacts may be distinguished from fingertip contacts using a “patch eccentricity parameter.” In addition, the filing describes a reliable means of distinguishing large objects (e.g., palms) from smaller objects (e.g., fingertips, thumbs, and a stylus) by non-linearly deemphasizing pixels in a touch-surface image.
Apple says that, unlike earlier input devices, touch-surfaces now becoming available are capable of simultaneously detecting multiple objects as they approach and/or contact the touch-surface, and detecting object shapes in much more detail. To take advantage of this capability, it’s necessary to measure, identify and distinguish between the many kinds of objects that may approach or contact such touch-surfaces simultaneously.
Apple says that prior touch-surface systems don’t provide a “robust ability” to do this. The company wants to provide methods and devices that identify and discriminate multiple simultaneous hover or touch events such as, for example, two or more closely grouped fingers, palm heels from one or more fingers, fingers from thumbs, and fingers from ears and cheeks.
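The “patch eccentricity” idea mentioned above — elongated contact patches look more like thumbs, round ones like fingertips — can be sketched from the second moments of a patch's pixel intensities. The function names and the classification threshold below are illustrative assumptions, not values from the patent.

```python
import math

# Illustrative sketch: classify a touch "patch" by its eccentricity,
# computed from intensity-weighted second moments of the patch's pixels.
# Threshold and names are assumptions for demonstration only.

def patch_eccentricity(pixels):
    """pixels: list of (x, y, intensity) tuples for one touch patch."""
    total = sum(w for _, _, w in pixels)
    cx = sum(x * w for x, _, w in pixels) / total
    cy = sum(y * w for _, y, w in pixels) / total
    # Second central moments (intensity-weighted covariance).
    mxx = sum(w * (x - cx) ** 2 for x, _, w in pixels) / total
    myy = sum(w * (y - cy) ** 2 for _, y, w in pixels) / total
    mxy = sum(w * (x - cx) * (y - cy) for x, y, w in pixels) / total
    # Eigenvalues of the 2x2 covariance give the major/minor axis spread.
    common = math.sqrt(((mxx - myy) / 2) ** 2 + mxy ** 2)
    major = (mxx + myy) / 2 + common
    minor = (mxx + myy) / 2 - common
    return major / minor if minor > 0 else float("inf")

def classify_contact(pixels, thumb_threshold=2.5):
    # Elongated patches (high eccentricity) are treated as thumb-like.
    ecc = patch_eccentricity(pixels)
    return "thumb" if ecc > thumb_threshold else "fingertip"
```

A compact, symmetric patch yields an eccentricity near 1 and is classified as a fingertip; a stretched patch yields a much higher ratio and is flagged as a thumb.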