Nope, no Apple VR goggles on Sept. 9 (but I have a suggestion for 'iGlasses')

In a Seeking Alpha article, analyst Mark Hibben ponders whether Apple might unveil VR goggles at its Sept. 9 media event. I'm 99.9% sure that's not going to happen. If/when that day comes, I have some suggestions for Tim Cook & Company.

"Apple is known to have been hiring engineers to work in the [VR] field," Hibben writes. "Apple is also known to have obtained a patent for VR goggles that would use an iPhone as the display unit."

Should Apple ever tackle its own version of Google Glass there might be a market. Eventually. And, as mentioned, I have a suggestion to really make 'em special. (Apple would probably call such a product "Apple Glasses," but I'm going with "iGlasses" because it's just more fun to write.)

The Gartner research group says that, although the adoption of augmented reality (AR) in the enterprise is still in its infancy, AR technology has matured to a point where organizations can use it as an internal tool to complement and enhance business processes, workflows and employee training. Gartner thinks that AR facilitates business innovation by enabling real-time decision making through virtual prototyping and visualization of content.

"Augmented reality is the real-time use of information in the form of text, graphics, audio and other virtual enhancements integrated with real-world objects," according to Tuong Huy Nguyen, principal research analyst at Gartner. "AR leverages and optimizes the use of other technologies such as mobility, location, 3D content management and imaging and recognition. It is especially useful in the mobile environment because it enhances the user's senses via digital instruments to allow faster responses or decision-making."

He says that AR is particularly powerful for:

° Discovering things in the vicinity — for example, enclosed objects generating heat;

° Presenting real-world objects of potential special interest — for example, detecting and highlighting objects generating higher than normal levels of radiation;

° Showing a user where to go or what to do — for example, helping a worker make a repair in a hazardous environment where visibility is low;

° Providing additional information about an object of interest — for example, distance, size or level of danger.

AR services use various device sensors to identify the user's surroundings. Current implementations generally fall into one of two categories — location-based or computer vision. Location-based offerings use a device's motion sensors to provide information based on a user's location. Computer-vision-based services use facial, object and motion tracking algorithms to identify images and objects. Examples include identifying a shoe among numerous objects on a table, Google Goggles (image-based search) and optical character recognition (OCR).
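To make the distinction between the two categories concrete, here's a minimal sketch in Python. Every name, data structure and threshold below is my own invention for illustration — this isn't the API of any real AR framework, and the "computer vision" part is a trivial stand-in for real tracking algorithms:

```python
# Location-based AR: use position data from the device's sensors to
# find nearby points of interest (POIs) to overlay on the camera view.
def locate_points_of_interest(latitude, longitude, poi_database):
    nearby = []
    for poi in poi_database:
        # Naive proximity check; a real service would compute geodesic
        # distance and also use the compass heading to place overlays.
        if abs(poi["lat"] - latitude) < 0.01 and abs(poi["lon"] - longitude) < 0.01:
            nearby.append(poi["name"])
    return nearby

# Computer-vision AR: match what the camera sees against known object
# signatures. Here a "frame" is just a string standing in for pixels.
def recognize_objects(camera_frame, known_objects):
    return [name for name, signature in known_objects.items()
            if signature in camera_frame]

pois = [{"name": "Apple Store", "lat": 37.332, "lon": -122.005},
        {"name": "Coffee Shop", "lat": 37.500, "lon": -122.300}]
print(locate_points_of_interest(37.331, -122.004, pois))      # ['Apple Store']
print(recognize_objects("table shoe cup", {"shoe": "shoe", "hat": "hat"}))  # ['shoe']
```

The point of the sketch is simply that the first path is driven by *where you are* (sensor readings against a database), while the second is driven by *what the camera sees* (matching against learned object signatures).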

"AR is most useful as a tool in industries where workers are either in the field, do not have immediate access to information, or jobs that require one or both hands and the operator's attention," adds Nguyen. "As such, the impact on weightless industries is lower because these employees often have constant and direct access to the information they need (such as knowledge workers)."

Of course, Apple isn't going to release any type of smart glasses until it has a device that's going to catch on with general consumers. As for me, if Apple released iGlasses designed not just for AR but also for vision problems, I'm in.

For example, how about smart glasses for folks like me with mild vision problems that would automatically adjust to what your eyes are focusing on? In other words, they'd automatically adjust for near vision when reading, intermediate vision when working on a computer, and distance vision when looking at things, well, in the distance.

Such technology already exists in its infancy. Imagine this and AR combined in iGlasses.