Embedded vision technology, while still in its early stages, has seen significant advances recently. Many of these advances promise to radically transform the way people interact with their environments, in large part by giving consumers unprecedented, continuous access to information relevant to their daily lives. Inherent in that continuous access, however, is enhanced visibility of individual consumers' daily activities to corporations and even governments. The implications of this impending dynamic are as formidable for the marketer as they are potentially menacing for the private individual.
Embedded vision technology refers to computing systems that can process visual data: machines that "see." The applications of this technology are extremely broad. Within the next five years, embedded vision systems are likely to play growing roles in sectors as diverse as education, entertainment, games, healthcare, marketing, retail, surveillance, and transportation. Within the next 15 years, embedded vision systems will become as ubiquitous as wireless data systems are today.
Until recently, the cost of this technology was a significant obstacle to deployment. That began to change about a year ago, when Microsoft released the Kinect for Windows SDK for commercial use, giving systems designers and developers robust embedded vision technology for roughly $150. The most popular consumer use of Kinect is, of course, on the Xbox, where it powers games and entertainment. At about the same time, HP began providing brand marketers with its Aurasma technology for free in order to grow the market for augmented reality implementations. As embedded vision technology has become more widely accessible, more companies and non-commercial institutions have begun to experiment with its diverse applications.
The applications most relevant to marketers, and already deployed in numerous initiatives, are as follows:
Augmented Reality - Augmented Reality applications use embedded vision systems in mobile devices, such as smartphones and tablet PCs, to generate rich user experiences from specific visual cues in the user's real environment. In one famous example, Shark Watches partnered with Aurasma to create a rich interactive ad on a mobile device triggered by a print ad in Surfer magazine. A demonstration can be viewed here. In a more recent example, Bloomingdale's has partnered with FaceCake to launch virtual dressing rooms in select stores. These are out-of-home display units that utilize FaceCake's Swivel technology to allow consumers to see how they would look in various products (e.g., apparel and cosmetic products) without actually trying them on. Swivel is also available for use in online retail.
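At the core of campaigns like these is a simple lookup: a brand registers a real-world image as a trigger, and when a device camera recognizes that image, the matching overlay content is served. The sketch below illustrates that pattern; it is a hypothetical illustration, not Aurasma's actual API, and it substitutes a byte hash for the robust feature matching (tolerant of angle, lighting, and scale) that real systems perform.

```python
# Minimal sketch of the trigger-to-overlay lookup behind an AR campaign.
# Real platforms match robust image features; a simple content hash
# stands in for that matching step here.
import hashlib


class ARCampaign:
    def __init__(self):
        self._overlays = {}  # trigger fingerprint -> overlay content URL

    @staticmethod
    def _fingerprint(image_bytes):
        # Stand-in for feature extraction; a hash only matches exact
        # bytes, unlike a real matcher working on camera frames.
        return hashlib.sha256(image_bytes).hexdigest()

    def register_trigger(self, image_bytes, overlay_url):
        # The brand registers a print ad as a trigger for rich content.
        self._overlays[self._fingerprint(image_bytes)] = overlay_url

    def lookup(self, camera_frame_bytes):
        # A camera frame is matched against registered triggers;
        # returns the overlay URL, or None if nothing matches.
        return self._overlays.get(self._fingerprint(camera_frame_bytes))


campaign = ARCampaign()
campaign.register_trigger(b"print-ad-page", "https://example.com/interactive-ad")
print(campaign.lookup(b"print-ad-page"))   # the registered overlay URL
print(campaign.lookup(b"other-page"))      # None: no trigger recognized
```

The design point for marketers is that the trigger lives in the physical world (a magazine page, a poster) while the experience lives online, so a print placement can be updated or retargeted without reprinting anything.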
Facial Recognition - Facial recognition applications use embedded vision systems to detect user demographic and even psychographic information. Recently, Kraft Foods and Intel Corp. partnered to launch a new line of Diji-Taste vending machines, which determine the customer's age from visual cues and distribute free samples only to customers identified as adults. A demonstration of the machine is available here. In another example, Unilever and Coca-Cola partnered with Affectiva and Millward Brown to better assess the emotional impact of their advertisements; Affectiva's technology evaluates ad effectiveness based on viewers' facial expressions. While the technology may never replace consumer surveys, it will certainly reduce marketers' reliance on self-reported data in ad testing.
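Both examples reduce to simple decisions layered on top of a vision model's output. The sketch below shows those two decision patterns with stubbed model output (the vendors' actual models and thresholds are proprietary): an adult check in the spirit of the Diji-Taste kiosk, and an aggregation of per-frame expression readings in the spirit of Affectiva's ad testing. The threshold and the valence scale are assumptions for illustration.

```python
# Two decision patterns from the examples above, with stubbed vision
# output. Neither reflects the vendors' actual implementations.

ADULT_AGE = 18  # assumed threshold; the kiosk's real cutoff is not published


def should_dispense(estimated_age_years):
    # The kiosk releases a free sample only when the vision system
    # estimates the customer to be an adult.
    return estimated_age_years >= ADULT_AGE


def ad_engagement_score(frame_valences):
    # frame_valences: per-frame emotional valence readings in [-1, 1],
    # as a facial-coding model might emit while a viewer watches an ad.
    # Averaging them yields one effectiveness score for the ad.
    if not frame_valences:
        return 0.0
    return sum(frame_valences) / len(frame_valences)


print(should_dispense(34))                             # True
print(round(ad_engagement_score([0.2, 0.6, 0.4]), 2))  # 0.4
```

The appeal for marketers is that both decisions run on observed behavior rather than self-reported data; the privacy concern, equally, is that both run without the consumer filling out anything at all.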
These applications are just a sample of the coming embedded vision revolution. The next wave of developments in augmented reality provides a hint of what ubiquitous embedded vision might yield for marketers, as well as the perils it might entail for consumers. For some time now, Google has been developing augmented reality glasses: spectacles with an embedded display that supplies the wearer with additional information about his or her environment, based on visual cues the device processes using a camera. A video of Sergey Brin demonstrating the device is available here. To give an idea of how fast this technology is developing: about a year ago, a Google engineer announced that the company was looking at embedding displays in contact lenses, and less than a year later, researchers at the University of Ghent successfully embedded a multi-pixel LED display in a contact lens.
At some point in the next 10-15 years we will have simultaneous, hands-free access to all of the information on the Internet about our visible surroundings. When we meet someone, we will have instantaneous access to all of the available information about that person simply by looking at him or her. When we see brands or products, we will have concurrent access to advertising, product reviews, recommendations from social media sites, and so forth. The corollary, of course, is that in a world of ubiquitous embedded vision systems, other people, in addition to corporations and governments, will have the same immediate access to information about us.
For more information about embedded vision systems, visit the Embedded Vision Alliance web site.