New SDK offers program control through the user's scalp

At the Game Developers Conference in San Francisco, Emotiv is showing off its headset controller, the EPOC, a unit that reads cognitive actions and facial expressions.

The device works through a series of electroencephalographic (EEG) sensors that contact the wearer's scalp. Depending on the application, the sensors may be used to read emotion, facial expression, cognitive action, and the abstract category of "visualization." The device can currently recognize 30 discrete conditions. For emotions, it acts as a sort of primitive mood ring, using brainwave patterns to tell whether its wearer is excited, calm, tense, or frustrated.
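As a rough illustration of how that kind of classifier might work, here is a minimal Python sketch. The feature extraction and the mood thresholds below are invented for this example; they bear no relation to Emotiv's actual algorithms or SDK.

```python
# A minimal, hypothetical sketch of the "mood ring" idea: classify a wearer's
# state from EEG band power. This is NOT Emotiv's API; the feature extractor
# and the rule thresholds below are invented for illustration.

def band_powers(samples):
    # Stand-in feature extractor: treat even/odd samples as proxies for
    # alpha and beta band power. A real system would FFT each electrode.
    half = max(len(samples) // 2, 1)
    alpha = sum(abs(s) for s in samples[::2]) / half
    beta = sum(abs(s) for s in samples[1::2]) / half
    return alpha, beta

def classify_mood(alpha, beta):
    # Toy rules: beta dominance suggests arousal, alpha dominance calm.
    if beta > alpha * 1.5:
        return "excited" if beta > 1.0 else "tense"
    if alpha > beta * 1.5:
        return "calm"
    return "frustrated"

print(classify_mood(*band_powers([0.9, 0.2, 1.1, 0.3, 0.8, 0.25])))  # calm
```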

It can also detect a number of facial expressions, tracking the wearer's eye position along with the set of the mouth and brows. Demos show the device linked to an avatar whose facial expression changes to match the wearer's. The position of the user's head can likewise be mapped to control-pad directions or on-screen perspective changes, as in the sketch below.
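A head-to-D-pad mapping of the kind those demos describe could look something like the following. The yaw/pitch inputs, the dead-zone value, and the function name are all assumptions made for illustration, not anything from Emotiv's SDK.

```python
# Hypothetical sketch: translate head attitude into control-pad directions.
# The dead zone absorbs small, unintentional head movements.

DEAD_ZONE = 10.0  # degrees of head movement ignored as noise (assumed value)

def head_to_dpad(yaw_deg: float, pitch_deg: float) -> set[str]:
    """Map head orientation to a set of D-pad directions."""
    directions = set()
    if yaw_deg > DEAD_ZONE:
        directions.add("right")
    elif yaw_deg < -DEAD_ZONE:
        directions.add("left")
    if pitch_deg > DEAD_ZONE:
        directions.add("up")
    elif pitch_deg < -DEAD_ZONE:
        directions.add("down")
    return directions

print(head_to_dpad(25.0, -14.0))  # -> {'right', 'down'}
```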

The brainwave pattern-matching feature is described, somewhat loosely, as the user manipulating on-screen images by "thinking." Here, the headset captures a set of brainwave data and assigns an on-screen action to it. Users are invited to assume any state of mind, then asked to re-create that state to make an image appear on screen.
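That train-then-trigger loop can be pictured with a short sketch: record a feature-vector "signature" while the user holds a mental state, then fire the assigned action whenever a live reading comes close enough to it. The distance metric and threshold here are illustrative assumptions, not Emotiv's method.

```python
# Hypothetical sketch of the train-then-trigger loop described above.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class ThoughtTrigger:
    def __init__(self, threshold: float = 0.5):
        self.signature = None      # trained feature vector
        self.threshold = threshold # assumed match tolerance

    def train(self, readings):
        """Average several feature vectors captured during training."""
        n = len(readings)
        self.signature = [sum(col) / n for col in zip(*readings)]

    def matches(self, reading) -> bool:
        """True when a live reading re-creates the trained state."""
        return euclidean(reading, self.signature) < self.threshold

trigger = ThoughtTrigger()
trigger.train([[0.9, 0.1, 0.4], [1.0, 0.2, 0.5], [0.8, 0.15, 0.45]])
if trigger.matches([0.92, 0.12, 0.44]):
    print("show image")  # the on-screen action assigned to this state
```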

Emotiv EPOC headset controller

Emotiv has launched two SDKs to support the headset: a freely downloadable version called Emotiv SDKLite, and a commercially available kit first announced at last year's GDC. Emotiv expects the headset to ship by Christmas for $299.

Emotiv is not the first hardware developer to cross this territory. NeuroSky has been working on similar technologies since 2004. Last week, that company announced the commercial availability of its biofeedback headset, the MindSet, which has had an SDK available since mid-2007. NeuroSky also has partnerships with Japan's Sega Toys and the French company Musinaut.

The "brainwave" interface has existed for well over twenty years, more or less as a gimmick in the video game community. Back in 1984, the Atari 2600's Mindlink Controller -- which never saw commercial release -- utilized infra-red sensors on the user's head to control in-game motion.

In the scientific field, a DARPA-funded research project at Columbia University is working on a Visual Computer Interface (VCI) that uses electroencephalographic (EEG) readings to match photographic images with the brainwaves of the viewer looking at them. The project cites both Emotiv's and NeuroSky's efforts as major contributions to human-computer interaction (HCI) research.
