AlterEgo – MIT Media Lab

MIT Media Lab is developing AlterEgo, a non-invasive, wearable, peripheral neural interface that allows humans to converse in natural language with machines, artificial intelligence assistants, services, and other people without voice – without opening their mouths, and without any externally observable movement – simply by articulating words internally.

Feedback to the user is delivered as audio via bone conduction, without disrupting the user’s usual auditory perception, which makes the interface closed-loop. This enables a human-computer interaction that is subjectively experienced as completely internal to the user – like speaking to oneself. The AlterEgo wearable system captures peripheral neural signals when the internal speech articulators are volitionally and neurologically activated during a user’s internal articulation of words. This allows a user to transmit and receive streams of information to and from a computing device or another person without any observable action, discreetly, without unplugging the user from her environment, and without invading the user’s privacy.
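The closed loop described above – silent articulation in, bone-conduction audio out – can be pictured as a simple processing pipeline. The sketch below is purely illustrative: the components (SignalFrame, WordDecoder, BoneConductionOutput) are hypothetical placeholders and do not represent the published AlterEgo system; the sketch only shows the general shape of such a loop.

```python
# Illustrative closed-loop pipeline (hypothetical; not the AlterEgo implementation).
# Stage 1: capture frames of peripheral neuromuscular signals from surface electrodes.
# Stage 2: decode each frame into an internally articulated word.
# Stage 3: return the response as audio over a bone-conduction transducer.

from dataclasses import dataclass
from typing import List


@dataclass
class SignalFrame:
    """One window of samples from the wearable's surface electrodes."""
    samples: List[float]


class WordDecoder:
    """Placeholder decoder; a real system would use a trained classifier."""

    def decode(self, frame: SignalFrame) -> str:
        # Toy rule standing in for a learned model over filtered electrode data.
        energy = sum(s * s for s in frame.samples)
        return "yes" if energy > 1.0 else "no"


class BoneConductionOutput:
    """Placeholder for the private audio channel back to the user."""

    def speak(self, text: str) -> None:
        print(f"[bone-conduction audio] {text}")


def closed_loop(frames: List[SignalFrame]) -> None:
    """Silent input in, private audio out: the loop stays internal to the user."""
    decoder = WordDecoder()
    output = BoneConductionOutput()
    for frame in frames:
        word = decoder.decode(frame)            # interpret the internal articulation
        response = f"you articulated: {word}"   # a real system would query an assistant here
        output.speak(response)                  # reply without occluding the ears


if __name__ == "__main__":
    closed_loop([SignalFrame([0.1, 0.2]), SignalFrame([1.5, 1.2])])
```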


