We've moved a long way from the keyboard-and-mouse-only computer interface. Touchscreens, speech recognition, and even typing systems that track eye and muscle movements all aid in interaction. But what if we could literally transmit our thoughts to a computer interface, without any sort of implanted chip? Intel's Human Brain project, a collaboration with Carnegie Mellon University and the University of Pittsburgh, is attempting to do just that.
The ambitious project (see our report from last year) uses EEG, fMRI, and magnetoencephalography to deduce what a subject is thinking about from their pattern of neural activity. The process is still fairly primitive: it only works with concrete nouns within a 1,000-word vocabulary, and it can only tell the difference between two nouns at a time. In other words, the algorithm can't yet deduce on its own that a user is thinking of the word "arm," but it can figure out whether the user is thinking of "arm" or "shirt" (or any other pair of nouns from the vocabulary). Intel tells us that the algorithm is accurate 9 out of 10 times.
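To make the pairwise decoding idea concrete, here is a minimal sketch. It is purely illustrative: the actual Intel/CMU/Pitt models and brain-imaging features are not described in this post, so the nearest-centroid classifier, the 20-dimensional synthetic "activity patterns," and the noise level below are all assumptions of ours, not the project's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_pattern(centroid, noise=0.5):
    """Simulate one noisy neural-activity reading around a noun's average pattern."""
    return centroid + noise * rng.standard_normal(centroid.shape)

def pairwise_decode(pattern, centroid_a, centroid_b):
    """Pairwise decision: return 0 if the pattern is closer to noun A, else 1."""
    dist_a = np.linalg.norm(pattern - centroid_a)
    dist_b = np.linalg.norm(pattern - centroid_b)
    return 0 if dist_a < dist_b else 1

# Two made-up "average activity" vectors standing in for the nouns "arm" and "shirt".
arm = rng.standard_normal(20)
shirt = rng.standard_normal(20)

# Decode 100 simulated trials and count how many pairwise choices are correct.
trials = [(make_pattern(arm), 0) for _ in range(50)] + \
         [(make_pattern(shirt), 1) for _ in range(50)]
correct = sum(pairwise_decode(p, arm, shirt) == label for p, label in trials)
print(f"pairwise accuracy: {correct}/100")
```

Note what the sketch shares with the described system: it never answers "which of 1,000 nouns is this?" on its own; it only picks the better of two given candidates, which is a far easier problem.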