University of Portsmouth researcher Paul Gnanayutham is working to create an inexpensive, easy-to-use interface that lets a computer read, interpret and display a person's thoughts and feelings based on eye movements, facial-muscle activity and/or brain waves.
People suffering from physically debilitating illnesses such as amyotrophic lateral sclerosis (also known as Lou Gehrig's disease) and traumatic brain injuries often find themselves trapped inside their own bodies, unable to speak, gesture or otherwise communicate with the outside world. Scientists have shown they can create computer interfaces that sense, interpret and display a locked-in person's brain waves, eye movements or facial expressions, but the challenge has been to find cost-effective ways of harnessing this technology for consumer use.