Using Neural Signals to Drive Touch Screen Devices
Principal Investigator: Howard Chizeck
Millions of Americans suffer from impaired upper extremity function due to neurological or musculoskeletal disorders. Stroke, cerebral palsy, multiple sclerosis, Parkinson's disease, and osteoarthritis are just a few examples of disorders that can affect upper extremity function, severely limiting independence and activities of daily living. While advances in technology have enabled the development of touch screens for mobile phones, computers, and tablets, interaction with these devices still requires upper extremity control and manual dexterity. Gesture-based interaction systems are also gaining popularity, but they too fail to meet the needs of many Americans with limited upper extremity function.
UW inventors have developed a method for neural control of touch screen devices. This invention uses electromyographic (EMG) signals recorded from skin surface electrodes to mimic finger movements on touch screens. These signals can be recorded from any muscles the user can control, allowing users with limited arm or hand function to operate devices using other muscles (e.g., head or leg muscles). The signals are translated into touch signals and transmitted to wireless devices via Bluetooth or Wi-Fi, allowing the user to operate touch-based applications. The system can also be placed in a discreet location, such as a sleeve or cuff, to improve its aesthetic appeal.
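As a rough illustration of the signal path described above, the sketch below thresholds a rectified, smoothed EMG envelope and treats each rising edge as one intended "tap" event. The sample rate, threshold value, and all function names are illustrative assumptions, not details of the UW invention.

```python
import numpy as np

FS = 1000          # sample rate in Hz (assumed)
THRESHOLD = 0.5    # activation threshold on the EMG envelope (assumed)

def emg_envelope(raw, window=100):
    """Rectify the raw EMG and smooth it with a moving average."""
    rectified = np.abs(raw)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def detect_taps(envelope, threshold=THRESHOLD):
    """Return sample indices where the envelope rises through the threshold.

    Each rising edge is treated as one intended touch ("tap") event,
    which a downstream stage could transmit as a synthetic touch.
    """
    above = envelope > threshold
    rising = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return rising.tolist()

# Example: one second of quiet baseline, a short contraction burst, then rest.
rng = np.random.default_rng(0)
signal = np.concatenate([
    0.05 * rng.standard_normal(FS),        # rest
    1.0 * rng.standard_normal(FS // 4),    # muscle contraction burst
    0.05 * rng.standard_normal(FS),        # rest
])
taps = detect_taps(emg_envelope(signal))   # one tap near the burst onset
```

In a full system, each detected event would be mapped to a touch coordinate and sent over Bluetooth or Wi-Fi; real designs typically add debouncing and per-user thresholds on top of a simple detector like this.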
• A method of receiving and translating EMG signals into touch signals for the operation of touch-enabled devices
• A method for identifying intended touch events based on EMG activity
• A method for calibrating an EMG-controlled device
• An EMG device consisting of EMG electrodes, a processor, an amplifier, and filters, capable of recording and filtering EMG signals, translating them into touch signals, and transmitting them wirelessly to a touch-enabled device
Advantages
• Enables use of touch screen technology by people with limited upper extremity function
• Allows for customizable configurations based on a person’s neurological or musculoskeletal function
• Translates raw neural signals into touch signals before transmission, reducing neural-data security concerns and enabling integration across multiple touch-enabled applications
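The calibration method listed above could, for instance, record a resting baseline and a comfortable contraction and place the activation threshold between them. This is a minimal sketch under that assumption; the function name and the 40% placement rule are illustrative, not part of the UW invention.

```python
import statistics

def calibrate_threshold(rest_samples, contraction_samples, fraction=0.4):
    """Place the activation threshold `fraction` of the way from the
    user's resting EMG level to their contraction level."""
    rest_level = statistics.mean(abs(s) for s in rest_samples)
    active_level = statistics.mean(abs(s) for s in contraction_samples)
    if active_level <= rest_level:
        raise ValueError("contraction must exceed rest; re-record calibration")
    return rest_level + fraction * (active_level - rest_level)

# Example: rest around 0.05, contraction around 0.8 (arbitrary units).
threshold = calibrate_threshold([0.04, 0.06, 0.05], [0.7, 0.9, 0.8])
```

Per-user calibration of this kind is what would let the system adapt to whichever muscle group (head, leg, etc.) a given user can control.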
For more info, contact: Ryan Buckmaster