
Using Neural Signals to Drive Touch Screen Devices

This technology uses an electromyography (EMG) device to acquire neural signals, translates them into the equivalent of finger movements, and decodes those inputs as gestures for mobile devices. It is intended for people whose hand function is limited by neurological or musculoskeletal disease, including stroke survivors.
Technology No. BDP 8679
What is the Problem?

For millions of Americans, the ability to effectively use their hands is limited by neurological or musculoskeletal disease. For example, over 6 million Americans are stroke survivors, and even highly recovered stroke survivors often have significant residual disability in hand function, impairing their ability to perform basic and instrumental activities of daily living. Devices that enable improved functioning for individuals with limited hand use could therefore benefit a large proportion of the population.

What is the Solution?

This invention enables people who have limited finger dexterity to use smartphones, tablets, and other mobile devices with a touch screen interface. This is done by acquiring neural signals from an electromyography device, calibrating and processing the resulting signals, translating them into the equivalent of finger movements, and then broadcasting them to a smartphone or tablet over Bluetooth. This allows the user to interact with the device's standard functions and apps through an interface programmed within the mobile device.
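A minimal sketch of the kind of pipeline described above is shown below, assuming a single EMG channel, a 1 kHz sampling rate, and threshold-based detection of intentional muscle activity. The function names, filter settings, and threshold are illustrative assumptions, not the actual implementation; a real system would forward the detected events over a Bluetooth link rather than printing them.

```python
# Hypothetical sketch of the described pipeline: filter raw EMG, extract an
# amplitude envelope, and threshold it into discrete "tap" events that a
# separate transport layer (e.g., a Bluetooth connection to the phone) could
# forward. All names and parameter values here are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000            # assumed EMG sampling rate, Hz
TAP_THRESHOLD = 0.2  # envelope level treated as an intentional contraction

def bandpass(emg, low=20.0, high=450.0, fs=FS, order=4):
    """Keep the typical surface-EMG band and reject drift and artifact."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, emg)

def envelope(emg, window_ms=100, fs=FS):
    """Rectify and smooth to obtain a slowly varying activation level."""
    win = int(window_ms * fs / 1000)
    return np.convolve(np.abs(emg), np.ones(win) / win, mode="same")

def detect_taps(env, threshold=TAP_THRESHOLD):
    """Return sample indices where the envelope crosses the threshold upward."""
    above = env > threshold
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1

if __name__ == "__main__":
    # Synthetic one-channel recording: noise plus two brief "contractions".
    t = np.arange(0, 3.0, 1 / FS)
    emg = 0.05 * np.random.randn(t.size)
    for start in (0.5, 1.8):
        burst = (t > start) & (t < start + 0.2)
        emg[burst] += 0.6 * np.sin(2 * np.pi * 80 * t[burst])

    env = envelope(bandpass(emg))
    for idx in detect_taps(env):
        # A real device would broadcast this event to the phone over Bluetooth;
        # here we simply report it.
        print(f"tap event at t = {t[idx]:.2f} s")
```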

What Differentiates it from Solutions Available Today?

Gestures can be used as control inputs for human-computer interaction. Typical gesture-based systems try to decode the full gesture being performed by the user. This approach has a natural learning curve, since the user simply performs hand or body gestures and the device does the heavy lifting of interpreting the motion. However, decoding gestures is technically complex, often requiring a relatively large number of electromyography (EMG) electrodes. Further, gestures may not map directly to the input signals used by mobile devices, and gestures are not consistent across populations of users with varying degrees of motor ability. This method instead translates neural signals directly into finger motion, eliminating errors that can occur with the gesture-based approach. A simple sketch of this contrast follows.
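To illustrate the distinction, the sketch below maps each EMG channel's activation level directly to a single touch action, with no trained gesture classifier in between. The channel assignments, threshold, and action names are hypothetical examples, not the inventors' actual mapping.

```python
# Illustrative contrast with full gesture decoding: rather than feeding many
# channels into a gesture recognizer, each calibrated channel is mapped
# directly to one touch-screen action. All values below are assumptions.
import numpy as np

# Hypothetical mapping from EMG channel index to a touch-screen action.
CHANNEL_ACTIONS = {0: "tap", 1: "swipe_left", 2: "swipe_right"}
ACTIVATION_THRESHOLD = 0.3

def direct_translate(envelopes):
    """envelopes: per-channel smoothed activation levels for one time step.

    Returns the action for the most active channel above threshold, or None.
    No gesture classifier is involved: the user only needs to produce activity
    on the calibrated channel, not a fully formed, recognizable gesture.
    """
    ch = int(np.argmax(envelopes))
    if envelopes[ch] < ACTIVATION_THRESHOLD:
        return None
    return CHANNEL_ACTIONS.get(ch)

print(direct_translate(np.array([0.05, 0.45, 0.10])))  # -> "swipe_left"
```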

  • Authors (1)
    Howard Chizeck
  • Supporting documents (1)
    Product brochure: Using Neural Signals to Drive Touch Screen Devices.pdf