Developing A Hands-Free Human-Computer Interface That Senses Tooth/Jaw-Movements With Headphones

Authors

  • Krishna Jha, Academies of Loudoun
  • Prahas Gantaram
  • Ethan Kantz

DOI:

https://doi.org/10.47611/jsr.v11i4.1792

Keywords:

Headphone HCI, hands-free HCI, human-computer interaction

Abstract

People with motor disabilities, such as amputated limbs, are often unable to use computers in the traditional way. Instead, they rely on alternative systems that sense facial movements, send the signals from the sensor to an interface, and finally translate those signals into computer commands. The goal of this research is to create a hands-free Human-Computer Interface (HCI) that detects sounds near the ears produced by tooth and jaw movements (e.g., clicking or grinding) and translates them into computer commands. Because it does not require the use of one's hands, our research aims to make computers accessible to a wider range of people. The novelty of the proposed HCI lies in how it gathers input: it receives signals strictly from headphones, making it inexpensive and user-friendly. Our goal is to create a proof-of-principle setup to show the feasibility of this HCI. To achieve this, a pair of headphones was wired to an M-Audio amplifier, and the computer's sound settings were configured so that both the speaker and the microphone used the M-Audio amplifier. A script written in Octave plotted the signals recorded through the headphones via the amplifier. While the script was running, facial movements performed by the user, such as a tooth click or jaw opening, generated audio signals that were recorded by the computer. The script then mapped the frequency distribution of the recorded waves. Our work so far shows that headphones can indeed capture the inputs of different facial movements.
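To illustrate the recording and frequency-mapping step described above, the sketch below shows one way such a script could look in Octave. This is a minimal sketch, not the authors' published script: it assumes the M-Audio interface is set as the computer's default recording device, and the sampling rate (44.1 kHz) and the two-second capture window are illustrative choices.

% Minimal sketch: capture audio from the headphones (acting as a
% microphone through the M-Audio interface) and plot the frequency
% distribution of the recorded wave.
fs = 44100;                       % sampling rate in Hz (illustrative)
rec = audiorecorder (fs, 16, 1);  % 16-bit, single channel, default input device

disp ("Perform a facial movement, e.g., a tooth click...");
recordblocking (rec, 2);          % record for 2 seconds
x = getaudiodata (rec);           % recorded samples as a column vector

% Map the frequency distribution of the recorded wave with the FFT.
N = numel (x);
X = abs (fft (x)) / N;            % normalized magnitude spectrum
f = (0:N-1) * fs / N;             % frequency axis in Hz
half = 1:floor (N/2);             % keep the one-sided spectrum (below Nyquist)

plot (f(half), X(half));
xlabel ("Frequency (Hz)");
ylabel ("Magnitude");
title ("Frequency distribution of the recorded signal");

In such a setup, different facial movements (e.g., a tooth click versus a jaw opening) would appear as distinct spectral signatures in the resulting plot, which is what would allow them to be distinguished and mapped to computer commands.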

Published

03-09-2023

How to Cite

Jha, K., Gantaram, P., & Kantz, E. (2023). Developing A Hands-Free Human-Computer Interface That Senses Tooth/Jaw-Movements With Headphones. Journal of Student Research, 11(4). https://doi.org/10.47611/jsr.v11i4.1792

Issue

Vol. 11 No. 4

Section

Research Projects