11.09 Brain-To-Sound Computer Interfaces: Neurofeedback of Music for Entrainment, Interaction and Neurorehabilitation
In cognitive neuroscience, sound sequences serve as abstracted models for studying temporal and sensorimotor processing in individuals and in multi-agent interactions. Musical stimuli in particular permit the study of perception-action coupling as well as of joint action and entrainment via reciprocal prediction and adaptation. In engineering, Brain-Computer Interfaces (BCI) and neurofeedback applications have been developed to provide patients with alternative pathways of communication and interaction, as well as with innovative neurorehabilitation treatment protocols. Bridging these two fields, the goal of this project is the interdisciplinary development of a Brain-Computer Interface platform that allows human subjects to interact directly and continuously with synthesized sound and music stimuli. Sound synthesis will be informed by the contemporary understanding of how specific parameters (e.g. rhythm, harmonics, and timbre) contribute to entrainment. The platform will thus allow the human brain to be studied in closed loop, for example the neural dynamics underlying specific aspects of entrainment such as temporal prediction and anticipation, synchronization, and adaptation. Beyond the technical development of the BCI itself, an important part of the project will be its validation with respect to these research questions in a series of studies. The project is expected to contribute to human-machine interaction as well as to neurorehabilitation.
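The closed-loop principle can be illustrated with a minimal sketch: EEG band power is estimated over short windows and mapped onto sound-synthesis parameters such as tempo and spectral brightness, which in turn shape the stimulus the listener hears next. The sketch below is purely illustrative and is not the project's implementation; the band choices, the mapping function, and the simulated EEG input are assumptions made for the example.

"""
Minimal conceptual sketch of a closed-loop brain-to-sound interface.
Not the project's implementation: all names, mappings, and parameters
are illustrative assumptions. EEG is simulated with noise; a real
system would stream data from an acquisition device and render audio
in a dedicated synthesis engine.
"""

import numpy as np

FS_EEG = 250       # assumed EEG sampling rate (Hz)
FS_AUDIO = 22050   # audio sampling rate (Hz)
WINDOW_S = 1.0     # analysis window length (s)


def bandpower(eeg, fs, band):
    """Average spectral power of `eeg` within `band` (Hz) via a periodogram."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()


def map_to_sound_params(alpha, beta):
    """Illustrative mapping: relative alpha -> tempo, beta -> brightness."""
    ratio = alpha / (alpha + beta + 1e-12)
    tempo_bpm = 60.0 + 60.0 * (1.0 - ratio)   # calmer state -> slower tempo
    brightness = 1.0 + 4.0 * (1.0 - ratio)    # more harmonics otherwise
    return tempo_bpm, brightness


def synthesize_beat(tempo_bpm, brightness, fs=FS_AUDIO):
    """Render one beat: a 220 Hz tone plus `brightness`-weighted harmonics."""
    dur = 60.0 / tempo_bpm
    t = np.arange(int(dur * fs)) / fs
    tone = np.zeros_like(t)
    for k in range(1, int(round(brightness)) + 1):
        tone += np.sin(2 * np.pi * 220.0 * k * t) / k
    return tone * np.exp(-3.0 * t / dur)      # simple decay envelope


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for step in range(5):                                     # a few loop iterations
        eeg = rng.standard_normal(int(FS_EEG * WINDOW_S))     # stand-in for real EEG
        alpha = bandpower(eeg, FS_EEG, (8, 13))
        beta = bandpower(eeg, FS_EEG, (13, 30))
        tempo, brightness = map_to_sound_params(alpha, beta)
        beat = synthesize_beat(tempo, brightness)
        print(f"step {step}: tempo={tempo:5.1f} bpm, "
              f"harmonics={int(round(brightness))}, samples={len(beat)}")

In an actual experiment, the simulated signal would be replaced by streamed EEG and the offline rendering by a real-time synthesis engine, so that each analysis window immediately influences the ongoing auditory feedback.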
Publications
Cheng G., Ehrlich S.K., Lebedev M., Nicolelis M.A.L. (2020): "Neuroengineering challenges of fusing robotics and neuroscience", Science Robotics, 5, 49, eabd1911, DOI: 10.1126/scirobotics.abd1911
Ehrlich S.K., Cheng G. (2019): "A computational model of human decision making for assessment of co-adaptation in neuro-adaptive human-robot interaction", 2019 International Conference on Systems, Man, and Cybernetics (SMC), 274-281, IEEE, DOI: 10.1109/SMC.2019.8913872
Ehrlich S.K., Agres K.R., Guan C., Cheng G. (2019): "A closed-loop, music-based brain-computer interface for emotion mediation", PLoS ONE, 14, 3, e0213516, DOI: 10.1371/journal.pone.0213516
Ehrlich S., Guan C., Cheng G. (2017): "A closed-loop Brain-Computer Music Interface for continuous affective interaction", 2017 International Conference on Orange Technologies (ICOT), 176-179, IEEE, DOI: 10.1109/ICOT.2017.8336116
Ehrlich S., Alves-Pinto A., Lampe R., Cheng G. (2017): "A simple and practical sensorimotor EEG device for recording in patients with special needs", NEUROTECHNIX 2017 - Symposium on Cognitive Neural Engineering (CogNeuroEng 2017), DOI: 10.5220/0006559100730079
Team
Project team leader: Stefan Ehrlich, Institute for Cognitive Systems
Doctoral researcher: Alireza Malekmohammadi, Institute for Cognitive Systems
Doctoral researcher: Jessica Jacobs, Georgetown University
Doctoral researcher: Constantin Uhde, Institute for Cognitive Systems
Principal investigator: Dr. Jessica Phillips-Silver, Georgetown University Medical Center
Principal investigator: Prof. Josef Rauschecker, Georgetown University Medical Center