This project advances non-invasive brain-computer interfaces (BCIs) for communication in adults with severe speech and physical impairments (SSPI) due to neurodegenerative disease. Researchers will optimize and adapt BCI signal acquisition, signal processing, natural language processing, and clinical implementation. BCI-FIT relies on active inference and transfer learning to customize a fully adaptive intent estimation classifier to each user's multi-modal signals.

The three specific aims are: (1) develop and evaluate methods for online, robust adaptation of multi-modal signal models to infer user intent; (2) develop and evaluate methods for efficient user intent inference through active querying; and (3) integrate partner- and environment-supported language interaction and letter/word supplementation as input modalities.

The same four dependent variables are measured in each specific aim (SA): typing speed, typing accuracy, information transfer rate (ITR), and user experience (UX) feedback. Four alternating-treatments single-case experimental designs will test hypotheses about optimizing user performance and technology performance for each aim. Tasks include copy-spelling with BCI-FIT to explore the effects of multi-modal access method configurations (SA1.3a), adaptive signal modeling (SA1.3b), and active querying (SA2.2), and story retell to examine the effects of language model enhancements (SA3.4). Five people with SSPI will be recruited for each study; control participants will be recruited for the experiments in SA2.2 and SA3.4.

Study hypotheses are: (SA1.3a) A customized BCI-FIT configuration based on multi-modal input will improve typing accuracy on a copy-spelling task compared to the standard P300 matrix speller. (SA1.3b) Adaptive signal modeling will allow people with SSPI to type accurately during a copy-spelling task with BCI-FIT without training a new model before each use. (SA2.2) Either of two adaptive querying methods will improve BCI-FIT typing accuracy for users with moderate AUC scores.
(SA3.4) Language model enhancements, including a combination of partner and environmental input and word completion during typing, will improve typing performance with BCI-FIT, as measured by ITR during a story-retell task. Optimized recommendations for a multi-modal BCI for each end user will be established, based on an innovative combination of clinical expertise, user feedback, customized multi-modal sensor fusion, and reinforcement learning.
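The multi-modal intent estimation described above can be illustrated with a minimal sketch, assuming a simple recursive Bayesian fusion scheme: a language-model prior over candidate symbols is combined with per-modality evidence under a conditional-independence assumption. This is an illustration only, not the project's actual classifier; the function name, modality set, and probability values are hypothetical.

```python
import math

def fuse_intent(lm_prior, modality_likelihoods):
    """Posterior over candidate symbols, combining a language-model prior
    with likelihoods from several access modalities (e.g. EEG, eye gaze),
    assuming the modalities are conditionally independent given the intent.

    Illustrative sketch only; not the project's actual classifier.
    """
    log_post = {}
    for symbol, prior in lm_prior.items():
        score = math.log(prior)
        for likes in modality_likelihoods:
            # Floor missing/zero likelihoods to keep the log finite.
            score += math.log(likes.get(symbol, 1e-9))
        log_post[symbol] = score
    # Normalize log scores back to a probability distribution.
    m = max(log_post.values())
    total = sum(math.exp(s - m) for s in log_post.values())
    return {sym: math.exp(s - m) / total for sym, s in log_post.items()}

# Hypothetical example: the language model favors "a", but EEG evidence
# strongly favors "b", so the fused posterior shifts toward "b".
lm = {"a": 0.5, "b": 0.3, "c": 0.2}
eeg = {"a": 0.1, "b": 0.8, "c": 0.1}
posterior = fuse_intent(lm, [eeg])
```

In a fused configuration, additional modalities (e.g. a gaze likelihood) would simply be appended to the `modality_likelihoods` list, which is the sense in which such a classifier can treat each user's available signals as interchangeable evidence sources.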
Typing Accuracy
Timeframe: 12 data collection sessions over 12 weeks (1 session/week) to assess change
Typing Speed
Timeframe: 12 data collection sessions over 12 weeks (1 session/week) to assess change
Information Transfer Rate (ITR)
Timeframe: 12 data collection sessions over 12 weeks (1 session/week) to assess change
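The registry entry does not specify how ITR is computed; the standard Wolpaw formula is the common choice in BCI typing studies, and a minimal sketch of it is shown below (function name and example values are illustrative).

```python
import math

def wolpaw_itr(n_targets: int, accuracy: float, selections_per_min: float) -> float:
    """Wolpaw information transfer rate in bits/min.

    Bits per selection: log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)),
    where N is the number of selectable targets and P the selection accuracy,
    scaled by the selection rate.
    """
    if accuracy >= 1.0:
        bits = math.log2(n_targets)       # perfect accuracy: full log2(N) bits
    elif accuracy <= 0.0:
        bits = 0.0                        # degenerate case: no information
    else:
        bits = (math.log2(n_targets)
                + accuracy * math.log2(accuracy)
                + (1 - accuracy) * math.log2((1 - accuracy) / (n_targets - 1)))
    return max(bits, 0.0) * selections_per_min

# Illustrative values: a 28-target speller at 90% accuracy, 5 selections/min.
itr = wolpaw_itr(28, 0.9, 5.0)
```

At chance-level accuracy (P = 1/N) the formula yields zero bits, which is why ITR complements raw typing speed and accuracy as an outcome measure.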
User Experience (UX)
Timeframe: 12 data collection sessions over 12 weeks (1 session/week) to assess change