
Exploring Artificial Neural Network Models for c-VEP Decoding in a Brain-Artificial Intelligence Interface



Abstract:

The Conversational Brain-Artificial Intelligence Interface (BAI) is a novel brain-computer interface (BCI) that uses artificial intelligence (AI) to help individuals with severe language impairments communicate. It translates users’ broad intentions into coherent, context-specific responses through an advanced AI conversational agent. A critical aspect of intention translation in BAI is the decoding of code-modulated visual evoked potentials (c-VEP) signals. This study evaluates five different artificial neural network (ANN) architectures for decoding c-VEP-based EEG signals in the BAI system, highlighting the efficacy of lightweight, shallow ANN models and pre-training strategies using data from other participants to enhance classification performance. These results provide valuable insights for the application of ANN models in decoding c-VEP-based EEG signals and may benefit other c-VEP-based BCI systems.
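For orientation, the sketch below shows what a lightweight, shallow ANN for windowed c-VEP EEG epochs might look like in PyTorch. It is only an illustration of the kind of compact model the abstract refers to; the layer sizes, channel count, epoch length, and class count are assumptions, not the architectures evaluated in the paper.

```python
# Illustrative sketch only: a shallow temporal + spatial convolutional network
# for c-VEP EEG classification. All hyperparameters below are assumed values.
import torch
import torch.nn as nn

class ShallowCVEPNet(nn.Module):
    def __init__(self, n_channels: int = 8, n_samples: int = 250, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution applied to each EEG channel.
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            # Depthwise spatial convolution across channels.
            nn.Conv2d(16, 16, kernel_size=(n_channels, 1), groups=16),
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 8)),
            nn.Dropout(0.5),
        )
        self.classifier = nn.Linear(16 * (n_samples // 8), n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, channels, samples)
        z = self.features(x).flatten(start_dim=1)
        return self.classifier(z)

# Dummy forward pass; in a pre-training strategy like the one described,
# such a model would first be fit on other participants' data and then
# fine-tuned on the target user's recordings.
model = ShallowCVEPNet()
logits = model(torch.randn(4, 1, 8, 250))   # (4, n_classes)
```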
Date of Conference: 03-06 December 2024
Date Added to IEEE Xplore: 10 January 2025
Conference Location: Lisbon, Portugal

I. Introduction

The Conversational Brain-Artificial Intelligence Interface (BAI), a new type of brain-computer interface (BCI), leverages AI to enable users with severe language impairments to communicate effectively [1]. It operates by translating users’ high-level intentions into articulate, contextually appropriate responses using a sophisticated AI-driven conversational agent. A BAI’s operation begins with the acquisition of contextual data tailored to the user’s immediate environment, followed by probing for user intentions, often facilitated through conversational agents. These intentions are decoded from the brain’s signals and converted into actionable commands by the AI, enabling interaction with the external environment. The BAI system comprises four critical components: contextual input, cognitive probing, intention decoding, and action generation, as sketched below.
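As a rough illustration of these four stages, the following Python sketch stubs out a single pass through the pipeline. Every function, data structure, and string here is a hypothetical placeholder for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the four BAI stages named above (contextual input,
# cognitive probing, intention decoding, action generation). All names are
# placeholders.
from dataclasses import dataclass
from typing import List

@dataclass
class Context:
    description: str  # e.g. what the conversation partner just said

def acquire_context() -> Context:
    # 1. Contextual input: gather the user's immediate situation.
    return Context(description="Partner asks: 'Would you like some water?'")

def probe_intentions(ctx: Context) -> List[str]:
    # 2. Cognitive probing: the conversational agent proposes candidate
    #    intentions, e.g. presented as c-VEP-coded options on screen.
    return ["yes", "no", "not now"]

def decode_intention(options: List[str]) -> str:
    # 3. Intention decoding: a trained c-VEP classifier (e.g. an ANN like the
    #    sketch above) would select one option; stubbed here.
    return options[0]

def generate_action(ctx: Context, intention: str) -> str:
    # 4. Action generation: the AI agent expands the broad intention into a
    #    coherent, context-specific reply (here a trivial template).
    return f"In reply to '{ctx.description}', the user intends '{intention}'."

if __name__ == "__main__":
    ctx = acquire_context()
    options = probe_intentions(ctx)
    intention = decode_intention(options)
    print(generate_action(ctx, intention))
```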

