Abstract:
The metaverse, which integrates physical and virtual realities through technologies such as high-speed internet, virtual and augmented reality, and artificial intelligence (AI), offers transformative prospects across various fields, particularly healthcare. This integration introduces a new paradigm in AI-driven medical imaging, particularly in assessing brain age, a crucial marker for detecting age-related neuropathologies such as Alzheimer's disease (AD), using magnetic resonance imaging (MRI). Despite advances in deep learning for estimating brain age from structural MRI (sMRI), incorporating functional MRI (fMRI) data presents significant challenges due to its complex data structure and the noisy nature of functional connectivity measurements. To address these challenges, we present the Multitask Adversarial Variational Autoencoder (M-AVAE), a bespoke deep learning framework designed to enhance brain age predictions through multimodal MRI data integration. The M-AVAE uniquely separates latent variables into generic and unique codes, effectively isolating shared and modality-specific features. Additionally, integrating multitask learning with sex classification as a supplementary task enables the model to account for sex-specific aging nuances. Evaluated on the OpenBHB dataset, a comprehensive multisite aggregation of brain MRI scans, the M-AVAE demonstrates exceptional performance, achieving a mean absolute error of 2.77 years and surpassing conventional methodologies. This performance positions the M-AVAE as a powerful tool for metaverse-based healthcare applications in brain age estimation. The source code is made publicly available at: https://github.com/engrussman/MAVAE.
Published in: IEEE Journal of Biomedical and Health Informatics (Early Access)
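
To make the abstract's architectural ideas concrete, the sketch below illustrates the two points it emphasizes: per-modality encoders that split the latent space into a shared ("generic") code and a modality-specific ("unique") code, and multitask heads that predict age and sex from the shared codes. This is a minimal illustrative sketch in PyTorch, not the authors' implementation (see the GitHub repository above); all layer sizes, feature dimensions, class and variable names are assumptions.

import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Encodes one MRI modality into a shared ("generic") and a
    modality-specific ("unique") Gaussian latent code (illustrative)."""
    def __init__(self, in_dim, shared_dim, unique_dim, hidden=256):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        # Separate heads produce the mean / log-variance of each latent code.
        self.shared_mu = nn.Linear(hidden, shared_dim)
        self.shared_logvar = nn.Linear(hidden, shared_dim)
        self.unique_mu = nn.Linear(hidden, unique_dim)
        self.unique_logvar = nn.Linear(hidden, unique_dim)

    @staticmethod
    def reparameterize(mu, logvar):
        # Standard VAE reparameterization trick.
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def forward(self, x):
        h = self.backbone(x)
        z_shared = self.reparameterize(self.shared_mu(h), self.shared_logvar(h))
        z_unique = self.reparameterize(self.unique_mu(h), self.unique_logvar(h))
        return z_shared, z_unique

class MultitaskHeads(nn.Module):
    """Predicts age (regression) and sex (classification) from the
    concatenated shared codes of both modalities."""
    def __init__(self, shared_dim, hidden=128):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(2 * shared_dim, hidden), nn.ReLU())
        self.age_head = nn.Linear(hidden, 1)   # brain age in years
        self.sex_head = nn.Linear(hidden, 2)   # sex-classification logits

    def forward(self, z_smri, z_fmri):
        h = self.trunk(torch.cat([z_smri, z_fmri], dim=-1))
        return self.age_head(h).squeeze(-1), self.sex_head(h)

# Toy usage with random features standing in for sMRI / fMRI inputs
# (input dimensions 512 and 1024 are placeholders, not from the paper).
smri_enc = ModalityEncoder(in_dim=512, shared_dim=64, unique_dim=32)
fmri_enc = ModalityEncoder(in_dim=1024, shared_dim=64, unique_dim=32)
heads = MultitaskHeads(shared_dim=64)

x_smri, x_fmri = torch.randn(8, 512), torch.randn(8, 1024)
zs_s, zu_s = smri_enc(x_smri)
zs_f, zu_f = fmri_enc(x_fmri)
age_pred, sex_logits = heads(zs_s, zs_f)

In the full M-AVAE, these pieces would additionally be trained with reconstruction, KL-divergence, and adversarial objectives described in the paper; the sketch only shows how separating generic and unique codes and sharing them across the age and sex tasks fits together structurally.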