Symmetrical event-related EEG/fMRI information fusion in a variational Bayesian framework.
Daunizeau J., Grova C., Marrelec G., Mattout J., Jbabdi S., Pélégrini-Issac M., Lina J-M., Benali H.
In this work, we propose a symmetrical multimodal EEG/fMRI information fusion approach dedicated to the identification of event-related bioelectric and hemodynamic responses. Unlike existing asymmetrical EEG/fMRI data fusion algorithms, we build a joint EEG/fMRI generative model that explicitly accounts for the local coupling or uncoupling of bioelectric and hemodynamic activities, which are assumed to share a common spatial substrate. Under an assumption of spatio-temporal separability, the spatial profile of the common EEG/fMRI sources is introduced as an unknown hierarchical prior on both markers of cerebral activity. A dedicated variational Bayesian (VB) learning scheme is then derived to infer the common EEG/fMRI sources from a joint EEG/fMRI dataset. This yields an estimate of the common spatial profile that is built as a trade-off between the information extracted from the EEG and fMRI datasets. Furthermore, the spatial structure of the EEG/fMRI coupling/uncoupling is learned exclusively from the data. The proposed generative model and its associated variational Bayesian expectation-maximization (VBEM) learning scheme thus provide an unsupervised, well-balanced approach to the fusion of EEG/fMRI information. We first demonstrate the approach on synthetic data: in contrast to classical EEG/fMRI fusion approaches, the method remains efficient and robust regardless of the level of EEG/fMRI discordance. We then apply the method to EEG/fMRI recordings from a patient with epilepsy in order to identify the brain areas involved in the generation of epileptic spikes, and validate the results against intracranial EEG measurements.
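To make the model structure concrete, here is a minimal sketch of a joint generative model of the kind described above, under the spatio-temporal separability assumption; the notation (Y_EEG, Y_fMRI, w, s, h, z, L) is ours, chosen for illustration, and is not taken from the paper:

\[
\begin{aligned}
Y_{\mathrm{EEG}}  &= L\,\big(w\,s^{\top}\big) + \varepsilon_{\mathrm{EEG}},
  &\quad \varepsilon_{\mathrm{EEG}}  &\sim \mathcal{N}\big(0,\,\sigma_{\mathrm{EEG}}^{2}\,I\big), \\
Y_{\mathrm{fMRI}} &= \big(z \odot w\big)\,h^{\top} + \varepsilon_{\mathrm{fMRI}},
  &\quad \varepsilon_{\mathrm{fMRI}} &\sim \mathcal{N}\big(0,\,\sigma_{\mathrm{fMRI}}^{2}\,I\big), \\
w &\sim \mathcal{N}\big(0,\,\alpha^{-1} I\big),
  &\quad z_{i} &\sim \mathrm{Bernoulli}(\pi), \quad i = 1, \dots, n,
\end{aligned}
\]

where L is the EEG lead-field matrix, s and h are the (separable) bioelectric and hemodynamic temporal responses, w is the common spatial profile acting as a shared hierarchical prior on both modalities, and the binary indicators z_i encode local coupling (z_i = 1) or uncoupling (z_i = 0) between the two markers of activity. In such a formulation, a VBEM scheme would alternate closed-form updates of the approximate posteriors over w, z, s, h and the precision hyperparameters, maximizing a free-energy bound on the joint EEG/fMRI model evidence; the posterior over z is what allows the coupling structure to be learned exclusively from the data.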