Journal of Clinical EEG & Neuroscience, April 2005
EEG Databases in Research and Clinical Practice: Current Status and Future Directions
Evian Gordon and Lukasz M. Konopka, Guest Editors
Discovery and Integrative Neuroscience
Stephen H. Koslow
Integrative Neuroscience: The Role of a Standardized Database
E. Gordon, N. Cooper, C. Rennie, D. Hermens and L. M. Williams
The Australian EEG Database
M. Hunter, R. L. L. Smith, W. Hyslop, O. A. Rosso, R. Gerlach, J. A. P. Rostas, D. B. Williams and F. Henskens
Use of Normative Databases and Statistical Methods in Demonstrating Clinical Utility of QEEG: Importance and Cautions
Leslie S. Prichep
Long Latency Evoked Potential Database for Clinical Applications: Justification and Examples
Frank H. Duffy
Clinical Database Development: Characterization of EEG Phenotypes
J. Johnstone, J. Gunkelman and J. Lunt
EEG Mapping and Low-Resolution Brain Electromagnetic Tomography (LORETA) in Diagnosis and Therapy of Psychiatric Disorders: Evidence for a Key-Lock Principle
Bernd Saletu, Peter Anderer, Gerda M. Saletu-Zyhlarz and Roberto D. Pascual-Marqui
Evaluation and Validity of a LORETA Normative EEG Database
R. W. Thatcher, D. North and C. Biver
Efficient Application of Internet Databases for New Signal Processing Methods
Hypothesis-driven research has been shown to be an excellent model for pursuing investigations in neuroscience. The Human Genome Project demonstrated the added value of discovery research, especially in areas where large amounts of data are produced. Neuroscience has become a data-rich field, and one that would be enhanced by incorporating the discovery approach. Databases, as well as analytical, modeling and simulation tools, will have to be developed, and they will need to be interoperable and federated.
This paper presents an overview of the development of the field of neuroscience databases and associated tools: Neuroinformatics. The primary focus is on the impact of NIH funding of this process. The important issues of data sharing, as viewed from the perspectives of the scientist and of private and public funding organizations, are discussed. Neuroinformatics will provide more than just a sophisticated array of information technologies to help scientists understand and integrate nervous system data. It will make available powerful models of neural functions and facilitate discovery, hypothesis formulation and electronic collaboration.
Most brain-related databases bring together specialized information, with a growing number that include neuroimaging measures. This article outlines the potential use and insights from the first entirely standardized and centralized database, which integrates information from neuroimaging measures (EEG, event-related potential (ERP), structural/functional MRI), arousal (skin conductance responses (SCRs), heart rate, respiration), neuropsychological and personality tests, genomics and demographics: the Brain Resource International Database. It comprises data from over 2,000 “normative” subjects and a growing number of patients with neurological and psychiatric illnesses, acquired from over 50 laboratories (in the USA, United Kingdom, Holland, South Africa, Israel and Australia), all with identical equipment and experimental procedures.
Three primary goals of this database are to quantify individual differences in normative brain function, to compare an individual’s performance to their database peers, and to provide a robust normative framework for clinical assessment and treatment prediction. We present three example demonstrations in relation to these goals. First, we show how consistent age differences may be quantified when large subject numbers are available, using EEG and ERP data from nearly 2,000 stringently screened normative subjects. Second, the use of a normalization technique provides a means to compare clinical subjects (50 ADHD subjects in this study) to the normative database with the effects of age and gender taken into account. Third, we show how a profile of EEG/ERP and autonomic measures potentially provides a means to predict treatment response in ADHD subjects. The example data consist of EEG recorded under eyes-open and eyes-closed conditions, and ERP data for auditory oddball, working memory and Go-NoGo paradigms. Autonomic measures of skin conductance (tonic skin conductance level, SCL, and phasic skin conductance responses, SCRs) were acquired simultaneously with the central EEG/ERP measures.
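The normalization step described above can be sketched as a Z-score computed against age- and sex-matched normative peers. The function below is a hypothetical illustration only — the database's actual adjustment for age and gender may use a different (e.g., regression-based) method — and assumes the normative values, ages and sexes are held in NumPy arrays.

```python
import numpy as np

def normative_z_score(value, norm_values, norm_ages, norm_sexes,
                      age, sex, age_window=5.0):
    """Z-score one subject's measure against age- and sex-matched
    normative peers (illustrative sketch, not the database's method)."""
    peers = norm_values[(norm_sexes == sex) &
                        (np.abs(norm_ages - age) <= age_window)]
    if peers.size < 2:
        raise ValueError("too few matched normative peers")
    return (value - peers.mean()) / peers.std(ddof=1)

# Example with toy data: three matched female peers (10, 12, 14)
vals  = np.array([10.0, 12.0, 14.0, 40.0, 50.0])
ages  = np.array([30, 31, 29, 60, 62])
sexes = np.array(["F", "F", "F", "F", "M"])
z = normative_z_score(16.0, vals, ages, sexes, age=30, sex="F")  # → 2.0
```

A Z-score of ±2 or more then flags a measure as deviant relative to that subject's demographic stratum of the normative sample.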
The findings show that the power of large samples, tested using standardized protocols, allows for the quantification of individual differences that can subsequently be used to control such variation and to enhance the sensitivity and specificity of comparisons between normative and clinical groups. In terms of broader significance, the combination of size and multidimensional measures tapping the brain’s core cognitive competencies, may provide a normative and evidence-based framework for individually-based assessments in “Personalized Medicine.”
The Australian EEG Database is a web-based de-identified searchable database of 18,500 EEG records recorded at a regional public hospital over an 11-year period. Patients range in age from a premature infant born at 24 weeks gestation, through to people aged over 90 years. This paper will describe the history of the database, the range of patients represented in the database, and the nature of the text-based and digital data contained in the database. Preliminary results of the first two studies undertaken using the database are presented. Plans for sharing data from the Australian EEG database with researchers are discussed.
We anticipate that such data will be useful in not only helping to answer clinical questions but also in the field of mathematical modeling of the EEG.
The clinical utility of the EEG, especially in psychiatric, learning and cognitive disorders, has been greatly enhanced by the use of quantitative analysis (QEEG) and comparisons to a normative database. Of primary importance in the use of such a reference database are the following considerations and cautions: adequate sampling across a broad age range; consideration of inclusion/exclusion criteria; adequate sample of artifact-free data to demonstrate reliability and replicability of norms; demonstration of specificity and sensitivity. A normative database meeting these criteria allows the multivariate description of patterns of QEEG abnormalities in patients as compared to age appropriate normative values, and the exploration of neurophysiological heterogeneity within populations. Demonstrations of the clinical significance of this approach exist in the scientific literature and demonstrate that QEEG provides high sensitivity and specificity to abnormalities in brain function seen in psychiatric populations.
We summarize our experience with the clinical utility of long latency evoked potential (EP) data in clinical qEEG studies. In contrast to common wisdom, such EP data are consistent across appropriately chosen age groups. In a healthy adult population, EP data correlate consistently with independently collected psychological variables. In our pediatric referral population, EP data are of greatest and most unique value in the learning disabilities but also augment detection of abnormality in epilepsy and behavioral abnormality. Selection of subjects for a clinical database on the basis of examined medical, neurological and behavioral health, forms adequately consistent groupings for clinical utility. The use of the Z-SPM is essential for detection of EP abnormality. A minimum of three replications within a clinical study protects against chance/false positives. Also, the true data dimensionality within EP data sets is far less than the total number of variables typically collected.
We propose development of evidence-based methods to guide clinical intervention in neurobehavioral syndromes based on categorization of individuals using both behavioral measures and quantification of the EEG (qEEG). Review of a large number of clinical EEG and qEEG studies suggests that it is plausible to identify a limited set of individual profiles that characterize the majority of the population. Statistical analysis has already been used to document “clusters” of qEEG features seen in populations of psychiatric patients.1 These clusters are considered here as genetically based intermediate phenotypes: reliable indices of brain function that are not isomorphic with DSM categories and that carry implications for therapeutic intervention. We call for statistical analysis methods to be applied to a broad clinical database of individuals diagnosed with neurobehavioral disorders in order to empirically define clusters of individuals who may be responsive to specific neurophysiologically based treatment interventions, namely administration of psychoactive medication and/or EEG neurofeedback. A tentative set of qEEG profiles is proposed based on clinical observation and experience. Implications for intervention with medication and neurofeedback for individuals with these neurophysiological profiles and specific qEEG patterns are presented.
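The abstract above refers to statistically documented “clusters” of qEEG features. As a rough, hypothetical sketch of how such clusters might be derived, the routine below applies a minimal k-means to feature profiles (one row per individual); the array layout, feature choice and use of k-means are illustrative assumptions, not the analysis the authors used.

```python
import numpy as np

def kmeans_clusters(X, k, n_iter=100, seed=0):
    """Minimal k-means: assign each qEEG feature profile (row of X)
    to its nearest centroid, then update centroids, until stable."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # distance of every profile to every centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Toy example: two well-separated groups of 2-feature profiles
X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
labels, _ = kmeans_clusters(X, k=2)
```

In practice the number of clusters, feature scaling and cluster validation would all need careful treatment before any cluster could be read as a candidate phenotype.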
Different psychiatric disorders, such as schizophrenia with predominantly positive and negative symptomatology, major depression, generalized anxiety disorder, agoraphobia, obsessive-compulsive disorder, multi-infarct dementia, senile dementia of the Alzheimer type and alcohol dependence, show EEG maps that differ statistically both from each other and from normal controls. Representative drugs of the main psychopharmacological classes, such as sedative and non-sedative neuroleptics and antidepressants, tranquilizers, hypnotics, psychostimulants and cognition-enhancing drugs, induce significant and typical changes to normal human brain function, which in many variables are opposite to the above-mentioned differences between psychiatric patients and normal controls. Thus, by considering these differences between psychotropic drugs and placebo in normal subjects, as well as between mental disorder patients and normal controls, it may be possible to choose the optimum drug for a specific patient according to a key-lock principle, since the drug should normalize the deviant brain function. This is supported by 3-dimensional low-resolution brain electromagnetic tomography (LORETA), which identifies regions within the brain that are affected by psychiatric disorders and psychopharmacological substances.
To evaluate the reliability and validity of a Z-score normative EEG database for Low Resolution Electromagnetic Tomography (LORETA), EEG digital samples (2-second intervals sampled at 128 Hz, 1 to 2 minutes eyes closed) were acquired from 106 normal subjects, and the cross-spectrum was computed and multiplied by the Key Institute’s LORETA 2,394 gray matter pixel T Matrix. After a log10 transform or a Box-Cox transform, the mean and standard deviation of the *.lor files were computed for each of the 2,394 gray matter pixels, from 1 to 30 Hz, for each of the subjects. Tests of Gaussianity were computed in order to best approximate a normal distribution for each frequency and gray matter pixel. The relative sensitivity of a Z-score database was computed by measuring the approximation to a Gaussian distribution. The validity of the LORETA normative database was evaluated by the degree to which confirmed brain pathologies were localized using the LORETA normative database.
Log10 and Box-Cox transforms approximated Gaussian distribution in the range of 95.64% to 99.75% accuracy. The percentage of normative Z-score values at 2 standard deviations ranged from 1.21% to 3.54%, and the percentage of Z-scores at 3 standard deviations ranged from 0% to 0.83%. Left temporal lobe epilepsy, right sensory motor hematoma and a right hemisphere stroke exhibited maximum Z-score deviations in the same locations as the pathologies.
We conclude: (1) adequate approximation to a Gaussian distribution can be achieved in LORETA by using a log10 transform or a Box-Cox transform with parametric statistics; (2) a Z-score normative database is valid, with adequate sensitivity, when using LORETA; and (3) the Z-score LORETA normative database consistently localized known pathologies to the expected Brodmann areas, serving as a hypothesis test based on the surface EEG before computing LORETA.
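The transform-and-Z-score pipeline described above — log10 of spectral power, then per-pixel, per-frequency means and standard deviations across the normative sample — can be sketched as follows. Array names and shapes are assumptions for illustration; the actual pipeline also involves the cross-spectrum and the LORETA T Matrix, which are omitted here.

```python
import numpy as np

def loreta_z_scores(norm_power, subject_power):
    """Z-score a subject's LORETA power values against a normative
    sample after a log10 transform (sketch of the described method).

    norm_power:    (n_subjects, n_pixels, n_freqs) normative power
    subject_power: (n_pixels, n_freqs) one subject's power
    """
    log_norm = np.log10(norm_power)     # transform toward Gaussianity
    mu = log_norm.mean(axis=0)          # per pixel, per frequency bin
    sd = log_norm.std(axis=0, ddof=1)
    return (np.log10(subject_power) - mu) / sd

# Toy example: 3 normative "subjects", 1 pixel, 2 frequency bins
norm = 10.0 ** np.array([[[1.0, 2.0]], [[3.0, 4.0]], [[5.0, 6.0]]])
subj = 10.0 ** np.array([[5.0, 6.0]])
z = loreta_z_scores(norm, subj)  # → [[1.0, 1.0]]
```

With the log-transformed values approximately Gaussian, the resulting Z maps can be thresholded at the ±2 and ±3 standard-deviation levels reported above.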
This paper highlights the ways in which Internet databases may be efficiently used to foster the application of progress in biomedical sciences via data sharing and new algorithms. Employing the Internet to accelerate the pace of interdisciplinary research has significant potential, yet as with all new technologies, the first applications often cause more disappointment than positive outcomes. We discuss examples of solutions to the basic issues: (1) finding the relevant datasets (in portals connected via the Inter-neuro infrastructure), (2) reading the particular format in which the data was stored (using the SignalML language for metadescription of time series), (3) choosing the right method for the data analysis (we provide a brief review of the methods used for the analysis of EEGs, and discuss two of them in detail: Directed Transfer Function and Matching Pursuit), and (4) sharing the software for chosen methods of analysis (via repositories such as the eeg.pl thematic portal).
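Of the signal-processing methods mentioned, Matching Pursuit lends itself to a compact sketch: iteratively select the dictionary atom most correlated with the residual and subtract its projection. The version below is a minimal illustration assuming a dictionary of unit-norm atoms stored as matrix rows; practical EEG implementations use large redundant Gabor dictionaries.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iter=10):
    """Greedy Matching Pursuit over a dictionary whose rows are
    unit-norm atoms; returns (atom index, coefficient) pairs and
    the final residual."""
    residual = np.asarray(signal, dtype=float).copy()
    atoms = []
    for _ in range(n_iter):
        corr = dictionary @ residual        # inner products with residual
        k = int(np.argmax(np.abs(corr)))    # best-matching atom
        atoms.append((k, corr[k]))
        residual = residual - corr[k] * dictionary[k]
    return atoms, residual

# Toy example with an orthonormal (identity) dictionary
D = np.eye(4)
atoms, residual = matching_pursuit(np.array([3.0, 1.0, 0.0, 0.0]), D, n_iter=2)
# atoms → [(0, 3.0), (1, 1.0)], residual → all zeros
```

With an orthonormal dictionary, MP recovers the exact decomposition in as many iterations as there are active atoms; with redundant dictionaries it yields a greedy approximation whose residual energy decreases at every step.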