Signed differential mapping
Signed differential mapping or SDM is a statistical technique for meta-analyzing studies on differences in brain activity or structure which used neuroimaging techniques such as fMRI, VBM, DTI or PET. It may also refer to a specific piece of software created by the SDM Project to carry out such meta-analyses.

Overview of the method

SDM adopted and combined various positive features from previous methods, such as ALE or MKDA, and introduced a series of improvements and novel features. One of the new features, introduced to avoid the contradictory positive and negative findings in the same voxel seen in previous methods, was the representation of both positive and negative differences in the same map, thus obtaining a signed differential map (SDM).

The method has three steps. First, studies and coordinates of cluster peaks (e.g. the voxels where the differences between patients and healthy controls were highest) are selected according to SDM inclusion criteria. Second, these coordinates are used to create an SDM map for each study. Finally, study maps are meta-analyzed using several different tests to complement the main outcome with sensitivity and heterogeneity analyses.

Inclusion criteria

It is not uncommon in neuroimaging studies that some regions (e.g. a priori regions of interest) are more liberally thresholded than the rest of the brain. However, a meta-analysis of studies with such intra-study regional differences in thresholds would be biased towards these regions, as they are more likely to be reported simply because authors apply more liberal thresholds in them. To overcome this issue, SDM introduced a criterion in the selection of the coordinates: while different studies may employ different thresholds, the same threshold must have been used throughout the whole brain within each included study.

Pre-processing of studies

After conversion of coordinates to Talairach space, an SDM map is created for each study within a specific gray or white matter template. This consists of recreating the clusters of difference by means of an un-normalized Gaussian kernel, so that voxels closer to the peak coordinate have higher values. A rather large full width at half maximum (FWHM) of 25 mm is used to account for different sources of spatial error, e.g. coregistration mismatch in the studies, the size of the cluster or the location of the peak within the cluster. Within a study, values obtained by close Gaussian kernels are summed, though values are limited to [-1,1]; otherwise voxels could achieve rather large values, biasing the analysis towards studies reporting several coordinates in close proximity.
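The map creation step above can be sketched as follows. This is an illustrative reconstruction, not the SDM software's actual code; the function and variable names (`sdm_map`, `peaks`, `grid_coords`) are hypothetical, and only the FWHM of 25 mm, the un-normalized kernel, the summation of nearby kernels and the clipping to [-1,1] come from the description above.

```python
import numpy as np

FWHM = 25.0  # mm, as described above
# Convert FWHM to the Gaussian standard deviation
SIGMA = FWHM / (2.0 * np.sqrt(2.0 * np.log(2.0)))

def sdm_map(peaks, grid_coords):
    """Recreate a study's signed map from its reported peaks.

    peaks: iterable of (x, y, z, sign) peak coordinates in mm,
           with sign +1 for positive and -1 for negative differences.
    grid_coords: (N, 3) array of voxel centre coordinates in mm.
    Returns an (N,) array of signed voxel values, limited to [-1, 1].
    """
    values = np.zeros(len(grid_coords))
    for x, y, z, sign in peaks:
        d2 = np.sum((grid_coords - np.array([x, y, z])) ** 2, axis=1)
        # Un-normalized Gaussian: value 1 at the peak, decaying with distance
        values += sign * np.exp(-d2 / (2.0 * SIGMA ** 2))
    # Summed nearby kernels are clipped to [-1, 1] to avoid biasing the
    # analysis towards studies reporting many coordinates in close proximity
    return np.clip(values, -1.0, 1.0)
```

Note how two peaks 5 mm apart would sum to roughly 1.9 at either peak without the clipping step; the clip caps the voxel at 1.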

Statistical comparisons

SDM provides several different statistical analyses in order to complement the main outcome with sensitivity and heterogeneity analyses.
  • The main statistical analysis is the mean analysis, which consists of calculating the mean of the voxel values in the different studies. This mean is weighted by the sample size so that studies with large sample sizes contribute more.

  • The descriptive analysis of quartiles describes the weighted proportion of studies with strictly positive (or negative) values in a voxel, thus providing a p-value-free measure of the effect size.

  • Subgroup analyses are mean analyses applied to groups of studies to allow the study of heterogeneity.

  • Linear model analyses (e.g. meta-regression) are a generalization of the mean analysis that allows comparisons between two or more groups and the use of covariates, as well as the study of possible confounds by means of meta-regression. Low variability of the regressor is critical in meta-regressions, so they should be understood as exploratory and thresholded more conservatively (e.g. a threshold of 0.0001 or 0.0002 has been proposed).

  • Jack-knife analysis consists of repeating a test as many times as there are included studies, discarding a different study each time: one study is removed and the analysis repeated, then that study is put back, another is removed, and so on. If a significant brain region remains significant in all or most of the combinations of studies, the finding can be concluded to be highly replicable.
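The sample-size-weighted mean and the jack-knife procedure from the list above can be sketched, for a single voxel, as follows. This is an illustrative reconstruction under stated assumptions, not the SDM implementation; the function names are hypothetical.

```python
import numpy as np

def weighted_mean(values, sample_sizes):
    """Mean of per-study voxel values, weighted by each study's sample size
    so that studies with large sample sizes contribute more."""
    w = np.asarray(sample_sizes, dtype=float)
    v = np.asarray(values, dtype=float)
    return float(np.sum(w * v) / np.sum(w))

def jackknife(values, sample_sizes):
    """Repeat the weighted-mean analysis once per study, each time leaving
    a different study out. A finding that stays consistent across all (or
    most) leave-one-out combinations can be considered highly replicable."""
    n = len(values)
    results = []
    for leave_out in range(n):
        keep = [i for i in range(n) if i != leave_out]
        results.append(weighted_mean([values[i] for i in keep],
                                     [sample_sizes[i] for i in keep]))
    return results
```

For example, a study of 30 subjects pulls the weighted mean three times as hard as a study of 10, and `jackknife` returns one mean per left-out study, making it easy to see whether any single study drives the result.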


The statistical significance of the analyses is checked by standard randomization tests. An uncorrected p-value threshold of 0.001 is recommended, as this significance level has been found in this method to be approximately equivalent to a corrected p-value of 0.05. A false discovery rate (FDR) threshold of 0.05 has been found in this method to be too conservative. Values at a Talairach label or coordinate can also be extracted for further processing or graphical presentation.
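A randomization test of the kind mentioned above can be sketched as a simple sign-flipping permutation test at one voxel. This is a generic illustration of the technique, not SDM's actual null model, and it assumes that per-study signs are exchangeable under the null hypothesis of no effect.

```python
import numpy as np

def sign_flip_p_value(values, n_perm=10000, seed=0):
    """Two-sided permutation p-value for the mean of per-study voxel
    values being zero, obtained by randomly flipping the sign of each
    study's value and counting how often the permuted mean is at least
    as extreme as the observed one."""
    rng = np.random.default_rng(seed)
    v = np.asarray(values, dtype=float)
    observed = abs(v.mean())
    count = 0
    for _ in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=len(v))
        if abs((signs * v).mean()) >= observed:
            count += 1
    # Add-one correction keeps the p-value strictly positive
    return (count + 1) / (n_perm + 1)
```

With ten studies all showing a positive difference, few random sign assignments produce a mean as extreme as the observed one, so the p-value is small; with effects that cancel out, nearly every permutation is at least as extreme and the p-value approaches 1.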

SDM software

SDM is software written by the SDM Project to aid the meta-analysis of voxel-based neuroimaging data. It is distributed as freeware, including a graphical interface and a menu/command-line console. It can also be integrated as an SPM extension.

External links

  • SDM software and documentation from the SDM Project.