The sensitivity index or d' (pronounced 'dee-prime') is a statistic used in signal detection theory.
d' is an index of discriminability that measures the separation between the means of the signal and noise distributions in units of the standard deviation of the noise distribution (Green and Swets, 1966; Corbetta et al., 1991, J. Neurosci.). It is computed by subtracting the Z score for false alarms from the Z score for hits.
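In symbols, writing H for the hit rate, F for the false-alarm rate, and z(·) for the inverse of the standard normal cumulative distribution function (notation introduced here for convenience):

$$d' = z(H) - z(F)$$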
In the equal-variance Gaussian model (i.e., under the assumption that the signal and noise distributions are both Gaussian and have equal variance), and taking the noise distribution to have zero mean and unit variance, d' equals the mean of the signal distribution.
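Equivalently, in terms of the distribution parameters described above, with μ_S and μ_N the means of the signal and noise distributions and σ their common standard deviation (symbols used here only for illustration):

$$d' = \frac{\mu_S - \mu_N}{\sigma}$$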
A higher d' means that the signal can be more readily detected.
An estimate of d' can be found from measurements of the hit rate and false-alarm rate.
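A minimal sketch of such an estimate in Python (assuming SciPy is available; the function name d_prime and the example rates are illustrative, not from the source):

```python
from scipy.stats import norm

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Estimate d' as z(hit rate) - z(false-alarm rate),
    where z is the inverse of the standard normal CDF."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

# Example: an 80% hit rate and a 20% false-alarm rate give d' of about 1.68.
print(d_prime(0.80, 0.20))
```

Note that hit or false-alarm rates of exactly 0 or 1 make z(·) infinite, so in practice such rates are usually adjusted slightly before applying the formula.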
Reference
Wickens, Thomas D. Elementary Signal Detection Theory, Ch. 2, p. 20.