Thursday, July 7, 2016

40,000 fMRI Studies: Trashed

Due to a bad assumption used in the statistical analysis of fMRI data, it has been determined that familywise false-positive rates can reach 70% (where 5% is the nominal rate) for cluster-wise inference — the automatic determination of which clusters of "voxels" (the smallest 3D units of the image) show significant activity. This is apparently because the common analysis packages assume the spatial autocorrelation of the noise follows a Gaussian shape, which is not the case in real data. This invalidates a huge swath of fMRI studies:
The Future of fMRI.

It is not feasible to redo 40,000 fMRI studies, and lamentable archiving and data-sharing practices mean most could not be reanalyzed either. Considering that it is now possible to evaluate common statistical methods using real fMRI data, the fMRI community should, in our opinion, focus on validation of existing methods. The main drawback of a permutation test is the increase in computational complexity, as the group analysis needs to be repeated 1,000–10,000 times. However, this increased processing time is not a problem in practice, as for typical sample sizes a desktop computer can run a permutation test for neuroimaging data in less than a minute (27, 43). Although we note that metaanalysis can play an important role in teasing apart false-positive findings from consistent results, that does not mitigate the need for accurate inferential tools that give valid results for each and every study.
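The permutation test the authors recommend can be illustrated with a minimal sketch. For a one-sample group analysis, a standard nonparametric approach is sign-flipping: under the null hypothesis each subject's effect map is symmetric around zero, so randomly flipping the sign of each subject's map and recomputing the statistic builds an empirical null distribution without assuming any particular noise shape. The code below is a simplified toy version on synthetic data (the function name and array shapes are my own, not from the paper or any neuroimaging package), using the maximum statistic across voxels to control the familywise error rate:

```python
import numpy as np

def sign_flip_permutation_test(subject_maps, n_perms=5000, seed=0):
    """One-sample group test via sign-flipping.

    subject_maps : (n_subjects, n_voxels) array of per-subject effect values.
    Returns one-sided, FWE-corrected p-values (one per voxel), computed by
    comparing each observed voxel mean against the permutation distribution
    of the maximum voxel mean.
    """
    rng = np.random.default_rng(seed)
    n_sub, _ = subject_maps.shape
    observed = subject_maps.mean(axis=0)          # observed group mean per voxel

    # Null distribution of the maximum statistic across all voxels:
    # randomly flip each subject's sign and record the largest voxel mean.
    max_null = np.empty(n_perms)
    for i in range(n_perms):
        flips = rng.choice([-1.0, 1.0], size=(n_sub, 1))
        max_null[i] = (subject_maps * flips).mean(axis=0).max()

    # FWE-corrected p-value: fraction of permutations whose maximum
    # exceeds the observed statistic (with the usual +1 correction).
    exceed = (max_null[None, :] >= observed[:, None]).sum(axis=1)
    return (1 + exceed) / (n_perms + 1)

# Synthetic demo: 20 subjects, 50 voxels, real signal only in the first 5.
rng = np.random.default_rng(42)
data = rng.normal(0.0, 1.0, size=(20, 50))
data[:, :5] += 2.0
p = sign_flip_permutation_test(data, n_perms=1000, seed=1)
```

This is exactly the kind of loop the quoted passage refers to: the group analysis is recomputed 1,000–10,000 times, which is cheap on a desktop machine even for full-size brain images.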

Finally, we point out the key role that data sharing played in this work and its impact in the future. Although our massive empirical study depended on shared data, it is disappointing that almost none of the published studies have shared their data, neither the original data nor even the 3D statistical maps. As no analysis method is perfect, and new problems and limitations will be certainly found in the future, we commend all authors to at least share their statistical results [e.g., via NeuroVault.org (44)] and ideally the full data [e.g., via OpenfMRI.org (7)]. Such shared data provide enormous opportunities for methodologists, but also the ability to revisit results when methods improve years later.

3 comments:

Steven Satak said...

Nowhere do you tell us why this is important. Please, Stan, I don't even know what fMRI is.

Stan said...

fMRI is "functional Magnetic Resonance Imaging". Among other things, fMRI is used in brain-scan analysis to infer thought processing by tracking changes in blood flow to the parts of the brain that are processing specific tasks. Much philosophical theorizing about mind-as-brain is based on this type of analysis.

I don't know how this inaccuracy will affect those studies, because I don't know the ratio of the actual blood-flow signal extent (in voxels) to the distribution error (in voxels) in the statistical analysis software. If that ratio is high, then many of the studies might survive.

Stan said...

Another issue the study noted is that the larger studies tended NOT to release their data for outside scrutiny. This means their empirical value is lost as reputable, objective knowledge.