
Saturday, September 13, 2014

Analytical dogma

A recent study published in the Journal of the American Medical Association by Stanford researchers reports a striking statistic: in the last three decades of clinical research, there have been only 37 published re-analyses of randomized clinical trial data. Of that sample, over one-third reached conclusions that differed from the original analysis. An insular culture, unaware of accelerating technologies elsewhere, has been left behind with antiquated tools and techniques. And resistance to re-analysis is only one symptom of a disease prevalent in an industry that clings to dogma and tradition. A regulatory regime that abets such behavior does not help either.

It is instructive to note that in the studies whose re-analyses deviated significantly from the original conclusions, the researchers used different statistical and analytical methods. Even changes in hypothesis formation and in the handling of missing data appear to have altered the conclusions. Furthermore, the new studies uncovered common errors in the original publications. How risk is defined and measured is an important determinant of the eventual conclusions, and in many other industries the measurement and control of risk have progressed to far higher levels of sophistication. The researchers point out that sharing the data, and bringing in people and analytical techniques from other domains, may overturn many conclusions currently held sacrosanct.
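The point about missing data is easy to illustrate. Here is a minimal sketch with entirely synthetic numbers (not drawn from the JAMA study or any real trial): when dropouts are concentrated among poor responders in one arm, a complete-case analysis and a conservative imputation of the missing outcomes can point in opposite directions.

```python
# Hypothetical illustration with synthetic data: how the handling of
# missing outcomes can change a trial's apparent conclusion.
import statistics

# Outcome scores (higher = better); None marks a dropout.
# Dropouts in the treatment arm are disproportionately low responders.
treatment = [8, 9, 7, 8, None, None, None, 9, 8, 7]
control = [7, 6, 8, 7, 6, 7, 8, 6, 7, 6]

def complete_case_mean(arm):
    """Ignore missing values entirely (complete-case analysis)."""
    return statistics.mean(x for x in arm if x is not None)

def worst_case_mean(arm, floor=0):
    """Impute each dropout with a conservative floor value."""
    return statistics.mean(floor if x is None else x for x in arm)

# Complete-case analysis: treatment looks better than control.
cc_diff = complete_case_mean(treatment) - complete_case_mean(control)
# Conservative imputation: the apparent benefit reverses.
wc_diff = worst_case_mean(treatment) - worst_case_mean(control)
print(f"complete-case difference: {cc_diff:+.2f}")  # +1.20
print(f"worst-case difference:    {wc_diff:+.2f}")  # -1.20
```

The same data, analyzed under two defensible conventions, yields effects of opposite sign, which is precisely why a re-analysis with different methods can overturn a published result.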

Scientists who depart from the spirit of science, in which sharing data, knowledge, and techniques across experts and domains is the norm, damage not only themselves but also the industry. Regulators, steeped in antiquated and irrelevant methodologies and SOPs, merely abet persistent incompetence.
