Apparently Gene Glass invented meta-analysis¹ because he wanted to prove someone wrong:
…in the summer of 1974, I set about to do battle with Hans Eysenck and prove that psychotherapy – my psychotherapy – was an effective treatment. I joined the battle with Eysenck’s (1965) review of the psychotherapy outcome literature. Eysenck began his famous reviews by eliminating from consideration all theses, dissertations, project reports or other contemptible items not published in peer-reviewed journals. This arbitrary exclusion of literally hundreds of evaluations of therapy outcomes was indefensible to my mind. It’s one thing to believe that peer review guarantees truth; it is quite another to believe that all truth appears in peer-reviewed journals.
Next, Eysenck eliminated any experiment that did not include an untreated control group. This makes no sense whatever, because head-to-head comparisons of two different types of psychotherapy contribute a great deal to our knowledge of psychotherapy effects…
Having winnowed a huge literature down to 11 studies (!) by whim and prejudice, Eysenck proceeded to describe their findings solely in terms of whether or not statistical significance was reached at the .05 level…
Finally, Eysenck did something truly staggering in its illogic. If a study showed significant differences favoring therapy over control on what he regarded as a ‘subjective’ measure of outcome (e.g., the Rorschach or the Thematic Apperception Test), he discounted the findings entirely. So be it; he may be a tough judge, but that’s his right. But then, when encountering a study that showed differences on an ‘objective’ outcome measure (e.g., grade-point average) but no differences on a subjective measure (such as the Thematic Apperception Test), Eysenck discounted the entire study because the outcome differences were ‘inconsistent’.
Looking back on it, I can almost credit Eysenck with the invention of meta-analysis by anti-thesis. By doing everything in the opposite way that he did, one would have been led straight to meta-analysis. Adopt an a posteriori attitude toward including studies in a synthesis, replace statistical significance by measures of strength of relationship or effect, and view the entire task of integration as a problem in data analysis where ‘studies’ are quantified and the resulting database subjected to statistical analysis, and meta-analysis assumes its first formulation. Thank you, Professor Eysenck.
…[Our] first meta-analysis of the psychotherapy outcome research finished in 1974-1975 found that the typical therapy trial raised the treatment group to a level about two-thirds of a standard deviation on average above the average of untreated controls…
…[Researchers’] reactions [to the meta-analysis] foreshadowed the eventual reception of the work among psychologists. Some said that the work was revolutionary and proved what they had known all along; others said it was wrongheaded and meaningless. The widest publication of the work came in 1977, in an article by Mary Lee Smith and myself in the American Psychologist. Eysenck responded to the article by calling it ‘mega-silliness’…
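To make Glass’s formulation concrete, here is a minimal sketch in Python, with invented numbers rather than Glass & Smith’s actual data: quantify each study as a standardized effect size (Glass’s Δ, the treatment-minus-control mean difference divided by the control group’s standard deviation), then treat the collection of effect sizes as ordinary data to be analyzed, in the simplest case just averaged. The ~0.68 figure below is only an illustration of the “about two-thirds of a standard deviation” result he describes.

```python
# Sketch of the effect-size-then-analyze formulation Glass describes.
# The study summaries here are hypothetical, not data from the 1977 meta-analysis.

from statistics import mean

# Per-study summaries: (treatment mean, control mean, control SD)
studies = [
    (14.2, 10.1, 6.0),
    (8.9,  7.5,  2.3),
    (21.0, 17.4, 5.1),
]

def glass_delta(treat_mean, control_mean, control_sd):
    """Standardized mean difference, scaled by the control group's SD (Glass's delta)."""
    return (treat_mean - control_mean) / control_sd

effects = [glass_delta(*s) for s in studies]
print(f"mean effect size: {mean(effects):.2f} SD")  # prints ~0.67 SD for these made-up numbers
```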
Footnotes:

1. Yes, I’m simplifying. Others deserve credit for their contributions, and some meta-analytic ideas were “in the air” when Glass was spurred by Eysenck to do a fairer sort of literature review.