March 10, 2014

Most Published Research Is False


Over the years, this blog has been highly skeptical of marketing, advertising, and media research.

What passes for research in our world would be laughed out of most reputable scientific laboratories.
  • We almost never use controls
  • We almost never replicate our work 
  • We don't have peer review 
  • We don't have others see if they can reproduce our results.
There are so many ways for research to go wrong that not bothering with any of these fundamental necessities of valid science creates enormous problems. Consequently, among sensible advertising people, there is widespread skepticism about research data and the interpretation of data.

Personally, the only thing I trust our researchers to do competently is to count. They can usually give us a pretty good idea of "how many." But asking them for a "why" or a "what" or a "how" is likely to get you an opinion masquerading as a fact.

The problem was further impressed on me recently when I read a piece by George Johnson, a science writer for The New York Times. Johnson writes about Dr. John P. A. Ioannidis, "a kind of meta-scientist who researches research."

Dr. Ioannidis wrote a paper in 2005 called “Why Most Published Research Findings Are False.” According to the article, "Dr. Ioannidis devised a mathematical model supporting the conclusion that most published findings are probably incorrect."

Now let's be clear.  Ioannidis is writing about real research, the kind that is done in biology and physics labs. Not the baloney that we call research.
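
For the curious, here is a minimal sketch (in Python) of the kind of arithmetic behind Ioannidis's conclusion. The parameter values below are illustrative assumptions of mine, not figures from his paper, and the sketch leaves out the bias term in his full model.

    # Sketch of the "positive predictive value" idea behind Ioannidis (2005).
    # The numbers used below are assumed for illustration only.
    def probability_finding_is_true(prior_odds, power, alpha):
        # prior_odds: ratio of true to false hypotheses being tested (R in the paper)
        # power:      chance a study detects a real effect (1 - beta)
        # alpha:      false-positive rate of the significance test
        true_hits = power * prior_odds
        false_hits = alpha
        return true_hits / (true_hits + false_hits)

    # Example: 1 true hypothesis for every 10 false ones, 80% power, alpha = 0.05
    ppv = probability_finding_is_true(prior_odds=0.1, power=0.8, alpha=0.05)
    print(f"Chance a 'significant' finding is actually true: {ppv:.0%}")  # about 62%

Shrink the prior odds or the power, as happens in a lot of exploratory research, and that figure falls below 50 percent. That is the sense in which "most published findings" can turn out to be false.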

Johnson also relates the story of the chief scientific officer of a pharmaceutical company who set about to reproduce the results of 53 "landmark papers about cancer." In 47 of the 53 cases he and his colleagues could not reproduce the results "even with the help of the original scientists working in their own labs."

Anyone who thinks cancer research is problematic but advertising research is reliable needs professional help.

By the way...
...recently a government study found that obesity among young children had plummeted 43% in the past ten years. I'd love to see Dr. Ioannidis get his hands on this baloney.



15 comments:

  1. The problem with a lot of research is that it seeks to *prove* instead of *discover.* Having a preconceived notion of something ruins the end result. "I'm going to prove that breathing causes cancer!" Followed by, "100% of cancer patients breathed their whole lives! I was right!"


    It's the same with creative. From the client to the brief to the creative directors, there's almost always a preconceived notion of what the end result should be. And that ruins it, 99% of the time.


    Free perspective (discovery) versus forced perspective (an agenda). Always battling.

  2. You'll probably enjoy this video then...

    http://www.youtube.com/watch?v=vKA4w2O61Xo

  3. Thanks Matt & Dan. Interestingly, this experiment itself is an example of a finding that *is* generally replicable and robust: so confirmation bias may be true - and not a product of confirmation bias!

  4. You may also enjoy this one:

    http://www.philosophyexperiments.com/wason/

  5. All I ever really needed to know about advertising I learned from you... and from The West Wing... http://www.youtube.com/watch?v=MygelNl8fy4

  6. Are you trying to imply that Michelle Obama's anti-obesity efforts are not wildly successful? That is sacrilegious talk in eco-correct, Donkey-party-loving Northern California.

  7. Agreed, but then the results from "you-wont-believe-how-big-tv-still"
    could also be reviewed.
    I follow your reasoning and suggest: while the figures are acceptable (online small, TV huge), maybe there is some behaviour or factor that has not been accounted for or included in the analysis (just like your not reading the NY Times on Mondays and Tuesdays)?

    In this case, could it not be "second screening"? See today's WARC bulletin http://po.st/lTaeuD; again, if we accept the sheer measurement, we can accept the generic conclusion that people do switch devices (and my personal observation is that it is cross-generational behaviour).

    So (you guys still there? ;-): could it not be an emerging pattern that people sitting and watching TV have their smartphones or tablets within reach to follow up and expand on any interesting content they have seen on TV?

  8. It is as if they set out to make us look stupid.

  9. As a 20-year veteran market researcher, I have to say--I'm happy to read that you are willing to fund controls, replicated studies, peer reviews, and parallel studies to gauge reproducibility! That's wonderful news--I can't wait to send you the bill.

    Let's be honest here. If you are going to hold brand/advertising research to the standards you set in your bullet-pointed list here, someone has to pay for it. You've beaten this poor straw man to shreds.

  10. As I've said in other posts: "Now, I’m not here to pick on market researchers. Our lack of rigor is not their fault. We simply don’t have the time, money, or inclination to do the type of rigorous experimentation that academics and scientists and some industries do before they can say they really know something."

    Which does not in any way change the fact that a great deal of what we call research is baloney.

  11. I absolutely do not dispute that much of what we call research is baloney. I see heaping helpings of that baloney published on the Internet every day. Research for the purposes of content creation is *especially* incurious.

    But I think the best of us can do more than just count competently. I'm reminded of a particularly cantankerous English prof I had as an undergrad, who once told us that it was ironic that he was in the business he was, because he got into the business for the love of words--and yet he had to read so many vile papers. It was like, he opined, a wine taster who was fed naught but vinegar.

    I do enjoy the occasional sip of wine. It's also my job to do the best I can with the vinegar.

  12. Trouble is, fella, many, many market research firms sell their services as "hard science" to unsuspecting clients.


    I have seen the most incredibly bogus methodologies (including one that claimed to have proven which brand archetype was most likely to drive loyalty and growth in a particular area of the financial services sector) sold as "science".


    And time and time again we've all witnessed clients make multi-million dollar decisions based on complete BS.


    "Complicated looks clever to stupid people" (Dave Trott) and this is something market research people use to baffle research-naive clients ALL THE TIME.


    There is a massive lack of ethics and rigour in the market research industry. Not saying you're in this boat, mate. But 1000s of your brethren are.

  13. And even in scholarly, peer-reviewed studies there exists this problem that calls all scientific research into question.


    A nasty little problem that won't go away. A problem many call The Decline Effect.

    http://www.newyorker.com/reporting/2010/12/13/101213fa_fact_lehrer



    Without reliability (the ability to replicate results), validity (the demonstration of significant correlative effect) returns to hunches.

  14. Nobel Prize winner Daniel Kahneman has called for a similar need for replication for psychology studies.

    http://www.nature.com/news/nobel-laureate-challenges-psychologists-to-clean-up-their-act-1.11535

  15. I think Dr. Ioannidis should keep his hands off these "heaping helpings" of baloney, else he might end up obese as well.
