June 11, 2014

Fact-Free Research


While I am fond of calling advertising pundits, trade press reporters, and marketing gurus morons, nitwits, and cement-heads, the truth is many of them are pretty smart.

So how can it be that they have been so wrong about so many things over the past 10 years? For example, we've been told for 10 years that TV was dying when in fact viewing has reached record highs. How can this be?

Part of the problem is "narratives" -- the stories that cultures spawn. Once a narrative is born, it is very hard to kill.

Equally important is our naivete about what we call "research."

Last week I wrote about the lack of understanding of mathematics that plagues our industry. This is also true of our deficiencies in understanding research.

In the hard sciences, research is reasonably reliable because researchers measure things. In the soft (social) sciences, research is often not about measuring things, but about asking questions.

One of the most unreliable practices of our marketing "researchers" is to ask people questions instead of measuring their behavior. In other words, rather than watching to see if you're cheating on your girlfriend, they ask you if you are. Then they treat your answer as a fact rather than just the bullshit it is.

The consequence of this is that a great many of the surveys, reports, and studies we read tell us nothing about what we're trying to understand; they tell us what people say about what we're trying to understand. A very different thing.

Here's an example:

A recent article in Ad Age on loyalty programs reported that...
...The number spikes to 37% when it comes to millennials surveyed for the study, who said they would not be loyal to a brand that doesn't have a strong loyalty program...
According to the study, 68% change when and where they make purchases to get loyalty rewards, and 60% will switch brands if incentivized.
They use numbers and percentages to pretend they have facts. There isn't a fact in sight. All they have is what people say they do. There is no more unreliable way to ascertain what people actually do than to ask them.

Like this...

A couple of years ago, Forbes ran an article with this headline: CES: Survey Finds Traditional TV Viewing Is Collapsing.

The "research" was done by Accenture, the consulting company. Listen to this frenzied nonsense from the report:
"...the number of consumers who watch broadcast or cable television in a typical week plunged to 48% in 2011 from 71% in 2009
Those are absolutely stunning results, which is (sic) accurate suggest that consumer behavior on television watching is changing faster than anyone had expected."
Accenture’s explanation for the trend is that the TV is losing ground to other devices – mobile phones, laptops and tablets..."
All this hysteria was based on asking people questions, not measuring their behavior.

Fortunately, someone was actually measuring behavior during this period, so we can see how wrong the self-reported baloney was.

According to Nielsen's Cross-Platform Report (Q3 2013), TV viewing during the period of Accenture's "collapse" didn't change at all. The only thing that changed was the answers people gave to Accenture's annoying survey takers.

Accenture's "absolutely stunning results" were stunning all right. Stunningly wrong.