I’ll admit it; I am rapidly becoming a skeptic when it comes to interview-based data. And the reason is that people (interviewees) just don’t know their business – although, of course, they think they do.
For example, in an intriguing research project with my (rather exceptional) PhD student Amandine Ody, we asked lots of people in the Champagne industry whether different Champagne houses paid different prices for a kilogram of their raw material: grapes. The answer was unanimous and unambiguous: “no”; everybody pays more or less the same price. But when we looked at the actual data (which are opaque at first sight and pretty hard to get), the price differences turned out to be huge: some paid 6 euros for a kilogram, others 8, and yet others 10 or even 12. Thinking it might be down to the (poor) quality of the data, we obtained a large sample of similar data from a different source – supplier contracts – which showed exactly the same thing. But the people within the business really did not know; they thought everybody was paying about the same price. They were wrong.
Then Amandine asked them which houses supplied Champagne for supermarket brands (a practice many in the industry thoroughly detest, but it is very difficult to observe who is hiding behind those supermarket labels). They named a bunch of houses – both types of houses and specific ones – that they “were sure were behind it”. And they were almost invariably wrong. Using a clever but painstaking method, Amandine deduced who was really supplying the Champagne to the supermarkets, and she found out it was not the usual suspects. In fact, the houses that did it were exactly the ones no one suspected, and the houses everyone thought were doing it were as innocent as a newborn baby. They were – again – dead wrong.
And this is not the only context and project where I have had such experiences, i.e. it is not just a French thing. With a colleague at University College London – Mihaela Stan – I analyzed the British IVF industry. One prominent practice in this industry is the role of a so-called integrator: one medical professional who is always “the face” towards the patient, i.e. a patient always deals with one and the same doctor or nurse, rather than a different one every time the treatment is in a different stage. All interviewees told us that this really had no substance; it was just a way of comforting the patient. However, when we analyzed the practice’s actual influence – together with my good friend and colleague Phanish Puranam – we quickly discovered that the use of such an integrator had a very real impact on the efficacy of the IVF process: women simply had a substantially higher probability of getting pregnant when such an integrator, who coordinates across the various stages of the IVF cycle, was used. But the interviewees had no clue about the actual effects of the practice.*
My examples are just anecdotes, but there is also some serious research on the topic. Olav Sorenson and David Waguespack published a study on film distributors in which they showed that these distributors’ beliefs about what would make a film a success were plain wrong (the distributors merely made those beliefs come true by assigning such films more resources). John Mezias and Bill Starbuck published several articles in which they showed that people do not even know basic facts about their own companies, such as the sales of their own business unit, error rates, or quality indicators. More often than not, people were several hundred percent off the mark when asked to report a number.
Of course interviews can sometimes be interesting; you can ask people about their perceptions, why they think they are doing something, and how they think things work. Just don’t make the mistake of believing them.
Much the same is true for the use of questionnaires. They are often used to ask for basic facts and assessments, e.g. “how big is your company?”, “how good are you at practice X?”, and so on. Sheer nonsense is the most likely result. People do not know their business, in terms of both the simple facts and the complex processes that lead to success or failure. Therefore, do yourself (and us) a favor: don’t ask; get the facts.
* Although this was not necessarily a “direct effect”; the impact of the practice is more subtle than that.