Your organization has launched a daring new product and wants fast feedback from customers, so it signs with a company that does customer satisfaction surveys.
Quick survey: Would you trust the results?
It may seem that now — just weeks after political surveys and resulting analysis largely missed Donald Trump’s victory in the U.S. presidential election — is a bad time to ask. But with more organizations using data analysis to make critical business decisions, I’d argue it’s a great time to ask.
Harvard University political scientist Gary King said companies can learn a valuable lesson from what pollsters got wrong: If they want to buy services from a survey research outfit, they should ask tough questions about how it does its surveying.
“I would ask, ‘How do you know that you’re going to get a right answer?'” said King, who is also the director of Harvard’s Institute for Quantitative Social Science. “‘Give me some evidence that what you’re doing makes sense.'”
Beyond customer satisfaction surveying, survey research services include market research, which gathers information about markets or customers, and new-product research, which determines whether a product or service will satisfy a particular need.
A serious problem with the political side of survey research today, King said, is this: Pollsters can’t easily get a representative sample of voters — that is, a small group of people who reflect the larger population. Lots of people use mobile phones only, and they often don’t want to respond to polls.
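The sampling problem King describes can be made concrete with a small simulation. The sketch below uses entirely made-up numbers (a hypothetical electorate in which mobile-only voters lean differently from landline users) to show how polling only the reachable subgroup skews the estimate away from the true population figure:

```python
import random

random.seed(42)

# Hypothetical population: 60% mobile-only, 40% reachable by landline.
# Assume 55% of mobile-only voters favor the candidate vs. 40% of landline users.
population = (
    [("mobile", random.random() < 0.55) for _ in range(60_000)]
    + [("landline", random.random() < 0.40) for _ in range(40_000)]
)

# The ground truth a pollster would like to estimate.
true_support = sum(favors for _, favors in population) / len(population)

# A landline-only poll can sample just the reachable subgroup.
landline_only = [favors for group, favors in population if group == "landline"]
poll = random.sample(landline_only, 1_000)
polled_support = sum(poll) / len(poll)

print(f"true support:   {true_support:.1%}")   # close to 49%
print(f"polled support: {polled_support:.1%}")  # close to 40% -- biased low
```

Under these assumed numbers, the poll understates support by roughly nine percentage points, no matter how large the landline sample gets; bigger samples only make a biased estimate more precisely wrong.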
The nice thing about polling during elections, King said, is that pollsters eventually learn the truth: whatever result their survey points to, Election Day reveals whether they were right.
“If the survey went wrong somehow or in some way small or large, then we get to know,” King said. “That’s actually a great thing for the survey research team because they get to learn something.”
Pollsters who’ve been proven wrong after the election — and that’s most of them — now have a great opportunity to improve their data collection and ultimately their forecasting prowess, King said.
Companies doing product or market research will have, he hopes, done the same: stumbled early on and then learned from their mistakes.
“There’s plenty of analytics where nobody learns anything; you just do it. And that’s not very good,” King said.