Automated polling is gaining popularity as a much lower-cost alternative to live-interviewer polls. But the method is young enough that many in the industry remain uncertain about its reliability. How do you know you reached the right person, and not a child? Are respondents just mashing buttons?
The data released on this topic (especially in the wake of the 2008 election) suggests that in practice, automated polling is just as accurate at predicting election results as live-interviewer polls (and in many cases, more accurate). This methodology is well on its way to acceptance by mainstream research and media outlets.
Here are some resources on this topic:
1. AAPOR Report
Probably the most authoritative source is the American Association for Public Opinion Research's study of the methodology of the 2008 primary polls, which states:
All of the final pre-primary polls were conducted by telephone, using either CATI or IVR systems. We found no evidence that one approach consistently out-performed the other – that is, the polls using CATI or IVR were about equally accurate. – PAGE 30
The use of either computerized telephone interviewing (CATI) techniques or interactive voice response (IVR) techniques made no difference to the accuracy of estimates. – PAGE 77
2. Mark Blumenthal
Mark is an editor at Pollster.com who regularly covers automated polls. A definitive article is:
In Defense of Automated Surveys, September 2009
He also wrote this report for Public Opinion Quarterly. It includes a discussion of IVR with several examples and quotes supporting it.
Here’s a blurb from one of his other articles:
As PPP’s Tom Jensen noted earlier this week, analyses conducted by the National Council on Public Polls (in 2004), AAPOR’s Ad Hoc Committee on Presidential Primary Polling (2008), and the Wall Street Journal’s Carl Bialik all found that automated polls performed about as well as live interviewer surveys in terms of their final poll accuracy. To that list I can add two papers presented at last week’s AAPOR conference (one by Harvard’s Chase Harrison and Fairleigh Dickinson University’s Krista Jenkins and Peter Woolley) and papers at prior conferences on polls conducted from 2002 to 2006 (by Joel Bloom and Charles Franklin and yours truly). All of these assessed polls conducted in the final weeks or months of the campaign and saw no significant difference between automated and live interviewer polls in terms of their accuracy.
He also reported on automated polls gaining prominence back in 2006:
He’s not kidding. Of the 1,031 poll results logged into the Pollster.com database so far in the 2006 cycle from statewide races for Senate and Governor, more than half (55%) have been done by automated pollsters Rasmussen Reports, SurveyUSA or over the Internet by Zogby International. And that does not count the surveys conducted once a month by SurveyUSA in all 50 states (450 so far this year alone). Nor does it count the automated surveys recently conducted in 30 congressional districts by Constituent Dynamics and RT Strategies.
3. Wall Street Journal
The Journal wrote an excellent article on the use of automated polling for the 2008 presidential campaign, Press 1 for McCain, 2 for Obama (Aug 2008). They discuss the criticisms of automated polling, but also bring up the strengths that put it on equal footing with live interviews.
Recorded polls, however, offer several advantages… Politicians’ names are pronounced correctly and identically each time, and responses entered correctly are recorded correctly.
There also is evidence that automated polls inspire honesty, particularly on sensitive topics. Stephen Blumberg, who conducts polls for the Centers for Disease Control and Prevention, says that in tests, people responding with touch tones instead of by voice were more likely to admit they had multiple sex partners, or traded sex for money or drugs.
See the author’s companion blog post.
4. Business Wire
In this November 2004 article, they identify an IVR pollster (SurveyUSA) as having the highest accuracy among 104 polling firms.