We’re currently in the heat of the primary and caucus season, with a new poll -- or several polls -- released seemingly on a daily basis. But there’s a survey that’s been going out each month that most people haven’t heard of, and it may provide a peek into the future of polling.
In recent years, there’s been doubt about the long-term viability of traditional polling methods. Most public-opinion surveys are conducted by phone, with pollsters calling a large enough sample of the population to ensure a statistically valid survey. To do this, pollsters have had to expand their reach to cellphone users, which adds to the already considerable operational costs of survey work.
Pollsters have also had to grapple with the reality that many Americans no longer want to pick up calls from an unfamiliar phone number, much less spend 20 minutes sharing their personal opinions with a stranger (or a computer) on the other end of the line.
“With response rates for the best-designed traditional surveys now below 10 percent, everyone is looking for what is next,” said Karlyn Bowman, a polling analyst at the American Enterprise Institute.
Because of these trends, some polling has moved online. Still, this shift brings its own challenges, notably that a share of Americans remains offline.
That’s where The American Panel Survey, or TAPS for short, comes in.
TAPS has been run out of Washington University in St. Louis since December 2011, making February’s survey the 50th installment. Like some other surveys, TAPS is conducted online. But TAPS uses an unusual methodology.
Unlike most other online surveys, which find participants using “opt-in” methods such as advertising appeals, TAPS returns monthly to the same participants, who have been recruited randomly. TAPS also solves the Internet connectivity issue by providing computers and web access to the 15 percent of participants who don’t already have them.
Other pollsters are also experimenting with returning repeatedly to the same group of respondents, including the Pew Research Center’s American Trends Panel and the RAND Corporation’s American Life Panel.
The TAPS panel at Washington University consists of 2,000 people, with 75 percent to 80 percent of them taking part in any given month. Participants are paid $10 a month. The panel was initially assembled through a random sampling of all United States addresses, using a technique originally designed by Knowledge Networks, now a subsidiary of GfK. About 10 percent of the pool drops out every year and is replenished using the same sampling procedure.
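The recruit-and-replenish cycle described above can be sketched roughly as follows. The panel size and 10 percent attrition figure come from the article, but the address frame, the attrition mechanics, and the function names are simplified assumptions for illustration:

```python
import random

random.seed(0)

# Hypothetical sketch of TAPS-style panel maintenance: a 2,000-person
# panel loses about 10 percent a year and is topped back up by drawing
# fresh, randomly sampled addresses from the same frame.
PANEL_SIZE = 2000
address_frame = [f"addr-{i}" for i in range(100_000)]  # stand-in for all U.S. addresses

panel = set(random.sample(address_frame, PANEL_SIZE))

def yearly_replenish(panel, frame, attrition=0.10):
    # Simulate dropouts, then recruit replacements with the same
    # random-address procedure used to build the original panel.
    dropouts = set(random.sample(sorted(panel), int(len(panel) * attrition)))
    remaining = panel - dropouts
    candidates = [a for a in frame if a not in remaining]
    return remaining | set(random.sample(candidates, PANEL_SIZE - len(remaining)))

panel = yearly_replenish(panel, address_frame)
```

Because replacements are drawn from the same random frame, the panel's representativeness is preserved across years rather than drifting toward self-selected volunteers.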
Because the initial selection process is random, rather than based on volunteers responding to an ad, “you impose representativeness in advance,” said Steven Smith, the director of Washington University's Weidenbaum Center on the Economy, Government and Public Policy, which runs TAPS.
Charles Franklin, the co-founder of Pollster.com and director of the Marquette Law School Poll, called TAPS “an admirable effort to combine random sampling with the efficiency of online data collection.”
So why are TAPS, and the Pew and RAND panel surveys, potentially revolutionary?
One big reason has to do with the fact that TAPS surveys the same people month after month. This makes it possible to track individuals’ changing opinions over time, something social scientists call “longitudinal” data.
“We are interested in what drives changes at the level of the individual,” said Smith. “That can only be done in a panel study.”
An example is the following chart, which shows respondents’ preferences in the Republican presidential nominating process, as collected by TAPS between August and December. Using this chart and the underlying data, an academic researcher or journalist can track not just broad trends about which candidates were picking up or losing support, but more specifically how many voters were moving from, say, Ted Cruz to Donald Trump, or from Chris Christie to John Kasich, every month.
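The wave-to-wave tracking described above amounts to pairing each respondent's answers across consecutive months and tallying the moves. A minimal sketch, using invented respondent IDs and preferences rather than actual TAPS data:

```python
from collections import Counter

# Hypothetical panel responses: each respondent's stated preference
# in two consecutive monthly waves (IDs and data are illustrative).
wave_nov = {"r1": "Cruz", "r2": "Trump", "r3": "Christie", "r4": "Trump"}
wave_dec = {"r1": "Trump", "r2": "Trump", "r3": "Kasich", "r4": "Trump"}

# Count month-over-month moves. Only a panel -- the same respondents
# in both waves -- makes this pairing possible; independent samples
# would show only the aggregate totals, not who moved where.
moves = Counter(
    (wave_nov[r], wave_dec[r]) for r in wave_nov if r in wave_dec
)

print(moves[("Cruz", "Trump")])  # respondents who switched Cruz -> Trump: 1
```

A cross-sectional poll with fresh respondents each month could report that Trump's total rose, but not that the gain came specifically from former Cruz supporters.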
TAPS data can also explain what demographic factors underlie these changes in presidential preferences.
Another issue of great interest to polling professionals is how best to fine-tune the “likely voter” screen -- the technique by which pollsters try to predict who will actually vote -- so they can provide a better snapshot of the expected electorate when ballots are actually cast. To do this, pollsters ask respondents a battery of questions such as whether they are a registered voter and whether they voted in the most recent election. The more past political activity, the thinking goes, the likelier they are to vote in the future.
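One way to picture a likely-voter screen is as a simple score over that battery of questions, with a cutoff separating likely from unlikely voters. The field names, weights, and cutoff below are hypothetical, not TAPS's or any pollster's actual screen:

```python
# Illustrative likely-voter screen: score each respondent on a
# battery of past-activity questions (all fields are hypothetical).
def likely_voter_score(respondent):
    score = 0
    if respondent.get("registered"):           # registration question
        score += 2                             # weighted most heavily here
    if respondent.get("voted_last_election"):  # past turnout
        score += 1
    if respondent.get("follows_politics"):     # engagement
        score += 1
    return score

sample = [
    {"registered": True, "voted_last_election": True, "follows_politics": False},
    {"registered": False, "voted_last_election": False, "follows_politics": True},
]

# Keep only respondents above a cutoff when modeling the electorate.
likely = [r for r in sample if likely_voter_score(r) >= 2]
```

The open research question the article describes is essentially which questions to include and how to weight them so that the screened sample matches the electorate that actually shows up.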
In 2014, TAPS data showed that the are-you-a-registered-voter question is the most important one in the likely-voter screen, according to Smith. Ongoing research this year will help determine whether that finding holds for presidential cycles, when turnout is higher, and not just for midterm elections.
Eventually, researchers will be able to compare participants’ pre-election responses against records from their secretary of state’s office showing whether they actually voted. “This has only been in operation for three years, so we are just beginning to realize the full value of the accumulated data on specific individuals,” said Smith.
TAPS also sometimes looks at “lifestyle” data, such as movies watched or ice cream flavor favorites, in order to create a fuller portrait of what makes a Democrat or a Republican, or any number of other demographic groups.
So could this model reshape the future of political polling?
Not in the short term. Currently, participants have the whole month to answer, although roughly half respond within the first 48 hours. A month is a far longer time in the field than the four to five days commonly used for traditional horse-race polls. Anything longer than that weakens the validity of the “snapshot” of voter preferences.
Traditional telephone polling, meanwhile, has remained robust for longer than some had imagined.
“For elections -- an outcome for which pollsters can be held immediately accountable -- [online polling] isn’t performing any better than, and arguably worse than, standard telephone calls,” said Janine Parry, director of the Arkansas Poll. “Phone surveys are still doing pretty well overall.”
Over the long term, though, the online panel model shows potential.
Smith said technological convergence has already begun to blur the line between the phone and the Internet; think of smartphones. Taking things a step further, online surveying lets pollsters insert videos, such as campaign ads, into polls to gauge voter response. Testing such material -- once possible only with in-person focus groups -- can yield a more subtle understanding than traditional polls, which draw a fresh sample of respondents each time.
“We think that probability-based Web panels are an important part of the present and future of U.S. survey research,” said Courtney Kennedy, director of survey research at the Pew Research Center. “They leverage new technology; they feature self-administration, which has measurement advantages; they avoid the expense of recruiting a new sample of adults for each survey; and they cover nearly all U.S. adults.”