How accurate will midterm polling be?

The polling industry has a lot on the line heading into Tuesday's midterm election.

Critics blamed pollsters when voters were caught off guard by Donald Trump's election in 2016. Old cries of "don't believe the polls" became fevered shouts. And the president has encouraged distrust by calling certain polls "fake" and claiming they are used to "suppress" the vote.

Although there is no evidence to suggest that is true, there is persistent and widespread suspicion about polling, according to, you guessed it, a McClatchy-Marist poll. And it exists on both sides, albeit in different forms.

"I think Democrats may have felt let down by the polls but don't think it was an intentional error. I think many Republicans believe the polling errors of 2016 were intentional," GOP pollster and co-founder of Echelon Insights Kristen Soltis Anderson told CNN.

So can the industry regain trust?

Since 2016 there's been a whole lot of self-reflection in the polling world. Pollsters have tweaked their techniques; pundits have become more cautious when talking about polls; and news outlets have conducted some fascinating experiments.

On Tuesday, all the efforts are being put to the test.

"Some pollsters would disagree with this, but the way that the public generally views whether or not polling is accurate is whether or not it gets the results of the election right," CNN analyst Harry Enten said on "Reliable Sources."

"I'm not necessarily sure that's fair," Enten said, "but I do think that there is more pressure on pollsters this year to get it right given the president's rhetoric and given what happened in 2016."

Many, though not all, 2016 polls underestimated support for Trump. The effect was particularly pronounced at the state level, where embarrassing "misses" showed Hillary Clinton with safe leads in states Trump actually carried.

Most national polls accurately showed Clinton winning the popular vote. But reporters and commentators made lots of mistakes in their interpretations of the polls. Readers and viewers did, too. Many people discounted the margin of error and other sources of uncertainty and made the faulty assumption that Trump would lose to Clinton.

There were other problems, too. Predictive features on websites gained lots of traffic before the election but caused lots of consternation afterward. HuffPost's model infamously showed Clinton with a 98 percent chance of winning. "We blew it," the site admitted afterward.

But just as importantly, HuffPost's Natalie Jackson tried to explain why.

Other news outlets have also tried to be more transparent and remind voters of what polls cannot convey.

In special elections since 2016, Democrats have repeatedly outperformed polls of their races.

The top example was the Virginia governor's race. "Ralph Northam was favored by three points. He ended up winning by nine," Enten said.

But past outcomes are not an indicator of future results.

"I think many pollsters and forecasters have tried to be much more intentional about explaining uncertainty and being humble about what data can and can't tell us," Anderson said. "Because I think there was a big sense that in 2016, there was more certainty conveyed than may have been justified by the available data."

So political pros and reporters are communicating poll results differently this time. Time magazine's Molly Ball, who has a no-predictions rule for herself, said that even people who do make predictions are adding more caveats: There's "less of the, 'Well, the needle shows this' and more of, 'Here's what it doesn't show, here's what we should always remember can happen about probabilities.'"

Early voting has been explosive in the midterms, indicating above-average enthusiasm among both Democrats and Republicans. Pollsters have to make assumptions about turnout when contacting "likely voters," and this is a difficult election to forecast.

The 2018 electorate is "a universe that doesn't exist yet," Democratic pollster Margie Omero said. "I mean, people don't know whether they're going to vote, some people."

They may tell a pollster that they're sure to vote, but never make it to the ballot box. Or they might change who they're voting for.

Conversely, certain subsets of voters may have a big impact on the final results without really showing up in the pre-election polling. If pollsters assume relatively low youth turnout, but lots of young people vote for the first time, that could cause big surprises in certain races.

The vast majority of people who are called by pollsters decline to participate, so the researchers have to make a huge number of phone calls, bend over backwards to reach a representative sample of people, and weight their results accordingly.
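Weighting is easier to see with a toy example. The sketch below illustrates one common technique, post-stratification, with invented respondents and invented population targets; it is a generic illustration, not the method used by any pollster named in this story.

```python
# Toy post-stratification example: respondents in each age group are
# reweighted so the sample matches assumed population shares.
# All numbers here are invented for illustration.
from collections import Counter

# (age_group, stated_vote) for a tiny hypothetical sample
respondents = [
    ("18-34", "D"), ("18-34", "D"),            # young voters are scarce in this sample
    ("35-64", "R"), ("35-64", "D"), ("35-64", "R"), ("35-64", "R"),
    ("65+", "R"), ("65+", "R"), ("65+", "D"), ("65+", "R"),
]

# Assumed shares of the electorate (the pollster's turnout model)
population_share = {"18-34": 0.30, "35-64": 0.45, "65+": 0.25}

n = len(respondents)
sample_share = {g: c / n for g, c in Counter(g for g, _ in respondents).items()}

# Each respondent's weight = population share of their group / sample share
weights = {g: population_share[g] / sample_share[g] for g in population_share}

totals = Counter()
for group, vote in respondents:
    totals[vote] += weights[group]

weighted_total = sum(totals.values())
for vote, share in sorted(totals.items()):
    print(f"{vote}: {share / weighted_total:.1%}")
```

In this made-up sample the raw split is 40-60, but after weighting toward the assumed turnout mix it becomes 47.5-52.5. That gap is exactly why the turnout assumptions Omero describes matter so much.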

Some polls are higher quality than others. Most news outlets tend to favor live interviewers, as opposed to computerized systems, and a mix of landline and cell phone calls. But some outlets are wading into web-based polling. CNN's polling standards preclude reporting on web polls.

This fall The New York Times pulled back the curtain by conducting "live polling" and publishing the results in real time, call by call. Working with Siena College, the surveyors made 2,822,889 calls and completed 96 polls of House and Senate races.

"We wanted to demystify polling for people," said Nate Cohn of The Times' Upshot blog.

"From our point of view, it's almost a miracle how accurate polls usually are, given all the challenges," Cohn said in an interview with CNN.

He emphasized that polls are "very fuzzy things." And the real-time polling showed this to the public. The researchers sought to interview about 500 people for each race that was examined.

In Iowa's fourth congressional district, for example, 14,636 calls resulted in 423 interviews.

The results showed the incumbent, far-right congressman Steve King, with 47% support, and his Democratic challenger J.D. Scholten with 42%.

The Times characterized this as a "slight edge" for King, with lots of room for error. "The margin of sampling error on the overall lead is 10 points, roughly twice as large as the margin for a single candidate's vote share," the Times explained on its website.
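For readers wondering where "roughly twice" comes from, here is a back-of-envelope check. It assumes a simple random sample of 423 respondents and the standard variance formula for the difference of two shares from the same sample, and it ignores the design effects from weighting, which is why it comes in just under the Times' published 10-point figure. It is not the Times' own calculation.

```python
# Back-of-envelope check of the "roughly twice" rule for the margin on a lead,
# assuming simple random sampling and ignoring weighting/design effects.
from math import sqrt

n = 423                          # completed interviews in the Iowa poll
p_king, p_scholten = 0.47, 0.42  # reported vote shares

# 95% margin of error for a single candidate's share
moe_single = 1.96 * sqrt(p_king * (1 - p_king) / n)

# For two shares from the same sample, the variance of the difference picks up
# a covariance term: Var(p1 - p2) = [p1(1-p1) + p2(1-p2) + 2*p1*p2] / n
var_lead = (p_king * (1 - p_king) + p_scholten * (1 - p_scholten)
            + 2 * p_king * p_scholten) / n
moe_lead = 1.96 * sqrt(var_lead)

print(f"single-share MOE: {moe_single:.1%}")  # ~4.8 points
print(f"lead MOE:         {moe_lead:.1%}")    # ~9 points, roughly twice as large
```

In other words, a 5-point lead with a roughly 9- or 10-point margin of error on the lead itself is well within the range where the race could go either way.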

Cohn's final pre-election story noted that "even modest late shifts among undecided voters or a slightly unexpected turnout could significantly affect results."

That's the kind of language that lots of polling experts are incorporating into their stories and live shots, especially in the wake of the 2016 election.

"With polling, you never actually get to the truth," Cohn said. "You inch towards it, and you think you end up within plus or minus 5 points of it at the end."

As Enten put it, "polls are tools," not meant to be perfect. But that message needs to be reinforced by the news media.