
Q&A of the Day – About the Election’s Exit Polls & How Accurate are Exit Polling Results? 

Each day I feature a listener question sent by one of these methods.       

Email: brianmudd@iheartmedia.com      

Social: @brianmuddradio     

iHeartRadio: Use the Talkback feature – the microphone button on our station’s page in the iHeart app.        

Today’s Entry: Hi Brian, After the election, there was much voter analysis. Our votes are not associated with us in any way, as far as I know. Yet the analysis provides insights such as "Voters without a college degree went for Trump by 12 points; college-educated voters went for Harris by 15 points. Harris’ showing among college-educated voters was 1 point worse than Biden’s 2020 showing among college-educated voters, while Trump bettered his 2020 numbers among noncollege voters by 4 points" or "Catholic swing voters were critical to Donald Trump's blowout victory." How is this analysis done and how much can we rely on it? 

Bottom Line: These are great questions that are worth diving into. To your point, researchers aren’t allowed to look at your ballot, match it to you as a voter, and pull demographic information about you to include in post-election surveys (but boy would the political parties love to be able to do that if they could). The information about how voters voted, which you often hear recited following an election, is derived from exit polls. Exit polls are surveys in which pollsters/researchers ask voters, after they’ve voted, for permission to collect demographic information about them along with how they voted. There are numerous organizations that conduct exit polls, but for presidential elections the surveys are most often collaborative efforts undertaken by news organizations.  

For this year’s presidential election there were two major accredited exit poll surveys. One was conducted by Edison Research for ABC, CBS, CNN and NBC under the banner of the National Election Pool, of which those news organizations are members. As is stated: "The National Election Day exit poll is conducted at 279 polling places, 27 early in-person voting locations, and using an RBS (Registration Based Sample) multi-mode poll. Within each voting location an interviewer approaches every tenth voter as they exit the voting location. A target of approximately 75 voters complete a questionnaire at each voting location."  

In total that means an estimated 22,950 voters participated in the survey at polling locations across the country (306 locations × roughly 75 completed questionnaires each).  
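For anyone who likes to see the arithmetic, here’s a minimal Python sketch of that systematic-sampling setup, using only the figures quoted above (306 locations, every tenth exiting voter approached, a target of roughly 75 completed questionnaires per location). The response rate in the simulation is a made-up assumption purely for illustration, not an Edison figure.

import random

# Figures quoted above: 279 Election Day sites + 27 early-vote sites,
# every 10th exiting voter approached, ~75 completed questionnaires per site.
LOCATIONS = 279 + 27
TARGET_COMPLETES = 75
APPROACH_INTERVAL = 10

def poll_one_location(target=TARGET_COMPLETES, interval=APPROACH_INTERVAL,
                      response_rate=0.45):
    """Approach every 10th exiting voter until the target number of
    questionnaires is completed. The response rate here is hypothetical."""
    completes = 0
    voters_exited = 0
    while completes < target:
        voters_exited += interval            # skip ahead to the next 10th voter
        if random.random() < response_rate:  # assumed chance the voter agrees
            completes += 1
    return completes, voters_exited

expected_total = LOCATIONS * TARGET_COMPLETES   # 306 * 75 = 22,950
print(f"Locations: {LOCATIONS}, expected completed interviews: {expected_total:,}")

completes, exited = poll_one_location()
print(f"One simulated location: {completes} completes after ~{exited:,} voters exited")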

The other was the Associated Press/National Opinion Research Center exit poll conducted for Fox News, PBS, The Wall Street Journal and The Associated Press. The methodology for this one is different. As opposed to leaning on in-person surveys at polling locations across the country, as is stated: "The (survey) takes interviews with a random sample of registered voters drawn from state voter files and combines them with interviews from self-identified registered voters selected using nonprobability approaches. It also includes interviews with self-identified registered voters conducted using NORC’s probability-based AmeriSpeak® panel, which is designed to be representative of the U.S. population." Many of the survey participants are compensated for participating. In total, 110,000 voters were reached for the survey this year. So, about these surveys... 

Basically, it’s like this. It costs a lot of money to conduct comprehensive nationwide surveys. The media outlets behind these exit poll surveys all contribute to cover the costs and in return are able to use the results, branding them as if they’re exclusive to the organization. For example, Fox News presents the information as “Fox News Voter Analysis,” while the Associated Press calls its exit poll service “AP VoteCast.” It’s all the same information. This, by the way, is the reason why almost all exit poll data sounds the same regardless of the news organization you’re hearing or reading it from. This reality, that it is all the same information (usually from no more than two different sources), also helps reinforce the perceived accuracy of the results. If seemingly every news source you turn to is producing the same exit poll results...it must be true, right?  

These collaborative surveys stand in contrast to traditional polling in which all of the major news organizations conduct their own polls and may all produce different results even if conducted at the same time. And that takes us to the question of the overall accuracy of these results.  

So how accurate are these exit surveys really? That’s a fair question, and there’s good reason to believe they’re highly accurate. I’ll illustrate the point using traditional polls. Since these exit polls are nationwide surveys, the national popular vote is the best comparison.  

In Tuesday’s presidential election, the average of the accredited national pollsters had Harris +0.1%. Essentially, the average pollster projected the national popular vote as even. As of this entry, Donald Trump leads by 3.3%, with about 40% of the California vote outstanding, meaning that margin will probably come down a little. I’d estimate that Trump finishes with a win of just under 3%. So, in other words, the average pollster was off by about 3 points. However, the size of the average polling sample was only 2,651 people. Consider this: the largest polling sample conducted by any national pollster was Rasmussen Reports’ final poll, with 12,546 likely voters. What did their survey say? Trump +3 nationally. It looks like they nailed it.  
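To put rough numbers on why sample size matters, here’s a back-of-the-envelope Python sketch using the textbook simple-random-sample margin of error, 1.96 × sqrt(p(1-p)/n) at 95% confidence with the worst case p = 0.5, for each of the sample sizes cited in this piece. Real exit polls use cluster sampling and weighting, so their actual error bands run somewhat wider than this simple formula suggests.

import math

def margin_of_error(n, z=1.96, p=0.5):
    """Standard simple-random-sample margin of error at 95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample sizes cited in this piece
samples = {
    "Average national poll": 2_651,
    "Largest national poll (Rasmussen)": 12_546,
    "Edison in-person exit poll": 22_950,
    "AP/NORC (VoteCast)": 110_000,
}

for name, n in samples.items():
    print(f"{name:<36} n = {n:>7,}  MOE ~ +/-{margin_of_error(n):.2%}")

Roughly speaking, that works out to about plus or minus 1.9% for the average national poll versus about plus or minus 0.3% for the AP/NORC sample, which is the intuition behind the next point.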

In accredited research, the larger the sample size, the more accurate the results are likely to be. That takes us back to the exit polls. The Edison in-person sample was nearly nine times larger than the sample for the average national political poll and nearly double the size of the largest (and, as it turned out, highly accurate) poll of this cycle. The AP/NORC exit poll had a sample that was over 41 times the size of the average national poll and nearly 9 times larger than the largest poll. This is to say that these large sample sizes lend additional credibility to the likelihood that the results are highly accurate. So, with that in mind, here’s what the surveys said: 

Trump’s percentage of the vote by group, with the AP/NORC survey figure first and the Edison figure second: 

  • Men: 54% - 55% 
  • Women: 46% - 45% 
  • White: 56% - 57% 
  • Black: 16% - 13% 
  • Hispanic: 42% - 46% 
  • No College Degree: 55% - 56% 
  • College degree: 42% - 42% 
  • Under 30: 46% - 43% 
  • Over 30: 51% - 51% 

Right down the line the results are almost identical, with two exceptions: Black and Hispanic voters. Notably, the NORC survey had Trump doing 3 points better with Black voters than Edison did, while Edison had Trump doing 4 points better with Hispanic voters than the NORC survey did. I know enough about demographic research to offer a likely explanation for what happened. There tends to be a cultural distinction in ethnic identity based upon one’s heritage. In other words, some people identify as Black whom others may consider Hispanic, and vice versa. That Trump’s level of minority support was essentially equal between the two surveys is likely explained by that distinction.  
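If you want to line the two surveys up yourself, here’s a tiny Python sketch that simply restates the numbers from the list above (AP/NORC first, Edison second) and flags the groups where the two surveys diverge by three points or more.

# Trump's percentage by group: (AP/NORC, Edison), as listed above
trump_share = {
    "Men": (54, 55),
    "Women": (46, 45),
    "White": (56, 57),
    "Black": (16, 13),
    "Hispanic": (42, 46),
    "No college degree": (55, 56),
    "College degree": (42, 42),
    "Under 30": (46, 43),
    "Over 30": (51, 51),
}

for group, (norc, edison) in trump_share.items():
    gap = edison - norc
    flag = "  <- notable gap" if abs(gap) >= 3 else ""
    print(f"{group:<18} AP/NORC {norc}%  Edison {edison}%  gap {gap:+d}{flag}")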

So, there you go. Probably everything you wanted to know about exit polls and then some. But yeah, as far as survey work goes, there’s good reason to believe they’re the most accurate polls that are conducted during any election cycle.  
