If there’s one thing everyone can agree on about American politics, it’s that we are extremely polarized. Red and blue are so diametrically opposed and hostile to one another that they might as well be different countries. Fewer voters than ever are switching the party they vote for, a pattern that has only deepened over the last eight years.
But wait! If you are following the horse race coverage of the election, the one thing that everyone can agree on is that about 1 in 6 Americans are undecided or have weak vote preferences, driving the constant swings in each day’s new batch of polls, especially in the battleground states. (And even more so in “the demographic that will determine the winner!”— like noncollege voters, or whatever the conventional wisdom is fixated on at the moment.)
Both of these ideas cannot be true, and they aren’t. In reality, there are far fewer genuinely persuadable voters in America than there are survey respondents who say they are “undecided.” But we’re fooled by two things.
The first problem is statistical noise. Here’s an illustrative thought experiment. Let’s say we’ve been asking the same 2,000 people every week for the last year which presidential candidate they think they’ll vote for. You would expect to see almost no change week to week, and not much more change over a year or more. Indeed, that’s the result we get in the real world from panel studies that do exactly this. Yet if we ask a different set of 2,000 people every week, we will invariably see swings — not because individuals are changing their minds, but because we are asking different people.
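The thought experiment above can be sketched as a quick simulation. Everything here is hypothetical: an electorate of one million voters with completely fixed preferences (48 percent for candidate "A," 47 percent for "B," 5 percent undecided), a panel design that re-interviews the same 2,000 people each week, and a fresh-sample design that draws a new 2,000 each week. No one in the simulated electorate ever changes their mind, yet the fresh samples still swing by a couple of points.

```python
import random

random.seed(42)

# Hypothetical electorate with FIXED preferences: 48% candidate A,
# 47% candidate B, 5% undecided. Nobody ever changes their mind.
population = ["A"] * 480_000 + ["B"] * 470_000 + ["U"] * 50_000
random.shuffle(population)

def support_for_A(sample):
    """Share of a sample backing candidate A, in percentage points."""
    return 100 * sum(1 for v in sample if v == "A") / len(sample)

# Panel design: interview the SAME 2,000 people every week.
panel = random.sample(population, 2_000)
panel_readings = [support_for_A(panel) for week in range(10)]

# Fresh-sample design: interview a DIFFERENT 2,000 people every week.
fresh_readings = [support_for_A(random.sample(population, 2_000))
                  for week in range(10)]

print("Panel (same people each week):", panel_readings)
print("Fresh samples:                ",
      [round(x, 1) for x in fresh_readings])
print("Fresh-sample spread:",
      round(max(fresh_readings) - min(fresh_readings), 1), "points")
```

The panel line is flat by construction, while the fresh samples wander around the true 48 percent simply because different people are being asked. That wander is pure sampling noise, but it looks exactly like a campaign narrative of voters "moving."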
The second problem is that when taken literally, poll questions exaggerate how many voters are actually movable. To survey respondents, choosing “undecided” doesn’t always mean what reporters writing about polls want it to mean — open to voting for Harris or Trump. Rather, many people say they’re “undecided” to express their ambivalence about the choice they will probably make in the end.
That’s because question writing is more difficult than it appears. What the words mean to the pollster is not always what they mean to the survey taker. That’s why, when I was political director at the AFL-CIO, we took cues from the social sciences and always experimented with different ways of asking the same question. That way, we could be sure we were getting responses to the question we intended. Unfortunately, most media polling analysts proceed with complete confidence that the responses answer the question the pollsters intended, or the question readers of survey results presume was asked.
Consider Sara, a 24-year-old “undecided” survey respondent. Like most young people, she thinks the country is on the wrong track and resists identifying with either party. She doesn’t always go out of her way to vote. She had no enthusiasm whatsoever about President Joe Biden, and she isn’t sure yet how she feels about Vice President Kamala Harris. But since 2018, people like Sara have turned out in historic numbers that have made the difference between Democrats’ losses in 2016 and subsequent victories. The patterns of these turnout surges indicate that these voters are anti-MAGA, but not necessarily pro-Democrat. They are motivated by loss aversion — fearing (correctly) that they could lose their freedoms if MAGA wins. So, when Sara is asked which candidate she favors, she really doesn’t “favor” either of them. But if she votes, it won’t be for Trump.








