Quiz: A Brief History of Polling and Political Measurement
Questions
1. The earliest recorded straw poll in American politics was conducted in which year?
   - (a) 1789
   - (b) 1824
   - (c) 1860
   - (d) 1916
2. The Literary Digest poll of 1936 surveyed approximately how many people?
   - (a) 50,000
   - (b) 500,000
   - (c) 2.4 million
   - (d) 10 million
3. What was the primary source of error in the 1936 Literary Digest poll?
   - (a) The questions were poorly worded
   - (b) The sampling frame overrepresented affluent Americans
   - (c) The poll was conducted too early
   - (d) The poll used probability sampling incorrectly
4. George Gallup's key insight in 1936 was that:
   - (a) Larger samples always produce better results
   - (b) Mail surveys are more accurate than phone surveys
   - (c) Representativeness matters more than sample size
   - (d) Polling should only be done in the final week of a campaign
5. The method used by Gallup and other early scientific pollsters, in which interviewers filled demographic quotas using their own judgment, is called:
   - (a) Probability sampling
   - (b) Stratified random sampling
   - (c) Quota sampling
   - (d) Snowball sampling
6. In the 1948 election, all major pollsters predicted that ____ would win, but ____ actually won.
   - (a) Roosevelt; Landon
   - (b) Dewey; Truman
   - (c) Truman; Dewey
   - (d) Eisenhower; Stevenson
7. Which of the following was NOT identified as a cause of the 1948 polling failure?
   - (a) Interviewers selected unrepresentative respondents within quotas
   - (b) Pollsters stopped surveying too early
   - (c) The sample size was too small
   - (d) Undecided voters were not allocated to candidates
8. Random digit dialing (RDD) is associated with which type of sampling?
   - (a) Quota sampling
   - (b) Convenience sampling
   - (c) Probability sampling
   - (d) Snowball sampling
9. According to the chapter, telephone polling response rates fell from approximately 36% in 1997 to less than what percentage by 2024?
   - (a) 20%
   - (b) 12%
   - (c) 8%
   - (d) 4%
10. Which of the following is cited as a major cause of the decline in telephone polling response rates?
    - (a) People no longer have opinions about politics
    - (b) Caller ID, robocalls, and declining institutional trust
    - (c) The elimination of landline telephones by law
    - (d) Pollsters began asking too many questions
11. The primary methodological concern with non-probability online panels is:
    - (a) They are too expensive to operate
    - (b) They cannot include visual elements in surveys
    - (c) Panelists are self-selected and may not be representative
    - (d) They can only be used for market research, not political polling
12. Meridian Research Group's hybrid polling approach includes all of the following EXCEPT:
    - (a) Live telephone interviews with landline and cell phone numbers
    - (b) Text-to-web surveys sent to randomly selected registered voters
    - (c) In-person door-to-door interviews
    - (d) Online panel respondents from a probability-based panel
13. After the 2016 polling errors, what methodological adjustment did many pollsters adopt?
    - (a) Switching entirely to online panels
    - (b) Weighting by education level
    - (c) Eliminating cell phone samples
    - (d) Reducing sample sizes
14. Vivian Park's "Worry List" is best described as:
    - (a) A list of competitors who might take Meridian's clients
    - (b) A running catalog of potential sources of error in the firm's methodology
    - (c) A record of polls that Meridian got wrong
    - (d) A list of survey questions that are too sensitive to ask
15. The chapter's central theme, "Measurement Shapes Reality," means:
    - (a) Polls always change the outcome of elections
    - (b) The methods used to measure opinion determine whose opinions are counted and how they are represented
    - (c) Better measurement technology guarantees better predictions
    - (d) Political reality can be understood only through quantitative measurement
Answer Key
1. (b) 1824. The Harrisburg Pennsylvanian reported candidate preferences at a public gathering in Wilmington, Delaware.
2. (c) 2.4 million. The Digest mailed out approximately 10 million postcards, and 2.4 million were returned.
3. (b) The sampling frame overrepresented affluent Americans. The Digest drew names from telephone directories and automobile registrations, which in 1936 were markers of prosperity that correlated with Republican preference.
4. (c) Representativeness matters more than sample size. Gallup correctly predicted Roosevelt's win with a sample of only a few thousand, compared to the Digest's 2.4 million.
5. (c) Quota sampling. In this method, interviewers fill specified demographic quotas but have discretion in selecting individual respondents within each quota.
6. (b) Dewey; Truman. All major pollsters predicted Dewey would win. The Chicago Tribune famously printed the headline "DEWEY DEFEATS TRUMAN."
7. (c) The sample size was too small. The other three factors (interviewer selection bias within quotas, early cessation of polling, and failure to allocate undecided voters) were all identified as causes; sample size itself was adequate.
8. (c) Probability sampling. RDD generates random telephone numbers, giving every household with a telephone a known probability of selection.
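The chapter describes RDD only in prose; the following is a minimal, hypothetical sketch of the core idea. The area code, sample size, and NANP-style digit rules here are illustrative assumptions, not details from the chapter.

```python
import random

def random_digit_dial(area_code: str, n: int, seed: int = 0) -> list[str]:
    """Generate n random phone numbers within one area code, so every
    line in that area code has the same chance of being selected,
    including unlisted numbers that a directory-based frame would miss."""
    rng = random.Random(seed)
    numbers = []
    for _ in range(n):
        exchange = rng.randint(200, 999)   # first digit 2-9, per NANP rules
        line = rng.randint(0, 9999)        # any 4-digit line number
        numbers.append(f"{area_code}-{exchange:03d}-{line:04d}")
    return numbers

sample = random_digit_dial("415", 5)
```

Because numbers are generated rather than drawn from a directory, the selection probability is known for every telephone household, which is what makes RDD a probability-sampling method.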
9. (d) 4%. The chapter states response rates fell to "less than 4 percent" by 2024.
10. (b) Caller ID, robocalls, and declining institutional trust. The chapter identifies multiple reinforcing causes, including technology that enables call screening, the explosion of robocalls and scams, and declining trust in institutions.
11. (c) Panelists are self-selected and may not be representative. Non-probability panels recruit through convenience methods, meaning participants volunteer rather than being randomly selected.
12. (c) In-person door-to-door interviews. Meridian's hybrid approach combines live telephone interviews, text-to-web surveys, and probability-based online panel respondents. In-person interviews are not part of its standard methodology.
13. (b) Weighting by education level. The AAPOR post-mortem identified education-based nonresponse bias as a key factor in 2016 errors, and many pollsters subsequently began weighting by education.
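The chapter names the adjustment but not its mechanics; here is a minimal post-stratification sketch. The education shares below are hypothetical illustrations, not figures from the chapter.

```python
def education_weights(sample_shares: dict, population_shares: dict) -> dict:
    """Post-stratification: each group's weight is its population share
    divided by its sample share, so overrepresented groups count less
    and underrepresented groups count more."""
    return {g: population_shares[g] / sample_shares[g] for g in sample_shares}

# Hypothetical shares: college graduates answer surveys more often,
# so they are overrepresented in the raw sample.
weights = education_weights(
    sample_shares={"college": 0.50, "non_college": 0.50},
    population_shares={"college": 0.35, "non_college": 0.65},
)
```

With these inputs, college-educated respondents get a weight below 1 and non-college respondents a weight above 1, which is the direction of the correction pollsters adopted after 2016.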
14. (b) A running catalog of potential sources of error in the firm's methodology. Vivian updates it after every election cycle to track new and persistent sources of potential bias.
15. (b) The methods used to measure opinion determine whose opinions are counted and how they are represented. The chapter traces how each era's technology (straw polls, telephone surveys, online panels) shaped which voices were included in "public opinion."