
Poll results that try to show a change (or lack thereof) in an opinion over time are usually reported similarly to this answer:

As of mid-December 2018, 59% of Republicans thought it was 'very likely' that Trump would get Mexico to pay for the wall (question 6A), and 9% believed he had 'already accomplished' this.

As a comparison, only 46% of Republicans thought it was 'very likely' in April 2017 (question 4), 47% in November 2017 (question 6), and 54% in March 2018 (question 6A).

However, one theory I've seen for these numbers is that roughly the same number of people would respond "Very likely", but the number of people who identify as Republicans has gone down, so those respondents now make up a larger share of the remaining Republicans.
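One way to sanity-check that theory is to convert the within-party percentages into shares of all adults, which separates a shrinking party from a genuine change in opinion. The sketch below does this with hypothetical party-identification figures; the real numbers would have to come from the linked crosstabs.

```python
# Hedged sketch: convert within-party poll results into shares of all adults.
# The party-identification shares below are HYPOTHETICAL placeholders, not poll data.

waves = [
    # (label, % of Republicans answering "very likely",
    #  hypothetical % of adults identifying as Republican)
    ("Apr 2017", 46, 28),
    ("Nov 2017", 47, 26),
    ("Mar 2018", 54, 25),
    ("Dec 2018", 59, 24),
]

for label, pct_very_likely, pct_republican in waves:
    # Share of the whole adult population this subgroup response represents.
    share_of_all_adults = (pct_very_likely / 100) * (pct_republican / 100)
    print(f"{label}: {pct_very_likely}% of Republicans "
          f"~= {share_of_all_adults:.1%} of all adults "
          f"(assuming {pct_republican}% identify as Republican)")
```

If the all-adults share stays roughly flat while the within-party share rises, that would be consistent with the "shrinking denominator" theory.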

What would be a good way to report poll results which takes changing party affiliations (or other demographic changes) into account? Should all comparative poll results just include the percentages of the population that reported being affiliated with each party? Should poll results always provide total ("all American") numbers?

  • Polling is as much art as it is science. Without access to the raw data, or context confirming that the polling methods were the same, I put very little weight on comparing polls or meta-polls, because the wording of a question can dramatically influence the results. Unless the questions are identical, they are not comparable. Here, they use the same phrase, but the context is different. The polls aren't the same throughout, and that also affects the answers. Anyone good at sales knows the flow of questions can change results.
    – David S
    Commented Jan 14, 2019 at 17:39
  • @DavidS The raw crosstabs and methods are linked from the answer I took that quote from. It's the same firm polling the same group in the same way, so it's about as comparable as it's possible to get. It's entirely valid to not trust the polls if you don't want to, but good pollsters are usually pretty accurate.
    – Bobson
    Commented Jan 14, 2019 at 17:58
  • 1
    I have narrowed down the scope to US because the post seems to refer to US only examples. Also, the only answer has US specific information, even if it might be relevant for other countries as well.
    – Alexei
    Commented Jun 14, 2019 at 7:49

1 Answer


What would be a good way to report poll results which takes changing party affiliations (or other demographic changes) into account? Should all comparative poll results just include the percentages of the population that reported being affiliated with each party?

The best way to report the results depends on what the reader is interested in finding out. If you want to know whether support has changed among the nation as a whole, reporting it as a share of all Americans makes sense. But if your goal is, for example, to determine whether Trump is likely to win a GOP primary, the change in the percentage of people who identify as Republican may be irrelevant.
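As a concrete illustration of that choice, here is a minimal sketch showing how the same raw crosstab produces two different headline numbers depending on the denominator. The respondent counts are made up for illustration, not taken from the polls above.

```python
# Hedged sketch with MADE-UP respondent counts: one crosstab, two reporting choices.
total_respondents = 1000               # hypothetical sample of all adults
republican_respondents = 250           # hypothetical count identifying as Republican
republicans_very_likely = 140          # hypothetical "very likely" answers among Republicans

within_party = republicans_very_likely / republican_respondents
among_all_adults = republicans_very_likely / total_respondents

# The within-party figure matters for a GOP primary; the all-adults figure
# matters for gauging national support.
print(f"Share of Republicans saying 'very likely': {within_party:.0%}")
print(f"Share of all adults saying 'very likely':  {among_all_adults:.0%}")
```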

Should poll results always provide total ("all American") numbers?

The most credible polls make available to interested readers not just a summary of their key results, but the full results along with the relevant footnotes on margins of error for the overall sample and its subsets, and other important points of methodology.
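Those subset footnotes matter because the margin of error for a subgroup such as "Republicans only" is noticeably wider than for the full sample. A rough sketch, assuming simple random sampling, a 95% confidence level, and hypothetical sample sizes (real polls apply weighting and design effects, so actual margins differ):

```python
# Hedged sketch: approximate 95% margin of error for a proportion under
# simple random sampling (normal approximation).
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the confidence interval for a proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

full_sample_n = 1000        # hypothetical total respondents
republican_subset_n = 250   # hypothetical Republican respondents

print(f"Full sample MOE at p=0.5: +/- {margin_of_error(0.5, full_sample_n):.1%}")
print(f"GOP subset MOE at p=0.5:  +/- {margin_of_error(0.5, republican_subset_n):.1%}")
```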

