• 13igTyme@lemmy.world
      2 days ago

      “Study using self-reported data shows that those more interested in politics are more likely to self-report data with post-election surveys. More at 11.”

      They literally say they are using self-reported post-election surveys. Most people I know, myself included, have never taken a post-election survey. People who don't vote also aren't participating in post-election surveys. It's an interesting study, but this is 100% textbook selection bias, and I'm surprised Pew Research Center missed the mark on this one.

      If progressives voted in overwhelming numbers, then Bernie would have won the primary. I voted for Bernie, but clearly not many others did.

      • pjwestin@lemmy.world
        2 days ago

        Not that I should even have to debate this, since my source is the Pew Research Center and yours is "most people I know," but that's a blatant misrepresentation of the methodology. The survey uses data from a group of randomly selected panelists, not self-reported post-election surveys.

        The American Trends Panel (ATP), created by Pew Research Center, is a nationally representative panel of randomly selected U.S. adults. Panelists participate via self-administered web surveys. …The ATP was created in 2014, with the first cohort of panelists invited to join the panel at the end of a large, national, landline and cellphone random-digit-dial survey that was conducted in both English and Spanish. Two additional recruitments were conducted using the same method in 2015 and 2017, respectively. Across these three surveys, a total of 19,718 adults were invited to join the ATP, of whom 9,942 (50%) agreed to participate.

        The only reference to self-reporting I found was people self-reporting whether or not they voted, and even then, that was independently verified. I’m pretty sure you clicked the first link you saw, scrolled down until you found this paragraph, and didn’t read it very carefully:

        Voter turnout and vote choice in the 2020 election is based on two different sources. First, self-reports of candidate choice were collected immediately after the general election in November 2020 (ATP W78). Secondly, ATP panelists were matched to commercial voter file databases to verify that they had indeed voted in the election. For more details, see “Behind Biden’s 2020 Victory.”

        Also, if Bernie's failure to win the Democratic primary proves progressives don't vote, then it stands to reason that Clinton's and Harris's defeats prove moderates don't vote either, right? It seems stupid to me to make broad, sweeping generalizations about voter behavior from something with as many variables as an election, but if that's what you want to do, then you must concede that Harris and Clinton prove that moderates don't vote.

                  • pjwestin@lemmy.world
                    1 day ago

                    Wow, great point! Except for two things: first, if people with strong political beliefs were more likely to reply to surveys, wouldn't that mean basically every political survey was inherently biased? Second, if you look under the section of Appendix A marked "Incentives," you can see that they corrected for sampling bias by offering higher incentives to groups that have a lower response rate:

                    All respondents were offered a post-paid incentive for their participation. Respondents could choose to receive the post-paid incentive in the form of a check or a gift code to Amazon.com or could choose to decline the incentive. Incentive amounts ranged from $5 to $20 depending on whether the respondent belongs to a part of the population that is harder or easier to reach. Differential incentive amounts were designed to increase panel survey participation among groups that traditionally have low survey response propensities.

                    Anyway, do you want to keep trying to prove that the Pew Research Center doesn't know how to conduct a survey, or are you finally tired of digging?