June 28 (Reuters) - A group of U.S. voters who were unable to choose between Joe Biden and Donald Trump before Thursday’s presidential debate delivered their verdicts after the contest and it was almost universally bad news for Biden.
Of the 13 “undecideds” who spoke to Reuters, 10 described the 81-year-old Democratic president’s performance against Republican candidate Trump collectively as feeble, befuddled, embarrassing and difficult to watch.
You don’t publish initial results without a meaningful population sample. Thirteen people is an abysmal sample size. A common rule of thumb is to sample roughly 10% of the population, capped at about 1,000 respondents, because the margin-of-error curve levels out around there. You need 100 people at a bare minimum to get anything remotely confident. The uncertainty on this poll is so large that publishing it is irresponsible and unethical.
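For a rough sense of scale, here’s a back-of-the-envelope sketch using the standard normal-approximation margin of error for a sample proportion (assuming a simple random sample and the worst case p = 0.5; the sample sizes are just illustrative):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sample proportion (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (13, 100, 1000):
    print(f"n = {n:>4}: +/- {margin_of_error(n) * 100:.1f} points")

# n =   13: +/- 27.2 points
# n =  100: +/- 9.8 points
# n = 1000: +/- 3.1 points
```

With 13 respondents the uncertainty is on the order of plus-or-minus 25 to 30 points, which is why a group that small can’t tell you anything quantitative about undecided voters as a whole.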
As for your argument about the other poll having only 8 respondents: that’s also irresponsible. Both articles are clearly jumping to conclusions in an effort to grab views. That the other one received a more positive response here is mostly indicative of the way the lemmy population leans. That’s really about all you can take from it… Well, that and people have no idea how statistical averaging works.
Again, you don’t seem to understand the intent of focus groups or why they’re used by political campaigns. In a way, focus groups are more akin to case studies, which are still extremely insightful.
Besides, we already have a broader set of polling data from battleground states, and what we see here is a reflection of those wider, scientific polls, which didn’t bode well for Joe Biden even pre-debate.
The mere fact that ANY random sample of undecided voters came away with these views is downright dangerous.
Oh no, I very much do. I have a degree in psychology, which required being able to do statistical analysis for research.
You use a focus group to elicit qualitative, not quantitative, information from a targeted group within a study, not as the study itself. The issue is that it isn’t meant for generalizing to broad populations or for quantitative analysis. Even then, the data is easily skewed by biases from the group members themselves, the moderator, and the interpreter, and it shouldn’t be the only thing used.
Focus groups are meant for things like quality indicators, where you run a range of them as part of a broader analysis, which can help triangulate where an issue lies.
To properly employ a focus group, you would first need to poll an appropriately sized sample of undecided voters, then target demographics within that sample to gain insight into why they answered the poll the way they did.
And how, qualitatively, did these focus groups triangulate where undecided voters are on the issue of who to vote for?
Isn’t it quite probable they did exactly this? They certainly didn’t just pull these people off the streets. They had to aggregate undecided voters to begin with, after all.
I think it’s grasping at straws to suggest this isn’t saying what we already recognize from polling conducted in battleground states.
Edit:
Then they need to state it, because the only data they’ve given is that they asked one group of 13 people, which is still not an adequate sample. Period.
That, right there, is why focus groups shouldn’t be used to generalize to a larger population: the data ends up being misinterpreted to sell a biased story! Probabilities could only be estimated if they actually conducted a full study, which they clearly didn’t.
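For a rough idea of what a “full study” would take, here’s the standard sample-size formula for estimating a proportion, sketched under the same assumptions as above (simple random sample, worst-case p = 0.5; the target margins are just examples):

```python
import math

def required_sample_size(margin, p=0.5, z=1.96):
    """Respondents needed to estimate a proportion to within +/- margin at ~95% confidence."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

for margin in (0.05, 0.03):
    print(f"+/- {margin:.0%} margin: n >= {required_sample_size(margin)}")

# +/- 5% margin: n >= 385
# +/- 3% margin: n >= 1068
```

That’s roughly a 30x to 80x gap over 13 respondents, which is the distance between a headline-friendly focus group and a poll you could actually generalize from.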
And you can’t use previously gathered data from battleground states to estimate reactions to an event that happened afterwards. Polls are snapshots of opinion at a given time; applying them to something that occurred after the fact is, again, unethical and inappropriate.
The data wasn’t good before, and it doesn’t take a statistician to know the numbers are going to be as bad or worse post-debate. I’ll happily take that bet with you and circle back in the coming weeks as statewide polling proves this.