Why You Can’t Depend on Twitter Polls | TechTree.com

Why You Can’t Depend on Twitter Polls

Well, mostly -- except to prove what the pollster already believes

 

Toward the end of October, Twitter finally rolled out its 'poll' feature on its most popular platforms: the desktop Web, iOS, and Android.

So now you can ask your followers and their followers important stuff like who’ll win Bihar, or whether they’d rather date Justin Bieber or a troll, or whether New India’s symbol should be a spaceship or a cow.

How much can you depend on the Twitter poll for real insight, though?

The short answer is: not much.

The longer answer is: it depends on what you're depending on it for, insight or influence. If it's to prove something you already believe to be true, yeah, that works. All you have to do is craft your question and response options in a way that will get most people picking the answer you want, and let selection bias do the rest.

What’s the selection bias here? That’s the big filter: you and your followers. They’re aligned to your thinking. If you’re into tech, or Hindutva, or medicine, you already have many followers with similar interests, and many with similar thinking.

Here’s a simple experiment. A senior journalist with center-right leanings and over 80,000 followers on Twitter posted this poll in early November: In your day-to-day affairs, have you experienced a rise in intolerance lately? The options: a. Yes, there’s venom, and b. None, whatsoever.

As you might expect, only 8% of the 1,667 who voted said Yes, there's venom. Why? For one, the way it was worded: in my day-to-day affairs, I'm less likely to have experienced intolerance, let alone 'venom'. (Note also the extreme contrast between the two response options.) And then there's the selection bias of the pollster's followers being likely right-wing-inclined.

To test this out, I posted a similar, but simpler, question: Do you see rising intolerance in India? Yes or No. To dilute selection bias just a bit, I put in a request to RT. Over 180 retweets (18,300 impressions) took it to a more diverse audience. And as against the 8% who said Yes in the journalist's poll (which got 84 retweets), 44% of the 1,891 who voted on my poll said Yes, they saw rising intolerance.
 
 
The pattern was interesting. In the first 30 minutes, with 99 votes, 63% said Yes to rising intolerance. Those would be my immediate followers. As the poll was RT’d, diluting selection bias somewhat, the “Yes” responses dropped to 55%. With further RTs, especially from right-wing handles, the Yes responses dropped below 50% and finally settled at 44% when the poll closed 24 hours later.
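That drift is just a weighted average at work: each wave of retweets mixes a new audience, with its own underlying Yes-rate, into the running total. Here's a toy sketch of the idea. The wave sizes and Yes-rates below are illustrative guesses chosen to reproduce the reported endpoints (63% at 99 votes, 44% at 1,891), not the actual vote data:

```python
# Toy model of how retweets dilute (or shift) selection bias in a poll.
# Each wave brings in an audience with its own underlying "Yes" rate;
# the published percentage is just the running weighted average.
# All wave sizes and rates are illustrative, not real poll data.

waves = [
    ("own followers",        99, 0.63),  # early voters, like-minded
    ("first retweet circle", 600, 0.53),
    ("right-wing retweets", 1192, 0.38),
]

yes_votes = 0
total_votes = 0
for name, n, yes_rate in waves:
    yes_votes += n * yes_rate
    total_votes += n
    print(f"after {name}: {yes_votes / total_votes:.0%} Yes of {total_votes} votes")
```

The point the numbers make: no single wave "changed minds". The final 44% is simply what you get when a 63%-Yes audience is swamped by larger, differently-inclined ones.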

 

No, I don't believe even this 44-56 split is necessarily a representative response for India.

The point here isn't about intolerance, but about how unscientific online polls can be, even with the best of intentions. The answer to the same question can vary wildly based on who is asking, how the question and responses are phrased, and how widely it's shared. Among other factors.

Someone qualified in market research can do things to reduce bias and get more dependable answers. Or, conversely, to manipulate those answers to ‘prove’ virtually anything.

You could argue that this is true of all polls, especially online ones, including SurveyMonkey polls. Well, Twitter makes it so much easier to poll. Expect to see an explosion of them, with everyone becoming a pollster and surveyor. Just as Twitter turned everyone into a journalist, killing mainstream media.

Oh wait. It didn’t.


Tags : Twitter, Twitter polls