From: <REDACTED>, Telstra Social Media Market Research
It would be nice to report the unequivocal success of our first test of Twitter as a market research tool, but for reasons outlined below, I am afraid I cannot.
To recap the high points of the campaign, the strategy was as follows:
Phase 1: Leak story to media
Outcome: Success. Once media reports emerged that Telstra intends to trial various traffic management techniques including (possibly) Deep Packet Inspection to “improve the customer experience”, the story was reposted many hundreds of times on Twitter.
This is a far more effective strategy than releasing similar information using an official Telstra account as the first source of information. For some reason, people who believe in the obsolescence of mainstream media still exhibit a remarkable dependency on these outlets as their primary sources.
Phase 2: Troll social media with vague official response
Outcome: Success. Twitter users’ dissatisfaction with Telstra’s official response was quite gratifying, since they helped spread the story even further than its original footprint to tell each other that we were not taking their concerns seriously.
The multiplier effect of the “fake response strategy” is noted and will be refined in future campaigns.
A special mention goes to <REDACTED> whose brilliant suggestion left our statement sufficiently vague that nobody can tell which network (ADSL, cable, 3G, 4G) will be used in the test. This allowed plenty of latitude for further argument about the nature and purpose of the test.
Phase 3: Analysis of responses
Outcome: Mixed. While the exercise provided a great deal of data, as you can see in the chart below, much of the data had to be discarded:
While eliminating bots (including our own) was easy, a strong favourable response in our initial analysis turned out to be a false positive, since the hashtag #DPI overlapped with a discussion of a Department of Primary Industries educational initiative.
We were thus disappointed to find that once we corrected the data for this mistake, only 2 percent of responses justified analysis. At this point, it became easier to skip the “Big Data” approach and use a spreadsheet for analysis:
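The filtering described above, dropping known bots and the #DPI hashtag collision before tallying what remains, could be sketched roughly as follows. This is a hypothetical illustration only: the account names, tweet field names, and keyword list are invented for the example, not our actual pipeline.

```python
# Hypothetical sketch of the Phase 3 data-cleaning step.
# Account names, field names, and keywords below are invented examples.

KNOWN_BOTS = {"promo_bot_1", "telstra_amplify_02"}  # hypothetical bot accounts
FALSE_POSITIVE_TERMS = {"primary industries", "agriculture", "biosecurity"}

def is_relevant(tweet: dict) -> bool:
    """Keep a tweet only if it is not a bot and not a #DPI hashtag collision."""
    if tweet["user"] in KNOWN_BOTS or tweet.get("is_bot"):
        return False
    text = tweet["text"].lower()
    if "#dpi" in text and any(term in text for term in FALSE_POSITIVE_TERMS):
        # Department of Primary Industries, not Deep Packet Inspection
        return False
    return True

tweets = [
    {"user": "angry_customer", "text": "Telstra trialling #DPI on my line?!"},
    {"user": "farm_news", "text": "Great #DPI workshop on biosecurity today"},
    {"user": "promo_bot_1", "text": "Telstra cares about your experience! #DPI"},
]
relevant = [t for t in tweets if is_relevant(t)]
print(f"{len(relevant)} of {len(tweets)} tweets kept for analysis")
```

On a sample this small the result fits comfortably in a spreadsheet, which rather supports the conclusion above.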
Deep Packet Inspection is a “hot button” topic for anybody operating an “Anonymous” account, which meant a number of “serious” responses to the story are of no real value. We already know what Anonymous thinks of us, and most operators of such accounts are not located in Australia.
Of the remainder, 28 percent were already known as accounts that “hate Telstra” – Whirlpool users, our competitors and so on.
Pickup by the indefatigable Malcolm Turnbull might have been useful, except that his own followers generate a predictable pattern of discussion, mostly from known individuals.
The number of Telstra staff and consultants responding on their own time would be gratifying, except that it also resulted in a significant number of responses having to be disregarded. We should improve our internal communications before attempting similar exercises in the future.
This left a mere five percent of responses providing useful information: Telstra customers, non-Telstra customers, “bush lawyers” (by which I mean people who consider their grasp of the legality of DPI as better than our internal legal team), those who don’t understand what’s going on, and experts.
Regrettably, we were unable to find any supporters of the original “proposal”.
The most important data point, it appears to me, is the very low proportion of clueless users. This suggests that if we are proposing an initiative that’s likely to be unpopular, a great deal more obfuscation of our intent will be required to ensure success.
To conclude, then, Operation Deep Packet Inspection tells us that our use of social media as a market research tool is in its infancy. It will need considerable refinement before it comes anywhere close to yielding the petabytes of analytical data we believed would result.