
Social Media Sentiment Monitoring: how accurate is it, and should it be automated?

When looking at brands with a large amount of coverage, the sheer volume you have to deal with counts. I can see that if you are dealing with very large brands in a fast-moving environment, waiting for sentiment to be scored by humans could take a long time and you could be overtaken by events. Nickburcher.com sets out why automated Social Media Sentiment monitoring can be useful: http://www.nickburcher.com/2008/10/brandwatch-social-media-sentiment.html

Social media monitoring and sentiment tracking are becoming more important for brands. Anyone can publish anything now, and if a citizen journalist mistakenly declares that your CEO has had a heart attack (Apple’s share price tanked before climbing back up) or a random reporter mistakenly informs Bloomberg that you are going into bankruptcy (United Airlines’ market value dropped by $1bn), then you need to know quickly and respond accordingly.

Whatever tool you use, there are questions worth asking of any sentiment data like this:

– The volume of posts analysed is different at each point, so it’s important to consider how this affects results.
– What are the sources? Is it across the whole blogosphere (where conservatives tend to be more active) or just across Labour-supporting blogs?
– Is there an opinion-weighting factor? Based on links and audience, are some opinions given more credit than others? (See the sketch just after this list.)
– Are the news sources UK-only or global?
– What is the context of the sentiment? Is it around personality, policy or events?
– Is there one particular thing driving positive / negative sentiment, or is this an average across all mentions?
– Competitive context is also very important (and perhaps the most important factor). Is this good or bad vs David Cameron / Nick Clegg etc.?
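
The opinion-weighting question is worth making concrete, because it can change a result completely. Below is a minimal sketch in Python of how a tool might weight each mention’s sentiment by the author’s reach before averaging; the field names (sentiment, inbound_links, audience) and the log-scale formula are my own illustrative assumptions, not any vendor’s actual method.

    # Minimal sketch: influence-weighted average sentiment. The field names
    # (sentiment, inbound_links, audience) and the log-scale weighting are
    # illustrative assumptions, not any vendor's actual method.
    import math

    mentions = [
        {"sentiment": +1, "inbound_links": 250, "audience": 40000},  # big blog, positive
        {"sentiment": -1, "inbound_links": 2,   "audience": 50},     # tiny blog, negative
        {"sentiment": -1, "inbound_links": 5,   "audience": 120},    # tiny blog, negative
    ]

    def weight(m):
        # Log scale so one huge outlet cannot completely drown out everyone else.
        return 1 + math.log1p(m["inbound_links"]) + math.log1p(m["audience"])

    unweighted = sum(m["sentiment"] for m in mentions) / len(mentions)
    weighted = (sum(m["sentiment"] * weight(m) for m in mentions)
                / sum(weight(m) for m in mentions))
    print(f"unweighted: {unweighted:+.2f}  weighted: {weighted:+.2f}")
    # unweighted is negative (-0.33); weighted comes out positive (about +0.11)
    # because the one positive mention has by far the largest reach.

Note the flip: two of the three mentions are negative, so the unweighted average is negative, yet the influence-weighted score comes out positive because the single positive mention has by far the largest reach. That is exactly why it matters whether (and how) a tool weights opinions.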

Amber Naslund said:

We all crave the technology that can automatically tell us whether a post we read and track is positive/negative/neutral (and the holy grail would be something that could make next-step recommendations). …

What I mark as positive:

  • Blatant and direct compliments or recommendations, without competitors mentioned. Can include product compliments or positive statements about service and support.
  • Posts that contain superlatives in direct reference to our company or product (good, great, awesome)
  • Reviews that are clearly complimentary, even if they contain a few improvements we could make
  • If the post is a Digg, Stumble, or Delicious (someone found it valuable enough to vote on or bookmark)
  • Retweets or links to any of the above posts

Somewhat positive:

  • Retweets of our events or publicity (implied endorsement)
  • Posts that announce/feature our inclusion in a list, ranking, or otherwise, including along with competitors
  • Posts that recommend us alongside competitors
  • Inquiries about getting a demo and/or trialing the product (implies good enough impression to ask to see more)
  • Retweets or links to any of the above types of posts

Neutral

  • Any tweets that are company outreach (from our employees). This helps avoid swaying the snapshot of what our community is saying, for better or worse
  • Links to our website with no commentary at all
  • Passing mentions of us in conversation unless they meet pos/neg criteria
  • Statements like “checking out Radian6” without other commentary
  • Factual information about our product/brand without reaction or comment (including retweets)
  • Links or retweets to our blog, events, etc. that don’t include commentary.
  • Troubleshooting inquiries that are simply technical in nature

Somewhat Negative:

  • Retweets or links from the community to third-party posts that contain criticism (passive endorsement of the negative content)
  • Posts that contain criticisms of our product or service coupled with compliments or positive statements, if the negative seems to outweigh the positive
  • Sarcastic comments that allude to a negative experience but without a blatant callout
  • Troubleshooting inquiries that include statements of frustration

Negative:

  • Clear criticisms or complaints about our product or service. These are usually pretty obvious.
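
What strikes me about Amber’s rubric is that it reads almost like a specification, and a specification is exactly what a simple automated classifier encodes. Here is a toy sketch in Python of the same idea; the keyword lists, the competitor check and the band logic are my own illustrative guesses, not Radian6’s actual method.

    # Toy rule-based scorer loosely modelled on the rubric above. The
    # keyword lists and the band logic are illustrative guesses, not
    # Radian6's (or anyone else's) real method.
    POSITIVE = {"good", "great", "awesome", "love", "recommend"}
    NEGATIVE = {"terrible", "broken", "frustrated", "avoid", "horrendous"}
    COMPETITORS = {"competitorco"}  # hypothetical competitor name

    def classify(post: str) -> str:
        words = set(post.lower().replace(",", " ").replace(".", " ").split())
        pos = len(words & POSITIVE)
        neg = len(words & NEGATIVE)
        if pos and not neg:
            # Rubric: recommendations alongside competitors are only "somewhat" positive.
            return "somewhat positive" if words & COMPETITORS else "positive"
        if neg and not pos:
            return "negative"
        if pos and neg:
            # Mixed posts: which side "seems to outweigh" the other?
            return "somewhat positive" if pos > neg else "somewhat negative"
        return "neutral"

    print(classify("This product is great, I recommend it"))   # positive
    print(classify("Checking out Radian6"))                    # neutral
    print(classify("Great talk, but the demo was broken"))     # somewhat negative

The hard cases in the rubric, sarcasm, implied endorsement via retweets, and judging whether “the negative seems to outweigh the positive”, are exactly the parts that resist this kind of keyword encoding.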

My question

The thing is, as a blogger, and speaking with the bloggers I know, bloggers like to give a balanced review: the positive and the negative. How do you automate deciding what is important in the conversation?

“Let me put it another way: at Monitoring Social Media 09 (MSM09), Giles Palmer’s talk was by all accounts exceptional. I only got to see the first 10 minutes, which were good. On the other hand, his moustache for charity was horrendous.” Note: look out for an interview with Giles here tomorrow.

Now, if you took this to have both positive and negative sentiment, you would be right. However, who cares what I thought of his moustache? It is his talk that is important. How would an automated sentiment monitoring tool deal with that?
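
One partial answer from the research side is what is usually called aspect-based sentiment: split the text into clauses, attach each clause’s sentiment to the topic it is actually about, and weight topics by how much they matter. Here is a hypothetical sketch; the topic keywords, weights and word lists are invented purely for illustration.

    # Hypothetical aspect-weighted sentiment: score each clause separately,
    # tag it with a topic, then weight topics by how much they matter here.
    # Topic keywords, weights and word lists are invented for illustration.
    TOPIC_KEYWORDS = {"talk": "talk", "presentation": "talk", "moustache": "appearance"}
    TOPIC_WEIGHTS = {"talk": 1.0, "appearance": 0.1}  # the talk is what matters
    POSITIVE = {"exceptional", "good"}
    NEGATIVE = {"horrendous"}

    def clause_topic(clause: str) -> str:
        for keyword, topic in TOPIC_KEYWORDS.items():
            if keyword in clause.lower():
                return topic
        return "other"

    def clause_sentiment(clause: str) -> int:
        words = set(clause.lower().split())
        return (1 if words & POSITIVE else 0) - (1 if words & NEGATIVE else 0)

    def weighted_sentiment(text: str) -> float:
        clauses = [c for c in text.replace(";", ".").split(".") if c.strip()]
        total = weight_sum = 0.0
        for clause in clauses:
            w = TOPIC_WEIGHTS.get(clause_topic(clause), 0.5)
            total += w * clause_sentiment(clause)
            weight_sum += w
        return total / weight_sum if weight_sum else 0.0

    text = ("Giles Palmer's talk was by all accounts exceptional. "
            "On the other hand his moustache for charity was horrendous.")
    print(f"{weighted_sentiment(text):+.2f}")  # net positive: the talk outweighs the moustache

The catch, of course, is that the tool has to be told (or learn) that the talk matters and the moustache does not, which is context a human reader brings for free.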

Do you have experience of automated Social Media Sentiment Monitoring? Do you find it reliable?

8 comments on “Social Media Sentiment Monitoring”

  • At Biz360 we have been providing automated sentiment tools for our clients for more than 6 years. In our social media platform (Community Insights) we give clients both the ability to use the automated sentiment and the ability to override or manually score it. Our experience is that it provides a very important layer of understanding of the conversation, even if there are occasions when a single post is scored wrongly due to sarcasm or some other issue. When looking at a sample of 20, 50, 100, or 1,000 posts, automated sentiment is a very valuable tool for quickly understanding trends in the conversation. We believe the combination of automated sentiment with a manual override provides the best of both worlds and the most valuable experience.

  • Human analysis is more accurate, but can be cost-prohibitive at high volumes or just not a good value if brands are not measuring individual mentions. Yes, the software will get it wrong 30% of the time, but will still be able to delineate the big picture.

    Most of our clients prefer human analysts but for some who receive several thousand mentions per day, it’s just not possible to read them all even with a team of analysts. So a second application is to use automation for the bulk of high volume mentions, leaving the most important sources or comments to be reviewed by hand.

    Sadly, the debate over sentiment automation (and countless other media monitoring issues) continues to be a distraction as people look for a one-size-fits-all solution where none exists.

    Cheers,
    Hannah Del Porto
    ImpactWatch
    @hcdelp

  • Bran and Hannah

    Thank you both very much for your input. Much appreciated, truly. Has anyone ever done any comparative review work?

    Murray

  • Not to my knowledge. It would be extremely tedious, as you would first need large samples analyzed by humans, and then both that sample and the automated results checked again by humans for accuracy.

    I believe the statistics on the accuracy of automation are also estimates. I have most often seen providers of sentiment automation claim around 70% accuracy, while human analysis of course varies with the human 🙂 but can be expected to be in the 90-percent range.

  • Murray,

    As you’ve highlighted, sentiment is a key part of the social media monitoring equation. Truth be told, it’s challenging to have 100% accuracy when trying to use technology to tell whether something is positive, negative or neutral.

    That said, the social media technologies used to automatically determine sentiment do a really good job and, I suspect, will improve as the technology continues to be improved and refined.

    cheers, Mark

    Mark Evans
    Director of Communications
    Sysomos.com

  • I agree with Hannah’s opinions (above) and would like to add from my first-hand experience. I have done mainstream and social media human sentiment measurement work for ImpactWatch, Cymfony and Biz360. Quality human analysis should be over 90% accurate and is most useful for key publications with major influencers. Automated analysis is probably around 60-70% accurate. However, Biz360’s automated sentiment wizard produces accuracy of around 80%. One problem occurs when human sentiment analysis is outsourced to India or the Philippines. In several instances I know first-hand the accuracy drops to 70% or below, which in my opinion is no better than automated analysis. If accurate sentiment measurement is important, clients should ask: who will be doing the human sentiment analysis? Clients should also have some idea of the criteria for positive/negative sentiment.

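Update: since the comments above raise both manual override and the difficulty of a comparative review, here is a rough sketch of how the two could fit together: have analysts re-score a sample, treat the human label as both the override and the gold standard, and measure how often the automated label agreed. The Mention structure and field names are hypothetical, not any vendor’s API.

    # Rough sketch: automated labels with an optional human override, plus
    # the agreement rate a comparative review would report. The Mention
    # structure and field names are hypothetical.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Mention:
        text: str
        auto_label: str                    # what the tool said
        human_label: Optional[str] = None  # set when an analyst reviews it

        @property
        def effective_label(self) -> str:
            # Override behaviour: the manual score, when present, wins.
            return self.human_label or self.auto_label

    def agreement_rate(mentions: list[Mention]) -> float:
        reviewed = [m for m in mentions if m.human_label is not None]
        if not reviewed:
            return float("nan")
        hits = sum(m.auto_label == m.human_label for m in reviewed)
        return hits / len(reviewed)

    sample = [
        Mention("Love this product", "positive", "positive"),
        Mention("Great. Just great.", "positive", "negative"),  # sarcasm fools the tool
        Mention("Checking out Radian6", "neutral", "neutral"),
        Mention("Link with no commentary", "neutral"),          # not yet reviewed
    ]
    print(f"agreement on reviewed sample: {agreement_rate(sample):.0%}")  # 67%

Run over a few hundred reviewed mentions, that agreement rate is essentially the 70%-ish accuracy figure quoted in the comments above.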