• Use Case: Voice of the Public
  • Segment: Government Agencies
  • Product: Rosette

Mining the Social Web

Introduction

Listening to what customers say to your sales reps, in social media, and in product reviews isn’t just customer-friendly; it’s good business:
  • 78% of consumers have bailed on a transaction or not made an intended purchase because of a poor service experience. —American Express Survey, 2011
  • It takes 12 positive experiences to make up for one unresolved negative experience. —“Understanding Customers” by Ruby Newell-Legner
  • For every customer who bothers to complain, 26 other customers remain silent. —White House Office of Consumer Affairs
No surprise here: customer experience affects both top and bottom lines. According to Forrester Research, a modest shift in customer experience for a $10 billion company increases product sales by $64 million, reduces churn by $116 million, and adds $103 million in revenue from word of mouth alone.
How is this process working for you? Are the decisions obvious? Or is something missing between the data and the action that still requires a leap of faith?
You can close that gap by:
  1. Adding social media to your list of monitored information sources (customer interactions, databases, tweets, support calls, etc.);
  2. Turning your business intelligence and numerical analyses into action by discovering root cause—answering the “why” something happens;
  3. Augmenting or supplanting broad mass-market analysis with deeper, focused analysis of customer behavior;
  4. Letting the words of your customers tell you what information to monitor and which product features matter most to them.
Big data and machine learning technologies are enabling a new class of analytics that discover patterns in customer behavior to give you time to act before market disruption occurs—think Uber and taxis. Not so far in the future, using text analytics to mine the social web will be a business tool as common as a profit and loss statement.

The Analytical Shortfall

Business analytics have come a long way. Mining structured data in databases, applications, and spreadsheets detects patterns and predicts performance. Mining unstructured data in documents, log files, multimedia, and social content determines reputation and makes recommendations.
We know the who, what, where, and when for every data point, but we struggle with the “why” for three reasons, which we’ll explore:
  1. We typically analyze numeric data separately from text.
  2. We incorrectly use mass market analysis when we should be focusing on deep analysis of individual customer data.
  3. We use pre-conceived ideas of what data to measure, rather than letting the words of customers tell us what is most important to them.
Your proprietary customer data will have greater value if viewed in the context of the social web. By “social web” we mean not only Twitter, Facebook, and LinkedIn, but also surveys, self-service data, and product reviews. The social web is the epicenter of customer thought, and it has one salient characteristic: it is composed of words, sentences, and paragraphs with little to no defined structure. Your key to unlocking that content is text analytics.

Connect The Dots: Numbers to Words

Let’s explain with an example. We analyze sales figures for the last several years by product, sales territory, and price range, which exposes patterns in the distribution. We learn that Product X is selling poorly in the Northeast. The key question is: why? If we apply sentiment analysis to customer comments, we might hear that Northeast clients are unhappy with the quality of customer support. This qualitative information tells us that we should review our support team and make changes.
Notice what we did here. We brought together two traditionally separate disciplines: business intelligence/data mining (analysis of numbers) and text analytics (analysis of text). Separately, they provide better detection but do not lead to action. Integrating analyses for both numbers and text reveals the connection between cause and action. We are able to make the causal connections between facts and opinions.
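To make the idea concrete, here is a minimal sketch in Python of how the two analyses might be joined. The products, territories, unit counts, column names, and sentiment scores are hypothetical illustrations, and the scores are assumed to come from a separate text-analytics step that has already processed the customer comments.

    # Minimal sketch: joining numeric sales data with text-derived sentiment.
    # All tables, columns, and values here are hypothetical illustrations.
    import pandas as pd

    # Structured side: sales by product and territory (from BI/data mining).
    sales = pd.DataFrame({
        "product":   ["X", "X", "X", "X"],
        "territory": ["Northeast", "Southeast", "Midwest", "West"],
        "units":     [1200, 4100, 3800, 3900],
    })

    # Unstructured side: customer comments already scored by a sentiment model
    # (score in [-1, 1]) and tagged with the topic they discuss.
    comments = pd.DataFrame({
        "territory": ["Northeast", "Northeast", "Southeast", "West"],
        "topic":     ["support", "support", "pricing", "support"],
        "sentiment": [-0.8, -0.6, 0.2, 0.5],
    })

    # Aggregate opinion by territory and topic, then join it to the sales figures.
    opinion = (comments.groupby(["territory", "topic"])["sentiment"]
                       .agg(["mean", "count"])
                       .reset_index())
    combined = sales.merge(opinion, on="territory", how="left")

    # Territories with weak sales *and* strongly negative topic sentiment
    # point toward a root cause worth investigating.
    print(combined.sort_values(["units", "mean"]))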

Look Deep, Not Wide

Although predicting financial risk is a core competency of the financial industry, the economic collapse of 2008 caught it by surprise. Why? Because its analyses said it was statistically impossible for everyone to default on their mortgages at the same time. The flaw was analyzing the market in aggregate rather than analyzing individual performance. Yes, one should study the market as a whole, but it’s an incomplete picture. In aggregate, it’s statistically plausible to bet against a market-wide default, but deep analysis of individual performance might have signaled otherwise.
We have a penchant for analyzing data at a mass-market level, treating everything like fast-moving consumer goods. Mass-market analytics are familiar and easier to understand than deep analysis at the individual level. They work when volume is high and value is low, for example selling toothpaste. The analysis happens at a gross level, so the law of large numbers and the analytics built on it are appropriate.
The opposite is low volume and high value. Here, we are forced into deeper analytics because we don’t have the high numbers for adequate trending; we study the customer rather than the market. But what if you have both high volume and high value, for example selling a car? In practice, we choose mass market analytics because it appears to work, and because the cost of deep analytics across such a large volume of customers is prohibitive. Unfortunately, we’re wrong on both counts.
Market trends are useful of course, but car manufacturers also want to know how each car owner feels pre- and post-sales. If you want to “make the car better,” go deep to understand your customers’ experiences. Going deep means analyzing individual voices, often from the social web.
A few best practices to consider:
  • Apply the analysis appropriate to each problem. For pharmaceutical drugs, predicting sales volume and demand is an analysis of mass-market behavior, but managing legal risk requires deeper investigation. What are people saying about side effects?
  • Add context from social media to your own data. Hotels typically personalize their relationships with customers based solely on their own interactions with them. You may have entry-level status with a hotel chain, but a cursory glance at your airline status, business seniority, and travel frequency might suggest greater potential. Where your own data is thin, social media can reveal who the customer might become for you.
  • Recognize subjective statements. Consider this comment from a car owner, “I close my eyes and I know where everything is.” Or, “It’s just not me.” There is nothing objectively positive about the first statement, or objectively negative about the second, but in context, these are opinions. These subtle comments are often the true indicators of user opinion, which become the triggers for purchase decisions.
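As an illustration of that last point, here is a toy sketch in Python (using scikit-learn) of how a handful of human-labeled, in-domain statements can teach a simple classifier to take a stance on comments that a generic sentiment word list would score as neutral. The example statements and labels are invented; a production model would need far more data and a richer text-analytics pipeline.

    # Toy sketch: catching subtle, domain-specific opinion statements.
    # The labeled examples are invented for illustration only.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # A few in-domain statements labeled by a reviewer (1 = positive opinion, 0 = negative).
    train_texts = [
        "I close my eyes and I know where everything is",   # implicit praise of the layout
        "everything falls right to hand",
        "feels like it was built around me",
        "it's just not me",                                  # implicit rejection
        "I never got comfortable with it",
        "something about it never clicked",
    ]
    train_labels = [1, 1, 1, 0, 0, 0]

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(train_texts, train_labels)

    # A generic word list sees nothing "positive" or "negative" in these;
    # the in-domain model at least takes a stance that a reviewer can verify.
    for text in ["I know where everything is without looking", "it just isn't me"]:
        print(text, "->", model.predict([text])[0])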

Measure The Right Stuff

In the late 1970s Ford embarked on a project to introduce a fuel-efficient car that would leapfrog the smaller, efficient Japanese imports increasingly dominating the market. After extensive research, they reduced their decision to two choices: (1) design and build their own lean-burn engine, or (2) license a 2-stroke engine from Orbital. Ford made a prototype for each and then A/B tested them to get driver feedback. Their testing focused on NVH (noise, vibration, and harshness). Why? Because they assumed NVH was what mattered most to car buyers in an engine. Based on these criteria, their own engine won, so in 1980 Ford unveiled the Escort 3, their first fuel-efficient car using a Ford-built lean-burn engine. Building their own engine cost two to three times as much as licensing the Orbital engine.
This is what happened: the lean-burn engine was discontinued in 2004. In 1996, Ford introduced the Festiva, their first car with an Orbital engine, and that engine is still thriving today. It would seem the NVH criteria were no guarantee of success; perhaps being eco-friendly would have been the better measure. In general, predicting the outcome before understanding the data can easily lead to wrong conclusions and actions.
Before jumping into analysis, we need to let the data tell us what features and indicators really matter. Machine learning lets the data “speak.” In a “pre-analysis” step, machine learning software analyzes textual information to pull out frequently occurring themes and features from customer opinions. This step reveals the criteria which should be measured, in what context, and at what level of importance.
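One way to picture this pre-analysis step is a small topic-modeling pass over raw comments, sketched below in Python with scikit-learn. The comments are invented for illustration and the number of themes is an assumption; the point is only that recurring themes (for example fuel economy versus noise and vibration) surface from the words themselves rather than from a predefined checklist.

    # Sketch of a "pre-analysis" pass: let recurring themes surface from the text itself.
    # The comments are invented; a real corpus would have thousands of documents.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import NMF

    comments = [
        "great mileage but the engine noise on the highway is tiring",
        "fuel economy is excellent, wish the cabin were quieter",
        "love how little I spend on gas",
        "the vibration at idle makes it feel cheap",
        "quiet, smooth, and surprisingly efficient",
        "noisy engine, otherwise a sensible economical car",
    ]

    vectorizer = TfidfVectorizer(stop_words="english")
    tfidf = vectorizer.fit_transform(comments)

    # Factor the corpus into a small number of themes and list each theme's top terms.
    nmf = NMF(n_components=2, random_state=0)
    nmf.fit(tfidf)
    terms = vectorizer.get_feature_names_out()
    for i, weights in enumerate(nmf.components_):
        top = [terms[j] for j in weights.argsort()[::-1][:4]]
        print(f"theme {i}: {', '.join(top)}")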
A few best practices to consider:
  • Define your purpose objectively. Often we inject our own biases into our analyses without meaning to, especially when we assume we know the answer. Try to be surprised instead. Assume the outcome will be different, or be open to the possibility that the data doesn’t fit the model.
  • Carefully select your data. The broadest set of sources may not ultimately serve you, so start with breadth to find the signals in the noise, then narrow the list and go deep on the key data sources. Don’t shy away from complex sources because you think the statistics will be too difficult; choose an analytics platform that supports your requirements.
  • Adopt an iterative process. Jump-start your model with examples of what you seek; it should learn from your data automatically. Then give it declarative rules. The system should remain adaptable to correction and new knowledge.
  • Align your findings temporally. Consider, “I first started Xanax 6 weeks ago.” This person’s opinions today will differ from their opinions when they began taking the drug. By aligning your data on a common time scale, patterns emerge (see the sketch after this list).
  • Tailor your model. Customers talk about things differently, so choose a social media analysis platform that lets you adapt the model to understand your customers’ conversations. There should be tools for your engineers to easily modify the model and enable the model to learn from the data.
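The temporal-alignment practice above can be sketched in a few lines of Python with pandas: put every comment on a common “weeks since this customer started” axis before aggregating. The customers, dates, and sentiment scores below are invented, and the start event is assumed to be each customer's first recorded comment.

    # Sketch of temporal alignment: place every comment on a relative timeline.
    # Customers, dates, and scores are invented for illustration.
    import pandas as pd

    comments = pd.DataFrame({
        "customer":  ["a", "a", "a", "b", "b", "b"],
        "date":      pd.to_datetime(["2024-01-03", "2024-01-24", "2024-02-21",
                                     "2024-03-05", "2024-03-19", "2024-04-16"]),
        "sentiment": [0.6, 0.1, -0.4, 0.5, 0.0, -0.3],
    })

    # Each customer's own start date (here assumed to be their first comment).
    starts = comments.groupby("customer")["date"].min().rename("start")
    aligned = comments.join(starts, on="customer")
    aligned["week"] = (aligned["date"] - aligned["start"]).dt.days // 7

    # Average sentiment by week-since-start, across customers: the pattern only
    # appears once everyone is on the same relative timeline.
    print(aligned.groupby("week")["sentiment"].mean())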

Final Thoughts

Malcolm Gladwell wrote The Tipping Point in 2000. The book proposes that a deeper understanding of what “tips” a trend into widespread popularity lets us more accurately predict the success or failure of future trends. Gaining this deeper understanding is much like getting a better grasp of your customers and markets. Subtle, subjective insights uncovered through analytics can validate your objectively defined purpose. They become your market-changing indicators, the triggers for strategic action that place you in front of your market, anticipating and planning for change.
The science of predictive and prescriptive analytics is in its infancy, so acting now—to learn more about this new tool—gives you first-mover advantage. The need for deeper, richer, immersive analytics is growing exponentially as our industry melds information sources—databases, applications, social networks, file systems, logs, media—into massive “data lakes.” New data science techniques such as machine learning let us navigate data lakes to discover causal relationships and predict events. Today’s computing power can do all of this across great volumes of data, to mine deep and wide.
One final thought: conventional logic suggests there is a natural limit to how much gum you can chew. Turns out there isn’t. People will increase their gum consumption until their pocket or handbag is empty.
