Here are the answers to the questions you sent us:
By — Scott Wilder, Gary Angel and Marshall Sponder
As usual I enjoyed the recent Social Media Measurement webinar – and it was great to have Marshall on as well. Tools always draw a crowd and this was no exception. Here are the questions we got along with our joint answers…
Question: What tools are best for measuring social media ROI or business lift, with respect to advertising on Facebook, Twitter, LinkedIn, etc.?
Marshall: There’s actually a new platform launching next week called Unified (UnifiedSocial.com – I will be at the launch) that promises to do something like that – I’ve seen the platform close up and I can tell you I am impressed. It may be that 2012 will be the year in which ROI is no longer a totally elusive goal for social media.
Gary: This is far more difficult, I think, than people generally believe. The only easy path to ROI measurement is when users are either directly engaged in commerce on social sites (which is rare) or are directly clicking through to sites where they are engaged in commerce. In these cases, measurement is generally a straightforward application of existing Web analytics campaign tracking capabilities. Unfortunately, this isn’t often the case. In some cases, I’m not even sure that ROI is the proper path to measurement, and where it is, I don’t think there is likely to be one answer or approach. If your Facebook advertising is directed toward increasing your Fanbase, you need to be able to measure the incremental value of a Fan (and this won’t be one value, by the way) to your marketing. Getting that measure takes a concerted research effort and won’t (in my opinion) be delivered by any single tool. I sometimes think that it might be better for organizations to concentrate, at least at first, on the obvious optimization points. It’s much easier to measure which campaigns generate engaged Fans and calculate their cost-efficiency in that respect. You can then optimize campaigns within the set of those targeted toward increasing your Fanbase. It’s not ideal, but it is more practical.
Scott: In most cases, companies have to guesstimate true ROI because of some of the limitations of the tools and also of companies’ own infrastructure. I find it useful to create proxies – like determining cost estimates for certain activities which, in turn, would lead to a transaction.
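Scott’s proxy approach can be sketched as a small calculation: assign an assumed dollar value to each tracked activity that tends to precede a transaction, then compare the total against campaign cost. All activity names, values, and counts below are hypothetical, chosen purely for illustration:

```python
# Proxy ROI estimate for a social campaign (all figures hypothetical).
# Each activity gets an assumed dollar value; the sum is compared
# against what the campaign cost to run.

activity_values = {          # assumed value per activity, in dollars
    "click_through": 0.50,
    "new_fan": 2.00,
    "comment": 1.25,
}

observed = {                 # hypothetical counts from one campaign
    "click_through": 4000,
    "new_fan": 1500,
    "comment": 600,
}

campaign_cost = 3500.00

proxy_value = sum(activity_values[a] * n for a, n in observed.items())
proxy_roi = (proxy_value - campaign_cost) / campaign_cost

print(f"Proxy value: ${proxy_value:,.2f}")  # $5,750.00
print(f"Proxy ROI: {proxy_roi:.1%}")        # 64.3%
```

The point is not the specific numbers – it’s that once proxy values are agreed on, campaigns become comparable on a consistent basis even when true transaction attribution is out of reach.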
Question: US cost is too high – for example, Engage121 is $1000 per month for the first base-level search – one profile with 3 seats.
Marshall: Well, as Gary pointed out, Engage121 is designed for a specific use case and type of client, such as an airline or large franchised business with thousands of stores that each want different response and editorial controls – think Domino’s or Dunkin’ Donuts (though I think neither is an Engage121 client). My point being, you can’t take the price of a platform in isolation from the use case and clients for whom it is designed and targeted. The Domino’s and Dunkin’s of the world have plenty of money and need for this kind of platform – but if you’re looking for an “affordable point of entry” into Social Engagement, then go with HootSuite and be happy there are still some free platforms you can play with to get your feet wet.
Gary: Not every market is going to be served by a tool like Google Analytics – free and really good. I basically agree with Marshall here. One thing I will say that’s more general is that in my experience some pricing models are much worse than others for doing serious enterprise work. To do our kind of measurement (at Semphonic), we need a pretty free hand to construct, test and use profiles of all sorts, and we generally need quite a lot of them because all the interesting questions involve categorization. At the enterprise level, I’d much rather pay a significant lump sum for a pretty free hand with the data than have a pay-per-item model. Pay-per-item models tend to cripple analysis.
Question: Do you have preference for tools to measure public opinion about political candidates – public policy or litigation issues?
Marshall: Yes, I am working with one right now – 6Dgree.com – we are tracking two candidates in Rhode Island and breaking down their overlapping audiences – along with “persona” breakdowns of their twitter streams – here is what that looks like (I erased the names of the candidates because this is still in the very early exploratory stage of what works).
So far, the persona development breakdown looks impressive, as we can break it down by various sub-dimensions, and the founders at 6Dgree are very willing to pursue my suggestions, which really impresses me about them. So yes, as of now, I believe 6Dgree might have a winning platform at an affordable price level that works for Twitter and Facebook. Another is PeekAnalytics, but it’s not adapted specifically to politics, yet.
6Dgree has done some interesting work with the Australian Labor Party around issues and produces a weekly portal report that breaks down tweets around several issues – I’m impressed with the solution, but of course, each campaign is slightly different and customization will always be a fact of life.
Question: What are the better tools for global internal scale? If any? Or just by world region?
Marshall: I like Comscore Media Metrix for world reporting – that’s mostly panel-based reporting – but it does a fairly extensive job of categorization of lifestyle and interest across channels, countries and technologies such as video, mobile and search.
Gary: Ditto Marshall. I like NMIncite for many larger markets. Alterian provides excellent language coverage.
Question: Do you believe the sampling of data should include statistical testing? Or how do you ensure your sampling is reflective of the entire population to provide confidence in the recommendations?
Marshall: Well, Gary has a pretty good post on that, written recently, and I think, rather than speak to it, I’ll let Gary address it: http://semphonic.blogs.com/semangel/2011/11/the-limits-of-machine-analysis.html
Gary: Thanks for the plug! Let me know if the several blogs I’ve written on the subject don’t fully answer the question! Social Media Measurement is an odd blend of attempts to get universal coverage and hidden samples – which makes a single approach challenging. You can use statistical testing to measure the variations in your samples and, where possible (it isn’t at all levels), that’s certainly advisable.
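To make Gary’s point about statistical testing concrete, here is a minimal sketch of one common check: a normal-approximation 95% confidence interval around a sentiment proportion estimated from a coded sample of mentions. The sample size and counts are hypothetical:

```python
import math

# 95% confidence interval (normal approximation) for the share of
# positive mentions in a coded sample. Counts are hypothetical.

sample_size = 1200      # mentions coded in the sample
positive = 420          # of those, coded as positive sentiment

p_hat = positive / sample_size
z = 1.96                # ~95% two-sided critical value
margin = z * math.sqrt(p_hat * (1 - p_hat) / sample_size)

low, high = p_hat - margin, p_hat + margin
print(f"Positive share: {p_hat:.1%} ± {margin:.1%} "
      f"(95% CI {low:.1%} to {high:.1%})")
```

If the interval is wide relative to the decisions you want to make, that’s the signal to code a larger sample – which is exactly the kind of check that’s possible at some levels of the data and not others.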
Question: When one wants to search and analyze Twitter postings and the topic is very low salience, so likely a very, very small percentage of Twitter mentions in U.S. in a given week, what are the best ways to maximize the amount of Twitter Firehose that you search to catch as many Twitter postings on your low salience topic as possible?
Gary: Depending on your method of access, you might want to start by talking with your vendor (if you’re using a vendor to make the initial data pulls). The initial pull is often tunable. This also speaks to your ability to capture the topic in all its forms. Traditional keyword research of the type often done for long-tail SEO can be useful. There is a range of tools appropriate for this – we’ve also just used scanning tools to pull the text off of sites (client Websites, communities, and competitors) to try and build rich topic profiles. You can also take advantage of wildcards (in some tools) to scan for hashtags that include but are not limited to your topic. Hashtag references are often concatenations of the topic with other words and are nearly always pertinent. Sometimes, too, you have to be creative about what you’re looking for. If, for instance, you’re launching a product that is distinct, you can’t expect to identify potential influencers by targeting the obvious words – they generally won’t have any traction. So you have to look for analogs that might allow you to find and target a reasonable set of influencers.
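The wildcard-style hashtag scan Gary describes can be sketched with a simple regular expression that matches any hashtag containing the topic as a substring, catching concatenations like the topic plus another word. The topic and sample tweets are hypothetical:

```python
import re

# Find hashtags that contain a low-salience topic as a substring,
# mimicking a wildcard scan. Topic and tweets are hypothetical.

topic = "solarroof"
pattern = re.compile(r"#\w*" + re.escape(topic) + r"\w*", re.IGNORECASE)

tweets = [
    "Loving the new #SolarRoof install!",
    "Thread on #solarroofing costs vs. standard shingles",
    "Totally unrelated #roofing post",
]

for tweet in tweets:
    tags = pattern.findall(tweet)
    if tags:
        print(tags, "->", tweet)
```

Here the concatenated tag `#solarroofing` is caught along with the exact `#SolarRoof`, while `#roofing` (which does not contain the topic) is correctly skipped.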
Question: Any views on Netbase, which SAP just partnered with?
Marshall: Yes, it seems like a good partnership. Netbase does a pretty good job at NLP and creating structure and meaning around unstructured social data, and rather than SAP trying to build that (or buy Netbase, which is an option) they just partnered with them.
Scott: Netbase is doing some really interesting stuff, especially when it comes to Netnography (see www.netnography.com). I think the partnership with SAP will be good because I know that the company is putting a lot of energy into understanding their own segmentation better. We are doing some work for them right now. SAP is also making a big push in mobile analytics and would probably pull Netbase into that as well.
Question: Gary, perhaps you could ask each speaker to summarize which tool they think is strongest in each of the three key use cases you’ve outlined?
Marshall: Here’s a list of companies to consider:
- For PR Effectiveness – I’d say mPACT and Cision.
- For Consumer Sentiment – I would recommend NetBase for its NLP capabilities.
- For Social Campaign Effectiveness – Unified (once it launches)
Gary: Here’s mine:
- For PR Effectiveness: NMIncite – though it does a poor job of identifying influencers, the segmentation is excellent for tracking them.
- For Consumer Sentiment: Clarabridge and Crimson Hexagon – though we haven’t gotten to use Crimson Hexagon as much as we’d really like.
- For Social Campaign Effectiveness: This is a tough one. Most of the new management tools provide some integrated reporting – but I think that really good effectiveness measurement demands that level of reporting plus Web analytics, plus traditional listening configured for the purpose, and maybe CRM-based extracts at the individual level as well (we sometimes analyze Facebook campaigns by extracting all the individuals and looking at their pre/post behavior).
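The individual-level pre/post analysis mentioned above can be sketched simply: for each person touched by a campaign, compare average activity before and after the campaign, and report the lift. The individuals, field names, and counts below are hypothetical:

```python
from statistics import mean

# Pre/post comparison for individuals extracted from a campaign.
# All records and field names are hypothetical placeholders.

individuals = [
    {"id": "u1", "pre_visits": 2, "post_visits": 5},
    {"id": "u2", "pre_visits": 0, "post_visits": 1},
    {"id": "u3", "pre_visits": 4, "post_visits": 4},
]

pre = mean(p["pre_visits"] for p in individuals)
post = mean(p["post_visits"] for p in individuals)
lift = post - pre

print(f"Avg visits pre: {pre:.2f}, post: {post:.2f}, lift: {lift:+.2f}")
```

In practice you’d pull these records from CRM extracts and Web analytics rather than hand-built dictionaries, and you’d want a comparison group before attributing the lift to the campaign – but the shape of the analysis is the same.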