More on New Data Sets (A Nod to Gary King)

The previous post on curation referred to the enormous volume of data that continues to become available for processing. Among the many useful bits of info in this presentation by Harvard prof Gary King is the run-down he gives on slides 21 and 22.

While his approach is certainly academic, the company that has sprung from his work, Crimson Hexagon, is evidence of commercial application.

The other key point for me is the need for tech and human methods of analysis to be used in a complementary way. So much momentum seems to be toward tech solutions that the human role (the curator, if you will) is paid too little heed. In fact, that’s how the linked presentation ends:

“Will we wait to be replaced?” Or will we adapt?


What Can Market Research Learn from Journalism?

Market research and journalism have a lot in common. The requisite curiosity, persistent investigation, and knack for storytelling are threads that connect my own biography—from college newspaper editor to history grad student to researcher by trade.

The journalist, of course, occupies a pop culture space of much greater visibility; and the savvy researcher can look there for hints of the future.

This recent Mashable piece on curation struck a particular chord for me. The basic premise is this:

“Over the past few weeks, many worries about the death of journalism have, well, died. Despite shrinking newsrooms and overworked reporters, journalism is in fact thriving. The art of information gathering, analysis and dissemination has arguably been strengthened over the last several years, and given rise and importance to a new role: the journalistic curator.

With a torrent of content emanating from innumerable sources (blogs, mainstream media, social networks), a vacuum has been created between reporter and reader — or information gatherer and information seeker — where having a trusted human editor to help sort out all this information has become as necessary as those who file the initial report.”

There are some important parallels, chief among them an increasing load of content that is user-generated, free, and growing exponentially.

Like consumers of the news, many businesses are ill-equipped to manage the torrent of information that is flowing their way, learning on the fly how to use the bevy of new tools available to help manage it. Like consumers of the news, businesses increasingly expect cheap information and have a hard time evaluating the quality of the source.

In terms of skill set, the best researchers should be able to incorporate curation pretty seamlessly into their portfolio. The very words “curate” and “research” suggest the combination of art and science that has defined market research as a discipline. The ability to apply quantitative discipline to qualitative learning (and conversely, to explore nuances of data in an unstructured way for deeper insights) is critical to using Big Data.

But it requires a shift in orientation, and a different paradigm of what you can and cannot control.

The shift in journalism has taken a painful toll on many employees as big media companies struggle to adapt. The research industry historically operates a bit behind the curve.

What else can researchers learn from what’s happening in the media biz?



No Man’s Blog on Problems with Social Media Monitoring

Nice post here by Asi Sharabi on some of the shortcomings of social media monitoring services.  If you sum up his analysis as “they don’t work like they’re supposed to, they take too much time, and they’re too expensive,” it comes off as a little trite.  He didn’t sum it up like that, of course, but that’s the gist of it.  I haven’t seen as many services in action as he has, but I don’t think he’s too far off base.

The comments here are worth reading, because a few things emerge.

1) He gets little argument.  Comments tend to agree with his analysis.  It’s unclear whether it’s surprised agreement or recognition of a familiar but unarticulated sentiment; I think a little of both.

2) As Jeff Scott points out, given how many monitoring companies have responded, “we know they’re drinking their own kool-aid.”

3) It’s a nascent technology.  Sure, there will be growing pains, but the data are there, and knowing how to use them is going to be important.  You’ve got to start somewhere.

4) What really sparked my interest was the observation of the intersection between market research and social media monitoring, both in the problems they address and the tools they use.

How the value proposition can be worked out remains to be seen, but I think monitoring services have utility, even in their current state.  I have not seen a service that will autonomously deliver meaningful results, but in the hands of a capable user, even the current technology—with all its limitations—can be valuable.

I thought Brian Johnson really nailed it:

You didn’t really state what your goals in using those platforms. While there are use cases for monitoring (crisis management, directly engaging influencers, etc), I believe that the real value is thinking of social media like a massive dataset, much like what CRM has evolved to. It’s an incredible dataset, when you sit down and think about it, offering based on the sheer volume, authenticity, and real-time nature of the data. There is amazing value to be had by performing in-depth analytics on that data and using it to inform strategic marketing decisions. Far greater than simply counting how many times your brand is mentioned, and whether it’s good or bad.

Two examples from a couple of local (Kansas City) vendors (who weren’t mentioned in Asi’s hit list).  I’ll say upfront that these reflections are impressionistic.  I don’t have deep experience with either, but I’ve learned a bit about them, and here’s what I was left with:

Social Radar/Infegy’s game is mass data aggregation.  By collecting feed data (as in RSS) and warehousing it, they are able to maintain a remarkably consistent database.  Will it still include spam?  Sure.  Will it miss some important (non-RSS) conversations?  Yep.  But the theory is that you’re accumulating so much information, in so consistent a manner, that trends over time will still be meaningful.  It’s not foolproof.  I’m unaware of any studies indicating that variation in spam conversation correlates with actual conversation, but it seems a reasonable hypothesis.  There will be error in any dataset; acknowledgment and consistency seem a step in the right direction.  Though such monitoring wouldn’t meet all needs, I can see how it would meet some.  The other advantage of the warehousing approach is that from the moment you start, you can look at historical data.  Social Radar seems like a good approach to quantitative analysis.
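To make the warehousing idea concrete, here’s a minimal sketch of how storing feed items once, in a consistent schema, lets you read a weekly mention trend even with some noise left in.  This is my own illustration, not Infegy’s actual pipeline; the table layout, sample items, and keyword matching are all assumptions.

```python
import sqlite3

# Hypothetical warehouse of feed items; schema and data invented for the sketch.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE feed_items (
        published TEXT,   -- ISO date the item appeared in the feed
        source    TEXT,   -- feed the item came from
        text      TEXT    -- raw item text, spam and all
    )
""")

sample_items = [
    ("2009-06-01", "blog-a",  "Loving the new AcmeCola flavor"),
    ("2009-06-03", "forum-b", "AcmeCola coupon codes here!!!"),  # likely spam
    ("2009-06-10", "news-c",  "AcmeCola announces sponsorship"),
    ("2009-06-12", "blog-a",  "AcmeCola tastes flat to me"),
]
conn.executemany("INSERT INTO feed_items VALUES (?, ?, ?)", sample_items)

# Weekly mention counts for a (made-up) brand term. The bet is that collection
# is consistent enough that week-over-week movement is meaningful, even if
# some of what's counted is noise.
rows = conn.execute("""
    SELECT strftime('%Y-%W', published) AS week, COUNT(*) AS mentions
    FROM feed_items
    WHERE text LIKE '%AcmeCola%'
    GROUP BY week
    ORDER BY week
""").fetchall()

for week, mentions in rows:
    print(week, mentions)
```

The value here is less any single week’s count than the fact that every week is counted the same way, so the historical trend is there from day one.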

Spiral 16’s approach, on the other hand, achieves consistency by limiting the universe.  Whether their precise algorithm for identifying the right ecosystem is the best one, I’m not sure; but the case for doing a restricted search is compelling.  Find the relevant web.  Even if you miss some sites and conversations, monitoring an 80% accurate ecosystem can have value.  And categorizing types of conversations (traditional media, blog, video, etc.) seems especially useful.  There may be some back-end work to make sure these categorizations are accurate, but once they’re defined, there’s a lot of value in knowing how conversation about your brand is happening.
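As a rough illustration of the restricted-universe idea (my own sketch, not Spiral 16’s algorithm), imagine a curated list of relevant sites, each tagged with a conversation type, and a tally of brand mentions by type; the site names and mentions below are invented.

```python
from collections import Counter

# Hypothetical "relevant web" for a brand, tagged by conversation type.
ecosystem = {
    "industrynews.example.com": "traditional media",
    "fanblog.example.org": "blog",
    "video.example.net": "video",
}

# Invented mentions pulled from a crawl; one falls outside the universe.
mentions = [
    {"site": "fanblog.example.org", "text": "Great review of the brand"},
    {"site": "video.example.net", "text": "Unboxing video"},
    {"site": "spamfarm.example.biz", "text": "cheap pills"},  # ignored
    {"site": "industrynews.example.com", "text": "Brand earnings story"},
]

# Keep only conversation inside the defined ecosystem, then break it down
# by conversation type.
by_type = Counter(
    ecosystem[m["site"]] for m in mentions if m["site"] in ecosystem
)
print(by_type)  # counts of in-ecosystem mentions by conversation type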


The Role of PR in Social Media Outreach

Here’s another perspective on disciplinary responsibility for social media (via @TDefren via @Aerocles on Twitter).  There’s a four-point argument for social media marketing as the natural domain of PR professionals.  The most concise summation can be found in the comments (thanks, Stuart Foster):

If we do our jobs right, Social media will no longer exist. It will simply be known as PR (and live within that category). The concept of social media belongs to outreach, engagement and corporate communications strategy.

The tactical parts? Can be handled by customer service/community managers. The overarching control should go to PR though.

Okay, I said “another perspective,” but really this is another angle to the same perspective, which is that—for all the newness of social media—traditional marketing disciplines nurture the qualities necessary to effectively manage a new channel.  The outreach and communication functions are as natural to PR as the monitoring function is to market research, which I think coincides with the research vs. action divide Whitney brought up in her comment.  Aerocles doesn’t touch the monitoring aspect—not sure if that’s intentional or not.


The Role of the Researcher in Social Media Monitoring

Spiral16 put this poll on their blog recently:

Who do you think is best suited to handle social media monitoring for a company?

  • Social Media Software Provider
  • Public Relations Firm
  • Advertising/Digital Agency
  • Direct Company (MarCom or Other Department)
  • Social Media Consultant/New Media Agency

This is a pertinent question, not just for brands and brand managers trying to settle on a vendor, but for “integrated” agencies where the responsibilities in the Spiral16 poll may or may not be clearly assigned. Leaving aside the issue of disciplinary silos vs. multidisciplinary collaboration, there seems to be an obvious omission from the prompted answer set: a market research consulting company.

Clearly, I come at this with a certain bias. And I know certain natural characteristics of the market researcher—a cautious approach to new methods, a generally poor track record of marketing themselves—have helped to cede this ground to ad agencies, PR firms, and new media gurus.

But the skills of the researcher really dovetail nicely with social media monitoring (SMM) in several ways that are clearly missing from the conversation.

  1. SMM is a measurement tool. Customer satisfaction, attitude and usage studies, market tracking…all are regular tools in the researcher’s kit. The biggest difference with social media is that the researcher is not instrumental in generating the content. This difference has some implications for analysis, but the overall analytical framework is similar. And experience with large, ongoing datasets—as well as traditional methods of brand tracking—can only help make SMM more effective.
  2. SMM is a form of listening. Your direct marketer, your advertising creative, your PR pro—their job has traditionally been to deliver a message (and hope it translates to action on the part of the consumer). The researcher’s job is to listen to what people say (and hope it translates into action on the part of the brand).
  3. SMM requires a deep understanding of how to use quantitative and qualitative data. The amount of data covered by SMM services is unfathomable. Data sets are where the researcher operates. The business objective dictates research methodology, and the same is true for SMM, which can yield an absolute number of brand mentions per week, a single serious complaint about a product, or a detailed review by a heavy influencer—depending on the approach (and the SMM service). Market research has always involved these kinds of negotiations.
  4. SMM demands you know who is talking. Good research data always begins with knowing your universe and understanding your sample. The biggest flaw in new media studies (and sometimes monitoring services) is a lack of transparency about where the data come from. You may have 80% positive sentiment or have doubled your web chatter from last month. But 80% of what? And who is chatting? The sketch below shows how much that denominator matters.
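To illustrate why the denominator question matters, here is a toy calculation, with numbers invented purely for the example, showing how the same count of positive mentions reads very differently depending on how much of the real conversation the monitoring service actually captured.

```python
# Hypothetical figures, purely for illustration.
positive = 40              # positive mentions the service found

indexed_universe = 50      # total mentions the service indexed
true_universe = 130        # total mentions that actually exist (unknown to the tool)

print(f"Reported: {positive / indexed_universe:.0%} positive")
# If everything the service missed happened to be negative,
# the real share of positive conversation could be far lower:
print(f"Worst case: {positive / true_universe:.0%} positive")
```

Reported sentiment looks like 80% positive, but against the full universe the floor is closer to 31%; without transparency about the universe, you can’t tell which figure you’re looking at.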

There are a lot of smart people in social media, without research backgrounds, who are dealing with SMM the right way. And those in the research business, some late to the game, have their own learning curve. But overlooking the experience of research professionals stands to make that curve steeper for everyone.


The Art and Science of Market Research

Here’s a question (and answer) posed by the thoughtful Gavin Johnston, Chief Anthropologist at Two West, on a LinkedIn market research message board:

Q: Do demographics and psychographics obscure our understanding of our customers, users, clients?

A: I understand that segmentation and quantification are essential to a point, but do they also provide an inherent deviation from understanding context and the complexities of human interaction? I sometimes think that businesses are inclined to obsess over the trees in the forest, so to speak, without thinking about the linkages between the trees, the forest, the entire ecosystem. People rarely function exclusively as individuals, but as part of a shared socio-cultural system. Does the emphasis on neatly categorizing people obscure the larger system?

[After I posted an answer, I thought it seemed like as good a place as any to start beta-testing this WordPress software.  So here goes.]

That’s a pretty big question. First, any type of research or observation only reveals a part of the ecosystem…it’s the researcher’s job to relate the specific study result to the whole. But the question posits a more actively negative function: it’s one thing to reveal only a part of reality, and another to obscure it.

Businesses absolutely obsess over statistical trees. Look at the way good measurables might point toward short-term stock performance but not necessarily reflect a sound business plan and a management team poised for long-term growth. In another field entirely, Bill Simmons had a column recently bemoaning the state of basketball statistics, wondering why so many intangibles remain uncaptured despite the advances of baseball’s sabermetrics. Basketball skill, he contends, can be measured much better by a trained observer than by piles of statistics.

Or check out this Nation column decrying the use of evolutionary psychology and (gasp!) quantitative measurement in literary criticism: http://tinyurl.com/ll4e4z. Sample quote: “[The literary Darwinists’] goal is not only to reseat literary studies on a basis of evolutionary thinking but to found a “new humanities,” as the title of one book puts it, on scientific principles: empirical, quantitative, systematic, positivist, progressive.”

All of the increased computing power has put enormous quantities of information at our disposal, and with it the expectation that we can find the little bit of information we need when we need it. But that also leads to forcing square pegs into round holes. On a practical level, that might mean taking a Claritas Prizm segmentation and imposing it on your customer base. If you’re looking to do a direct mail campaign, maybe that’s okay. If you’re trying to gain new insight, probably not.

But where technology has confused us, I’m hopeful it provides an answer, or at least one answer. The field of social network theory and the software developed to monitor social media are both concerned with linkages, ecosystems, and relationships. Only now are we starting to apply those models to more conventional data sets.

The other solution is much more low-tech–good research is part science, surely, but part art. The tension between the humanities and the social sciences is an interesting one. Art (and the humanities) are a harder sell, for sure. Just ask a high school senior. But I don’t think they’re less valuable. Does an overemphasis on science “obscure” one’s openness to art? Perhaps. But data doesn’t segment itself, even with SPSS. You need a person at the other end who can piece together the story, paint the whole picture. That can’t always be done with numbers alone. The trick, as with the Old Masters, is finding a good patron.

