Archive for Market Research

Video Analytics Are the New Future

In a post a couple weeks ago, I referenced the SKC Tech Summit as an example of a conference outside my field by trade but inside my field of interest. Though it wasn’t an explicit theme, the idea of a “video revolution” certainly loomed large. Topics like telemedicine and virtual education were highly visible throughout the event.

Market research and customer insight were not—although companies like TASKE emphasized an analytical approach to call center software that included rudimentary service satisfaction surveys. Not market research, per se, but a good indication of how integrated insights are adopted by non-traditional market research departments.

Video has been a staple of focus groups for years, of course, and focus group facilities have mostly used advancing technology to help present the work to those who cannot be there in person rather than to change the way the work is done. QualVu does a great job innovating around video—exploring new techniques and making it more accessible, more diverse, and more entertaining on the report-out side. But as the name suggests, it continues to be primarily qualitative.

What struck me at the SKC conference was the potential for qual-quant hybrid techniques through the use of video analytics. There are some facial recognition and eye tracking software programs out there that edge towards this idea, but I’m talking about teaching computers to analyze visual motion data, rather than just text/voice or still images.

Considering that text analytics and still image analysis are still immature, I expect video analytics to be at least two technology leaps away—but it’s not too early to get in position to lead in this area.
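
To make “analyzing visual motion data” a little more concrete, here is a minimal sketch of the most rudimentary building block, frame differencing, written in Python with OpenCV. The file name and the activity threshold are invented for illustration; this is nowhere near the sophistication a real study would require.

```python
import cv2  # OpenCV; assumed available

# Hypothetical footage from one kitchen camera (file name is made up)
cap = cv2.VideoCapture("kitchen_cam_01.mp4")
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read video")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

active, total = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)                  # pixel-level change vs. last frame
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    moving = cv2.countNonZero(mask) / mask.size     # share of pixels that changed
    if moving > 0.05:                               # arbitrary "something is happening" cutoff
        active += 1
    total += 1
    prev = gray

cap.release()
print(f"Activity detected in {active / max(total, 1):.0%} of frames")
```

Counting changed pixels is trivial; recognizing who opened the pantry and why is the leap that would make this a research tool, and that is the part that is still years away.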

Take large scale in-home ethnography as an example. QualVu does in-home “ethnography” on a small scale. I put ethnography in quotes, because the respondent is pretty interactive with the camera—it’s more about them showing than you observing. And small scale because the analysis is intensive enough (and qualitative enough) that traditional qualitative sample sizes are both effective and practical.

But imagine setting up three video cameras in 1,000 kitchens for a week. Or two days a month in different households for a tracking study. Quant sample size. Multiple cameras to capture different angles. The client could be a CPG company, a grocery store, a cookware company—you name it. Sophisticated visual analytics could potentially tell you:

  • What’s your morning coffee routine?
  • How much time do the kids spend in the kitchen?
  • How many of those fancy knives do you actually use?
  • Are you drinking wine while cooking dinner?
  • Are Ziploc bags a replacement for Saran Wrap or Tupperware?
  • How many trips to the pantry are required for each meal?
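
If recognition at that level ever works reliably, the quant side of the hybrid is comparatively ordinary. Here is a hypothetical sketch, assuming the video pipeline has already reduced the footage to an event log (household, meal, event type), of how the pantry-trips question above might roll up across a thousand households:

```python
import pandas as pd

# Invented event log; in practice this would be emitted by the video pipeline
events = pd.DataFrame([
    {"household": 17, "meal": "2012-03-05 dinner", "event": "pantry_trip"},
    {"household": 17, "meal": "2012-03-05 dinner", "event": "pantry_trip"},
    {"household": 17, "meal": "2012-03-05 dinner", "event": "fridge_open"},
    {"household": 42, "meal": "2012-03-05 dinner", "event": "pantry_trip"},
    {"household": 42, "meal": "2012-03-06 dinner", "event": "pantry_trip"},
])

pantry = events[events["event"] == "pantry_trip"]
trips_per_meal = pantry.groupby(["household", "meal"]).size()  # trips per household-meal
print(trips_per_meal.mean())  # average pantry trips per meal across the sample
```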

Yes, you can ask things like “Do you clean as you go or wait until the meal is finished?” on a survey, but that sort of misses the point of ethnography. You don’t always know the right question to ask. It’s less about testing hypotheses than uncovering latent needs and motivations. Pervasive video makes observation of people in their natural environments possible in a way that it has never been before.

“Pervasive video” immediately suggests Big Brother (uh, maybe not this one) and privacy concerns. Big Brother or no, privacy has changed. I won’t even get into the respondent confidentiality aspect that consumes much of the traditional market research world. The fact is, individuals have shown an increasing willingness to broadcast their lives, and that willingness shows no sign of abating, regardless of what Chuck D thinks.

The technology, on the other hand, is not there yet. From talking to people at Cisco, video analytics is on their radar, but not yet a high priority. The more pressing analytic need is to search for text in video, search-engine style. This technology—think automated transcription—is available but still a bit clunky. And of course, creating the broadband and hardware infrastructure to easily facilitate web-based video transmission is a project that is also just underway.

But as we know, technology moves quickly, and I have no doubt video analytics are coming. A five- to ten-year time horizon would not surprise me.

And once we have sophisticated visual analytics, the Big Data explosion will reach another order of magnitude. Potential applications that may be easier to imagine than voluntary in-home surveillance include:

  • In-store video cameras that can provide comprehensive insight into the shopping process and, combined with web analytics, give retailers great insight into online/offline integration
  • Evaluating B2B performance and client relationships as videoconferencing skyrockets
  • Mobile streaming/life casting—which is already happening but will become more comprehensive as video becomes easier and cheaper to upload

And for those outside the Kansas City area, know that Google is launching a Fiber network here in 2012 that promises to make uploading (and downloading) video much, much faster.

What other applications do you see for insights and analytics as video becomes more prevalent? How far off do you think we are from video analytics?



Authenticity Is a Two-Way Street

Eric Melin and Mike Brown have a couple good posts on Scott Monty’s visit to SMCKC last week, which was really a pleasure to be a part of. He deftly mixed in social media truisms (“business strategy, not social strategy”) with original and inspiring campaign executions.

But a couple things struck me beyond the straight social components of the presentation.

First, Scott was the consummate brand representative for Ford and served as an unusual example of the critical relationship between social media and authenticity.

Usually, “authenticity” in social media means that a brand lets its hair down and interacts with people as people. It means cutting the corporate brand-speak and actually engaging. As Scott himself pointed out, people want to be spoken to like human beings.

Neither Average Joe nor hipster guru

Still, you never got the feeling that Scott Monty was the average guy, just keeping it real with the customers. Nor did you feel like he was some hipster creative marketing guru. You felt like he was Ford—a precise blend of heritage, comfort, forward thinking, and approachability. But also that he was genuinely, authentically Scott Monty.

Companies always want to hire good people, but in a social world, hiring the kind of people you want to be is more important than ever.

Another thing that stood out is how Ford uses conventional market research tools in addition to digital metrics to measure the effectiveness of social media campaigns and understand how they work.

Surveys may be out of vogue in a world of sentiment ratings and Klout, but Ford measures trust, quality perception and favorability ratings to understand how social media can have an impact beyond the sliver of its customers who follow @FocusDoug on Twitter or Like him on Facebook.

If social media truly is intended to support broader business strategy, it’s important to take a holistic view of insights and analytics, and it’s great to see Ford really taking that to heart.


Three New (Old) Reflections on Data and Analytics

The advance of technology is often perceived as the new continually replacing the old, both in terms of products and human expertise. Implicit in this Inc. Trend Watcher piece is the ability of technology to foster mentorship and knowledge transfer between “old school” experience and young talent. (h/t to @RockhillStrat)

Key quote from “NextGen” market researcher Tom Anderson reflecting on the 2011 MRIA Conference: “We need to become more than traditional researchers while retaining the methodological principles which have served us well for many years.” (emphasis mine) These principles are a huge value-add to new wave analytic approaches, but the key is effectively communicating that value.

@joegermuska posted this critique of McKinsey’s health care study with a caution to journalists to be more stats savvy. But the caution is warranted for anyone who interprets and relays data. There is a lot of upside to Big Data, but a lot of potential for misinformation as well. And you don’t necessarily need to be a stats geek to navigate data, but you do need to have a firm grasp of what to look for to make sure your data is saying what you think it’s saying.


Curation, The Mix Tape, and Digital Immigrants

I’ve been preoccupied with curation lately, and it has struck me in a number of different contexts—like the other day listening to Pavement’s Slanted and Enchanted.

My first introduction to Pavement was on a tape a college roommate made—an early ‘90s indie rock compilation. This guy was a careful master of the mix tape, sometimes by label, sometimes by artist, sometimes by mood. Everyone used to make mix tapes—a cultural trope that defined a generation.

People still make mix tapes, of course, but they’re no longer tapes. I compiled some songs for my family last Christmas—downloaded a handful of mp3s I didn’t already own, click-and-dragged together a playlist, burned it onto several discs. The whole process took maybe an hour.

I don’t know how carefully kids these days organize their musical tastes, how much they tailor the compilation according to their audience or the desired results. There may be just as many quality “mix tapes” out there now as there ever were. But they float in a much larger sea of playlists, Pandora stations, and iPod shuffles that make things easy but not necessarily better.

The ability to easily access whatever we need, or to explore something new without even knowing what we want—these opportunities offer endless possibility. But they don’t necessarily reinforce the discipline of sorting, prioritizing, organizing, and composing.

These are the challenges of the information age: identifying high-quality inputs, matching them to the right needs, and presenting it all in a way that makes it relatable to your audience.

And there is no shortcut to cultivating the skills needed to meet them.


What Can Market Research Learn from Journalism?

Market research and journalism have a lot in common. The requisite curiosity, persistent investigation, and knack for storytelling are threads that connect my own biography—from college newspaper editor to history grad student to researcher by trade.

The journalist, of course, occupies a pop culture space of much greater visibility; and the savvy researcher can look there for hints of the future.

This recent Mashable piece on curation struck a particular chord for me. The basic premise is this:

“Over the past few weeks, many worries about the death of journalism have, well, died. Despite shrinking newsrooms and overworked reporters, journalism is in fact thriving. The art of information gathering, analysis and dissemination has arguably been strengthened over the last several years, and given rise and importance to a new role: the journalistic curator.

“With a torrent of content emanating from innumerable sources (blogs, mainstream media, social networks), a vacuum has been created between reporter and reader — or information gatherer and information seeker — where having a trusted human editor to help sort out all this information has become as necessary as those who file the initial report.”

There are some important parallels, the most important of which is an increasing load of content that is user generated, free, and growing exponentially.

Like consumers of the news, many businesses are ill-equipped to manage the torrent of information that is flowing their way, learning on the fly how to use the bevy of new tools available to help manage it. Like consumers of the news, businesses increasingly expect cheap information and have a hard time evaluating the quality of the source.

In terms of skill set, the best researchers should be able to incorporate curation pretty seamlessly into their portfolio. The very words “curate” and “research” suggest the combination of art and science that has defined market research as a discipline. The ability to apply quantitative discipline to qualitative learning (and conversely, to explore nuances of data in an unstructured way for deeper insights) is critical to using Big Data.

But it requires a shift in orientation, and a different paradigm of what you can and cannot control.

The shift in journalism has taken a painful toll on many of the employees as big media companies struggle to adapt. The research industry historically operates a bit behind the curve.

What else can researchers learn from what’s happening in the media biz?



The Role of the Researcher in Social Media Monitoring

Spiral16 put this poll on their blog recently:

Who do you think is best suited to handle social media monitoring for a company?

  • Social Media Software Provider
  • Public Relations Firm
  • Advertising/Digital Agency
  • Direct Company (MarCom or Other Department)
  • Social Media Consultant/New Media Agency

This is a pertinent question, not just for brands and brand managers trying to settle on a vendor, but for “integrated” agencies where the responsibilities in the Spiral16 poll may or may not be clearly assigned. Leaving aside the issue of disciplinary silos vs. multidisciplinary collaboration, there seems an obvious omission from the prompted answer set: a market research consulting company.

Clearly, I come at this with a certain bias. And I know certain natural characteristics of the market researcher—a cautious approach to new methods, a generally poor track record of marketing themselves—have helped to cede this ground to ad agencies, PR firms, and new media gurus.

But the skills of the researcher really dovetail nicely with social media monitoring (SMM) in several ways that are clearly missing from the conversation.

  1. SMM is a measurement tool. Customer satisfaction, attitude and usage studies, market tracking…all are regular tools in the researcher’s kit. The biggest difference with social media is that the researcher is not instrumental in generating the content. This difference has some implications for analysis, but the overall analytical framework is similar. And experience with large, ongoing datasets—as well as traditional methods of brand tracking—can only help make SMM more effective.
  2. SMM is a form of listening. Your direct marketer, your advertising creative, your PR pro—their job has traditionally been to deliver a message (and hope it translates to action on the part of the consumer). The researcher’s job is to listen to what people say (and hope it translates into action on the part of the brand).
  3. SMM requires a deep understanding of how to use quantitative and qualitative data. The amount of data covered by SMM services is unfathomable. Data sets are where the researcher operates. The business objective dictates research methodology, and the same is true for SMM, which can yield an absolute number of brand mentions per week, a single serious complaint about a product, or a detailed review by a heavy influencer—depending on the approach (and the SMM service). Market research has always involved these kinds of negotiations.
  4. SMM demands you know who is talking. Good research data always begins with knowing your universe and understanding your sample. The biggest flaw in new media studies (and sometimes monitoring services) is a lack of transparency about where the data come from. You may have 80% positive sentiment or have doubled your web chatter from last month. But 80% of what? And who is chatting?
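
A trivial sketch of why that last point matters, with made-up numbers: the same “80% positive” can describe ten mentions or a thousand, and the base should travel with the percentage.

```python
# Invented counts for illustration only
weeks = {
    "Week 1": {"positive": 8, "negative": 2},
    "Week 2": {"positive": 800, "negative": 200},
}

for label, counts in weeks.items():
    base = sum(counts.values())
    print(f"{label}: {counts['positive'] / base:.0%} positive (base = {base} mentions)")
```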

There are a lot of smart people in social media, without research backgrounds, who are dealing with SMM the right way. And those in the research business, some late to the game, have their own learning curve. But overlooking the experience of research professionals stands to make the curve sharper for everyone.


The Art and Science of Market Research

Here’s a question (and answer) posed by the thoughtful Gavin Johnston, Chief Anthropologist at Two West, on a LinkedIn market research message board:

Q: Do demographics and psychographics obscure our understanding of our customers, users, clients?

A: I understand that segmentation and quantification are essential to a point, but do they also provide an inherent deviation from understanding context and the complexities of human interaction? I sometimes think that businesses are inclined to obsess over the trees in the forest, so to speak, without thinking about the linkages between the trees, the forest, the entire ecosystem. People rarely function exclusively as individuals, but as part of a shared socio-cultural system. Does the emphasis on neatly categorizing people obscure the larger system?

[After I posted an answer, I thought it seemed like as good a place as any to start beta-testing this WordPress software.  So here goes.]

That’s a pretty big question. First, any type of research or observation only reveals a part of the ecosystem…it’s the researcher’s job to relate the specific study result to the whole. But the question posits a more actively negative function–it’s a bit different to obscure reality than only to reveal a part of it.

Businesses absolutely obsess over statistical trees. Look at the way good measurables might point towards short-term stock performance, but not necessarily reflect a sound business plan and management team poised for long-term growth. In another field entirely, Bill Simmons had a column recently bemoaning the state of basketball statistics, wondering why so many intangibles remain uncaptured, despite the advances of baseball’s sabermetrics. Basketball skill, he contends, can be measured much better by a trained observer than by piles of statistics.

Or check out this Nation column decrying the use of evolutionary psychology and (gasp!) quantitative measurement in literary criticism: http://tinyurl.com/ll4e4z. Sample quote: “[The literary Darwinists’] goal is not only to reseat literary studies on a basis of evolutionary thinking but to found a “new humanities,” as the title of one book puts it, on scientific principles: empirical, quantitative, systematic, positivist, progressive.”

All of the increased computing power has put enormous quantities of information at our disposal, and with it, the expectation we can find the little bit of information we need when we need it. But that also leads to forcing square pegs into round holes. On a practical level, that might mean taking a Claritas Prizm segmentation and imposing it on your customer base. If you’re looking to do a direct mail campaign, maybe that’s okay. If you’re trying to gain new insight, probably not.

But where technology has confused us, I’m hopeful it provides an answer, or at least one answer. The field of social network theory and the software developed to monitor social media are both concerned with linkages and ecosystems and relationships. Only now are we starting to apply those models to more conventional data sets.

The other solution is much more low-tech–good research is part science, surely, but part art. The tension between the humanities and the social sciences is an interesting one. Art (and the humanities) are a harder sell, for sure. Just ask a high school senior. But I don’t think they’re less valuable. Does an overemphasis on science “obscure” one’s openness to art? Perhaps. But data doesn’t segment itself, even with SPSS. You need a person at the other end who can piece together the story, paint the whole picture. That can’t always be done with numbers alone. The trick, as with the Old Masters, is finding a good patron.
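
To put a fine point on “data doesn’t segment itself,” here is a minimal sketch with invented data, using scikit-learn’s k-means as a stand-in for whatever your stats package runs. The algorithm hands back anonymous cluster labels; deciding whether they mean anything, and naming them, is still the researcher’s job.

```python
import numpy as np
from sklearn.cluster import KMeans

# Invented respondent data: purchase frequency, avg basket ($), coupon use rate
# (real work would standardize these variables before clustering)
X = np.array([
    [12, 85.0, 0.10],
    [11, 90.0, 0.05],
    [ 2, 30.0, 0.60],
    [ 3, 25.0, 0.55],
    [ 6, 55.0, 0.30],
    [ 7, 60.0, 0.25],
])

model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(model.labels_)           # e.g. [0 0 1 1 2 2] -- this is where the science stops
print(model.cluster_centers_)  # raw averages per cluster, not a story

# The "art": the researcher decides what, if anything, these groups mean
personas = {0: "loyal heavy shoppers", 1: "deal hunters", 2: "middle of the road"}
```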
