Maintaining Quality in Online Data
Commentary by Adam Weinstein, co-CEO, Full Circle Research Co., with collaborator John Woelfel, Ph.D., President, Woelfel Research
A cornerstone of market research is high-quality data. When asked which factors were important when choosing a new research methodology, 95% of those surveyed in the 2017 Quirk’s Salary Survey & Corporate Study said quality data was “extremely important” (70%) or “very important” (25%). According to this same study, online research is meeting this expectation. More than nine out of ten respondents (93%) say that online surveys are “very effective” (18%) or “effective” (75%) in “providing quality data and insights.”
This is good news for our entire industry, but it does not mean that we can rest on our laurels. Understanding the components of quality data, as well as the strategies to ensure it, helps continue this positive trajectory.
High Quality Online Data: Components
Respondents to the 2017 Quirk’s Salary Survey & Corporate Study were asked, “How do you define poor quality data?” The results can be used to identify some of the main components of high quality data. In looking through the comments, three areas stood out as impacting survey quality: (1) sample composition; (2) survey responses; and (3) questionnaire content.
- Sample Composition
When describing contributors to bad quality, respondents said things such as, “…small number of actual target market responses,” “…sample is highly skewed toward one end of the demographic spectrum” and “…unrepresentative sample, lack of adequate sample size.” Eliminating these issues boils down to selecting a large enough sample of the target audience for the study. Another challenge raised was duplicate respondents. One simple answer here is employing the right verification tool.
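The duplicate-respondent check mentioned above can be illustrated with a minimal sketch. The attributes hashed here (IP address, user agent, screen size) and the function names are illustrative assumptions; commercial verification tools combine far more signals than this.

```python
import hashlib

def fingerprint(ip: str, user_agent: str, screen: str) -> str:
    """Hash a few device attributes into a single fingerprint string.
    (Illustrative attributes only; real tools use many more signals.)"""
    raw = "|".join([ip, user_agent, screen])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

def filter_duplicates(respondents):
    """Keep only the first respondent seen for each fingerprint."""
    seen, unique = set(), []
    for r in respondents:
        fp = fingerprint(r["ip"], r["user_agent"], r["screen"])
        if fp not in seen:
            seen.add(fp)
            unique.append(r)
    return unique
```

Two entries with identical attributes collapse to one, so the same person cannot enter the sample twice under this (simplified) definition of identity.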
- Survey Responses
The respondents consistently point out five aspects of survey responses which compromise data quality:
- Inconsistent responses
- Speeding through the survey
- Poor quality open-end responses
- High levels of missing data
- Straight-line responses
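Several of the response problems above can be caught mechanically after fieldwork. The sketch below flags speeders and straight-liners; the thresholds (one third of the median completion time, 90% identical grid answers) are assumptions chosen for illustration, not industry standards.

```python
def flag_response(answers, duration_sec, median_duration_sec,
                  speed_ratio=0.33, straightline_threshold=0.9):
    """Flag a completed survey as suspect if it was answered
    suspiciously fast or if nearly all grid answers are identical.
    Thresholds are illustrative assumptions."""
    flags = []
    # Speeding: finished in well under the typical completion time.
    if duration_sec < speed_ratio * median_duration_sec:
        flags.append("speeder")
    # Straight-lining: one answer choice dominates the grid.
    if answers:
        top_share = max(answers.count(a) for a in set(answers)) / len(answers)
        if top_share >= straightline_threshold:
            flags.append("straight-liner")
    return flags
```

A respondent who picks the same scale point ten times and finishes in a fifth of the median time would be flagged on both counts; a varied, normally paced complete passes clean.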
The reality is that the responsibility to deliver quality data falls on more than one role. Programming houses, validation/verification providers, data collectors and research clients are active participants in the pursuit of quality responses and must work together to achieve positive outcomes. This holistic approach covers everything from advanced digital fingerprinting to limiting survey length and more.
- Questionnaire Content
Clearly, poorly crafted questionnaires will yield lower-quality data. Similarly, poorly programmed questionnaires may also impact data quality. Many articles by reputable full-service agencies and programming houses address this exact subject and we recommend seeking them out for sound guidance.
High Quality Online Data: Strategies
Before outlining a variety of proven strategies to increase and protect data quality, we feel it is pertinent to discern how we differentiate between “quality” and “fraud” in regard to online research.
“Quality” refers to the overall throw-out rate (ideal is a low percentage); “fraud” refers to the suspect data within that which is thrown out (a low percentage is ideal here, as well). Respondents who speed through surveys, straight-line, deliver poor open-end answers, etc., are responsible for this suspect data.
As sample providers, we can employ a variety of proven strategies to lower the incidence of fraud (thereby increasing quality)—even below accepted levels. Some worth considering are listed here.
- Evolved Sampling Strategy
A provider’s sampling technique is as vital as fraud checks when it comes to collecting consistent, balanced data. Our recommendation is to employ stratification across your sample to mimic census percentages. This can be done regardless of a respondent’s platform, device, browser or even pixel size, and should be implemented prior to survey start. Stratification ensures that if, for example, the U.S. population is 51% female, the data set won’t reflect 70%. Note that proper sample management techniques require consistency in sampling. The result is data that is not skewed: a true aggregate view of the market or audience being targeted.
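Translating census percentages into fielding quotas is straightforward in principle. A minimal sketch, assuming one stratification variable and illustrative census shares (the 51%/49% gender split cited above):

```python
def quota_targets(total_completes, census_shares):
    """Convert census percentages into per-stratum completion quotas.
    census_shares: e.g. {"female": 0.51, "male": 0.49} (illustrative)."""
    targets = {k: round(total_completes * share)
               for k, share in census_shares.items()}
    # Absorb any rounding drift so quotas sum to the requested total.
    drift = total_completes - sum(targets.values())
    if drift:
        largest = max(targets, key=targets.get)
        targets[largest] += drift
    return targets
```

For 1,000 completes this yields quotas of 510 and 490; fielding is then throttled per stratum so the final data set mirrors the census split rather than whoever happened to respond fastest. In practice providers stratify on several variables at once (age, region, income), but the principle is the same.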
- Data Quality Scoring System
Engagement levels greatly affect the percentage of suspect data collected. Respondents who have “checked out” and/or are providing dummy answers only to earn incentives cost real money on the back end. Scoring respondents prior to survey start can help lessen the likelihood of fraud. This type of engagement check can automatically remove potentially low-quality respondents in real time, ensuring that those who remain are attentive and providing quality opinions. It can also verify that each respondent’s profile and demographic information is up to date, which protects the census balancing mentioned above.
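A pre-survey score of this kind might gate entry as sketched below. The signals, weights and threshold here are assumptions for illustration only, not a published scoring formula.

```python
def engagement_score(profile):
    """Combine a few illustrative signals into a 0-100 pre-survey score.
    Signals and weights are assumptions, not a real provider's formula."""
    score = 100
    if profile.get("days_since_profile_update", 0) > 365:
        score -= 30   # stale demographics threaten census balancing
    if profile.get("recent_throwout_rate", 0.0) > 0.10:
        score -= 40   # history of discarded completes
    if profile.get("failed_trap_questions", 0) > 0:
        score -= 30   # missed attention checks in prior surveys
    return max(score, 0)

def admit(profile, threshold=70):
    """Admit a respondent to the survey only above the score threshold."""
    return engagement_score(profile) >= threshold
```

A respondent with a current profile and a clean history is admitted; one with a high recent throw-out rate and failed trap questions is screened out before costing anything on the back end.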
- Holistic Security
Today’s technology is advancing even as it is being implemented. Concurrently, even as solutions are being employed, security is being challenged by those wanting to break in. To successfully maintain data quality, a shared strategy is a worthwhile investment. Market researchers can and should invest in high-quality IT defenses, consistently testing and updating their systems. Clients can and should review data results in real-time to catch suspect activity. Validation/verification companies can and should adjust their platforms to protect against any noticeable breaches. Staying in front of security concerns is the only way to ensure that researchers stay in control of the data being collected.
- Consultative Relationships
A consultative business model, versus a traditional customer service one, can be a real quality asset. These partnerships allow sample providers to function as extensions of their clients’ teams. Data collectors intimate with their clients’ internal processes, pitfalls, RFP win/loss rates and budgetary needs offer concrete value. Providers able to anticipate challenges and understand feasibility help everyone win more business. That said, a consultative business approach hinges upon the business acumen of the providers involved. When asked, “Which areas of marketing research are you most frustrated with and why?”, respondents from the 2017 Quirk’s Salary Survey & Corporate Study said, “…market researchers usually lack business acumen to make meaningful recommendations,” “…hiring sales people who don’t understand data quality, sampling or weighting” and “Vendors who are not…able to translate results to address business questions.” Partnering with experienced researchers who also possess business intelligence is the difference here.
The points listed above are meant to spark more dialogue about quality in the online space. To continue the conversation, please contact Adam Weinstein at AdamW@iLoveFullCircle.com or 301-762-1972.