Sima listens in today as Steve Danziger interviews Vignesh Krishnan and Alex Leger about how technology shapes our approach to data quality and the respondent experience, and how that technology discussion sets our agenda going forward.
Steve Danziger is Executive Director of Paradigm Innovation Labs, which brings together technology and operations, among other avenues, throughout the company.
Vignesh Krishnan is the CEO and Founder of SampleChain, LLC, a software company focusing on quality and fraud management issues in market research.
Alex Leger is the SVP of Product at Market Cube, which helps out with technology, sample, and the like.
The Speed of Technological Advancement
The speed of our industry’s ability to adapt to advancements in technology is a lot quicker than we give it credit for.
Vignesh notes that as recently as six years ago, he recalls no companies doing API deliveries for his company. Today, over 98% of deliveries are made by API. He further observes that almost 100% of the companies with which he's familiar are at least investigating machine learning in some form, or at the least using a big database to manage high volumes of data.
Although not as quick to adapt as ag-tech or finance, Vignesh believes that our industry is its own worst critic, and is still ahead of other industries in the speed with which it embraces advancements. We may not have the innovation or technology budgets of the ag-tech and finance industries, but being aware of these factors can go a long way.
The Rise of Big Data and Online/Mobile Survey Data
As the industry evolves, Vignesh stresses that we must be cognizant of the fact that we are literally sitting on a gold mine. Other large data companies might be able to tell you what happened up until this moment, whereas we can ask intent questions, such as ‘What is this group of respondents going to do tomorrow?’ or ‘What is their sentiment, going into the next month or quarter?’
We continue to make these investments and ensure that we’re gleaning all the data as it’s coming through.
Industry models have lowered the barrier of entry into this space. We’re seeing more players interact and source respondents for this type of data collection. Steve feels that this puts more pressure on us to be more diligent about the impact that these sources can have on the quality of survey data.
8 Billion Dollars and Growing
Out-of-home advertising is growing! It is the only traditional channel that is on the upswing, and there are a few reasons for that. People are moving around urban environments in different ways than they have before, more people are living in big cities and taking mass transit, and they are being exposed to the channel more than ever before.
Data Quality and Different Types of Fraudsters
There are basically two different types of fraudsters, as Alex explains: those on the sample side and those on the client side. They can be broadly grouped into unorganized individuals, who may provide fake answers to the surveys they're taking, and organized humans, who may be operating out of other countries, behind VPNs, or going deeper into the tech side of things so they can cheat.
There is a third category that is not human at all: bots, machines doing the cheating. As Sima points out, the people who develop these bots benefit from inside knowledge of how surveys are populated.
From the client perspective, it’s just bad data that is produced in the system.
Mitigating Duplication & Location – A Look Back
Using CAPTCHA’S when building a survey program makes it very difficult for a bot to navigate, but it seems that the number, letters, and pictures are getting more difficult for even us humans to discern.
Browser-based validation checks have been an industry standard. When a user accesses a website, their browser sends across a lot of information; a digital fingerprint is built from that data. Alex explains how this is usable.
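To make the idea concrete, here is a minimal sketch of how browser-reported attributes might be combined into a fingerprint for duplicate detection. The attribute names are illustrative assumptions, not any particular vendor's implementation; real fingerprinting systems collect many more signals (fonts, canvas rendering, plugins, and so on).

```python
import hashlib

def digital_fingerprint(request_info: dict) -> str:
    """Hash a set of browser-reported attributes into a stable fingerprint.

    The fields below are illustrative; production systems use far more
    signals to make the fingerprint harder to spoof.
    """
    signals = [
        request_info.get("user_agent", ""),
        request_info.get("accept_language", ""),
        request_info.get("screen_resolution", ""),
        request_info.get("timezone", ""),
    ]
    raw = "|".join(signals)
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

# Two sessions reporting identical signals produce the same fingerprint,
# which flags a likely duplicate respondent.
a = digital_fingerprint({"user_agent": "Mozilla/5.0",
                         "accept_language": "en-US",
                         "screen_resolution": "1920x1080",
                         "timezone": "UTC-5"})
b = digital_fingerprint({"user_agent": "Mozilla/5.0",
                         "accept_language": "en-US",
                         "screen_resolution": "1920x1080",
                         "timezone": "UTC-5"})
print(a == b)  # True -> treat as a potential duplicate
```

A matching fingerprint is a signal rather than proof: distinct respondents behind the same corporate network can look alike, which is why fingerprints are combined with other checks.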
Future Industry Methods
New techniques being employed to mitigate fraud build on existing checks and borrow ideas from the ag-tech and finance industries. Many of those companies now use behavioral metrics, normative metrics, and longitudinal metrics.
Vignesh finds that a lot of companies in our space are using reconciliation rates and purge rates, making sure that none of that information is lost and that it is funneled back into their algorithms. This ensures that all of the data sets that exist are taken into account, before, during, and after the survey experience, and tied together.
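The per-source metrics described above can be sketched in a few lines. This is an illustrative assumption about how such rates might be computed and compared across sources, not any specific company's algorithm; the source names and counts are made up.

```python
def reconciliation_rate(delivered: int, reconciled: int) -> float:
    """Share of delivered completes the client later rejected."""
    return reconciled / delivered if delivered else 0.0

def purge_rate(delivered: int, purged: int) -> float:
    """Share of delivered completes removed internally for quality reasons."""
    return purged / delivered if delivered else 0.0

# Hypothetical per-source counts; feeding these rates back into sourcing
# decisions is the "funnel back into the algorithms" step.
sources = {
    "source_a": {"delivered": 1000, "reconciled": 40, "purged": 25},
    "source_b": {"delivered": 800, "reconciled": 120, "purged": 90},
}

for name, s in sources.items():
    r = reconciliation_rate(s["delivered"], s["reconciled"])
    p = purge_rate(s["delivered"], s["purged"])
    print(f"{name}: reconciliation={r:.1%}, purge={p:.1%}")
```

A source whose rates trend upward over time would be deprioritized or investigated, which is what tying the before, during, and after data sets together makes possible.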
Where the Human Element Plays In
Humans are great at pattern-matching and at finding distinctions. But even as machine learning gets better, we will still have to use our street intelligence.
Quick links to connect with Steve Danziger: