617.826.9056 (US) Info@HampsteadSolutions.com

In a recent interview, I was asked if I’d ever witnessed a situation where customer data was used in an invasive manner.

My response was bigger than the answer I provided.

We all hear about Big Data. Data is collected from digital interactions and increasingly, with the Internet of Things (IoT), from real life interactions as well. Throw artificial intelligence (AI) into the mix, and suddenly we could be collections of open secrets.

Yes, the variety and velocity of data have changed. However, highly personal information has been available for decades.


Not Big but Still Effective Data

There was a time when companies could purchase the data provided on a mortgage application. Just think about the scope of what is disclosed: income including source(s), net worth, assets, debt, family members, marital status, on and on.

Some of you may remember warranty cards that surveyed your hobbies. Did you wonder what registering your new toaster had to do with an interest in motorsports? Data! Responses from those cards were collected, aggregated, and sold to marketers.

Marketers turned that data into intelligence and extended it into targeted messages. If a marketer knew, for example, that someone’s income was above a certain level and that they had an interest in motorcycles, they would market a Ducati to them.

Big data isn’t necessary for poor, unethical decisions to drive invasive messaging.

The point is that sensitive consumer data has been available for decades. Admittedly, it came more slowly, and was less big.


Same Intrusion: Data Sized S-XL

With the long availability of data came the opportunity for abuse and intrusion.

In my interview with Matthew Fox, I gave an example of a national retailer that developed a model to predict pregnancy. There is great revenue potential at this family milestone and the companies that message first typically capture a significant proportion of that revenue.


The retailer intended to get a jump on marketing maternity items before women disclosed their condition either in a gift registry or by purchasing in that category.

The retailer’s marketing included teenagers, which is how the practice made the news: the retailer inadvertently forced the disclosure of a teenager’s condition to her distraught parents.

The retailer developed a successful model. However, their failures overwhelmed that success.

They failed to consider whether to market maternity products to a woman who had not disclosed her condition. They failed to consider the ethics of marketing maternity products to teenagers. They failed to consider all the scenarios that could make the condition sensitive—or even that the woman should choose who knows.

They failed to consider that some things are simply private.

They put their first bite at the revenue apple before all ethical considerations.

However, the retailer’s predictive model was just a new technique applied to an old practice: physicians and hospitals would routinely sell “New Mothers” lists. The retailer was simply trying to leapfrog physicians’ disclosure of their unsuspecting patients’ condition.

Intrusion occurred in the retailer example, and with the “New Mothers” lists in the years preceding it.

Big data wasn’t necessary for poor, unethical decisions to drive intrusive messaging.


The Call is Coming from Inside the House

Historically, descriptive data (income, profession, interests, etc.) was collected. What has opened the big data floodgates is behavioral data. What we interact with, for how long, and in what sequence is captured digitally and, increasingly, in real life.

Beyond size, and variety, and velocity, there is another major difference from the data generated decades ago: we actively contribute now.

Consider all the big data sources that we willingly bring into our lives: social media (Facebook, LinkedIn), intelligent assistants (Alexa, Siri, Cortana), smart home technology (Nest thermostat, Ring doorbell, Numi toilet), wearables (Apple Watch, Fitbit). The list only gets bigger and broader.


We purchase those products or socialize on those sites and enlarge our data contribution.

Social media platforms and every one of those products are intentionally built to capture data specific to each of us, forward it, and aggregate it into a whopping big data set.

That data can be turned into intelligence, minute or sweeping, about individuals, groups, or the whole.

Where descriptive data let us hide behind facades (urban professional, new parent, snowbird, etc.), behavioral data is often more intimate because it describes our individual choices and what we actually do.

A benign example: I called my cable provider to discuss my bill. The rep off-handedly suggested, “Why don’t you cancel the premium channels? You never watch them.”

That simple, non-scandalous example felt extremely intrusive. I envisioned them watching me watch tv—because from a data perspective they do.

And so do any number of other devices that promise to get smarter by tracking and recording what we do.

It’s worth remembering that the next time you clear your search history, it’s still on your permanent record.


It’s a Judgment Call

Whether stored on floppy disks, zip drives, mainframes, or in the cloud, data has been available for decades. Ideally, that data is used to generate customer intelligence that is then applied to create relevant, helpful messages.

For decades, and perhaps less visibly, we have relied on the ethical judgment and common decency of professionals who handle our data. We have relied on those professionals to reflect that they know us—but without making us feel like they’re peeking through the curtains.

Although it is difficult to quantify, I am certain that intrusive messages drive up customer attrition and drive down revenue. In the earlier examples, rather than creating a first-mover opportunity, the marketing must have left a legion of families feeling that companies had violated their relationships for the sake of quick cash.

Whether data is applied as the rich foundation to improve customer experience or misused to create intrusion is reliant on the ethical compass of the professionals involved.

However, we as consumers can apply our judgment also. By selecting the devices and platforms that we allow to track and record our activities, we can limit unwanted intrusion. By taking a close look at our settings, we can refuse to share what we ordinarily wouldn’t.

Unlike the data that we unwittingly contributed historically, now we can throttle what we contribute to the larger set of big data.


Would you like to discuss customer acquisition, growth, or retention challenges? Or is data managing you? Let’s talk! Set up a 30-minute phone conversation with Marina.

Matthew Fox strives to help entrepreneurs and small companies simplify their messaging. Whether it is written copy or a full-blown presentation, Matthew provides practiced strategy, marketing, and project management services. You can find him through Point A to Point B Transitions or on stage at locations across the Midwest.


Photo credit: Craig Whitehead.