Seen any personalized ads lately?
Of course you have.
Geoffrey James explains:
“Unless you’ve been living under a rock on Mars, you know that your activity on (and off) the web is constantly monitored and evaluated so that applications and businesses can create a ‘personalized experience’ for you.
“That’s why you keep seeing online ads that match whatever you might have searched for in the past few days.”
How do consumers feel about that?
Well, they’re conflicted, to put it mildly.
On the one hand, a lot of people find personalized ads a bit “creepy.” And more than a few of them tell their friends about those creepy experiences:
“A new study by customer experience (CX) analytics firm InMoment found that at least 75 percent of consumers surveyed think most forms of ad personalization are at least ‘somewhat creepy.’
“And consumers don’t keep this information to themselves: one in five respondents tells friends about marketing experiences that they consider creepy, and one in 10 shares ‘Big Brother-type experiences’ on social media…”
And then there’s the “ethics factor”:
According to a survey conducted by cybersecurity firm RSA, most people (83% of respondents) don’t think it’s ethical for companies to use their personal information for targeted ads.
Another finding from the RSA survey: when a company gets hacked, consumers blame the company, not the hacker.
So that must mean consumers don’t like personalized ads, right?
Wrong.
A study by digital marketing performance agency Adlucent found that 71% of respondents prefer ads tailored to their personal interests and shopping habits.
They see it as a way to discover new products, while reducing the number of irrelevant ads they’re subjected to.
Another finding from the same study:
“Personalized ads also boost engagement. We found that people were almost twice as likely to click through an ad featuring an unknown brand if the ad was tailored to their preferences.”
The conflict about personalization comes down to this: people appreciate the convenience of targeted ads, but they don’t want personalized services at the expense of their privacy.
The key to solving this dilemma for brands: transparency. Telling consumers yes, we’re tracking you, here’s the personal information we’re using, and here’s where we get it.
A Harvard Business Review study found that when brands use highly sensitive data to track customers (sexual orientation, political beliefs, certain search topics), it makes people uncomfortable, as if they’re being spied on.
However, they’re more willing to tolerate the use of general information: things like their name and shopping history.
The HBR study offered 5 suggestions for digital marketers who want to succeed with targeted ads:
1. Stay away from sensitive information (race, sexual orientation, medical conditions)
2. Commit to transparency (at least be willing to provide information about data-use practices upon request)
3. Exercise restraint (you lose customers when ads feel intrusive or inappropriate)
4. Explain why you collect data (“it will help us generate more appropriate and useful ads”)
5. Use traditional data collection, too (give people a chance to directly state their preferences)
Bottom line: to maintain trust, brands have to respect the customer and the customer’s privacy.
Harvard Business Review puts it this way:
“There’s still a lot we don’t know about how people respond to online data collection and ad targeting, and norms around privacy may change over time as young digital natives become consumers and technology further penetrates our lives….
“In the end, all ad targeting should be customer-centric—in the service of creating value for consumers.”
Illustration by Mark Armstrong.
Originally published on Mark Armstrong Illustration.
Digital & Social Articles on Business 2 Community
(37)