Today, digital identity is so much more than just a username and password. It combines data on who we are, our history, our interests and our preferences. Put all of these attributes together and a remarkably accurate and traceable picture emerges of how we live, work and socialize. That identity data is one of the greatest assets any organization can have.
For example, a retailer will have knowledge of its customers’ shopping baskets, where they shop, their hobbies, family needs and more. A mobile provider will know where customers roam, who they call and who they connect with. And financial institutions will know who consumers transact with, their credit scores and profiles.
At the end of the day, we all leave a “digital exhaust” behind with every movement, transaction and communication, and organizations want to put it to use. By understanding and analyzing data on customers’ needs, preferences and behavior, they can target audiences more easily with compelling offers, deliver a more personalized service and, in the case of government, provide more efficient, joined-up public services.
But there’s a catch. Personal data may be the new currency in the digital market, but like any currency it has to be stable and it has to be trustworthy. And that trust is often compromised.
We are seeing more and more consumers’ personal data fall victim to accidental loss and malicious misuse. This can result in legal action, substantial fines, customer churn and damage to a brand’s reputation. Everyone is familiar with the tale of the young girl who shopped at Target, whose purchases led the company to infer that she was pregnant even before her family was aware. That was followed last year by the story that, if a customer enabled voice commands on Samsung’s Smart TV, the TV could in fact listen to all conversations in its vicinity, capture that data and transmit it to a third party. Stories like these are the tip of a vast iceberg of data leaks, all of which have had severe consequences for the brands involved.
Balancing insight against privacy is tricky
Society as a whole is increasingly struggling with what consumer privacy really entails. “The Future of Privacy,” a December 2014 report by the Pew Research Center, found both discrepancies in participants’ definitions of privacy and disagreement about whether such a thing as a privacy infrastructure would really exist by 2025. Some even argued that privacy would become a “luxury,” accessible only to those who could afford to pay more to keep their personal information private.
At the same time, the very definition of privacy is being questioned as technology far outpaces our ability to predict the implications it creates. Social data generated on Facebook, Twitter, Instagram, Pinterest and YouTube, for example, is feeding the most extensive, complex and personal data sources in the digital universe.
Social platforms offer differing levels of control, from public platforms like WordPress, Tumblr and Yik Yak to private messaging services such as Snapchat and Path. Sina Weibo, VKontakte, Line and others are structured differently in terms of features and privacy controls, based on local legal, political and cultural factors.
So what’s the point? There are no set rules for data privacy, and there likely never will be. However, brands need to look closely at how they use data strategically and assess what the likely impacts may be—not only on short-term revenue gain, but also on long-term trusted relationships. Maintaining that balance, in which both businesses and their customers thrive, should be the guiding principle of any brand.
Note: A similar version of this blog post appeared on DataSift’s blog at blog.datasift.com in Nov. 2015.