New Rules, Tools For Managing Information In The Digital Age, Part III
Whether driven by Facebook itself or government, clamp-down regulation isn’t the answer, as it tends to have negative unintended consequences.
For example, in a well-intended effort to protect privacy, Facebook has now restricted companies from targeting ads to users based on medical conditions and personal characteristics. Sounds good at first, doesn’t it? After all, one’s medical condition is deeply personal. Except this now means that people with assorted medical conditions no longer get the benefit of treatment information they want and need.
The solution to all this is to support, not curtail, the expansion and use of social media for all its positive benefits — and to put in place tools, rules, and aggressive efforts to motivate and educate users to proactively manage the information they share and how it is used.
Five pillars form the solution: Transparency, Line-item Opt-in, Always Optional Opt-out, Accountability, and Education. The industry — in particular Facebook and Google — is taking steps on these. But they, the government, and users all need to do more.
1) Transparency
Transparency is key to managing the velocity and range of data coming at us, so that we are not subjected to the false information, manipulation, and other abuses of the past few years, or at least have the context with which to manage and mitigate them. Transparency means knowing from whom and where the information we see is coming, and what other information that source is presenting to whom.
The platform companies need to empower users to know what they are looking at, create a structural context to evaluate it better, and then hold bad actors accountable. Platforms should be designed so that any everyday user can drill down and across to figure out what is going on and make critical judgements.
Every advertiser and publisher, of any size, on a platform of any size, needs to prove who they are, with a government ID and a bank account or credit card information. Each has to have a registered page on the platform. This is all so we can really tell who is behind these ads and what they are really up to.
Every ad and every piece of published content has to have an obvious icon and link back to the registered page of its producer. This way a user can also see what else a given advertiser is saying to other people and the history behind it.
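The attribution scheme above — a verified identity behind every ad, with each ad linking back to its producer's registered page — can be sketched as a simple data model. This is a hypothetical illustration; the class names, fields, and URL are assumptions, not an actual platform API.

```python
from dataclasses import dataclass

@dataclass
class RegisteredPublisher:
    """Hypothetical verified identity behind every ad or post."""
    publisher_id: str
    legal_name: str         # matched against a government ID at registration
    payment_verified: bool  # bank account or credit card on file
    page_url: str           # public registered page on the platform

@dataclass
class Ad:
    ad_id: str
    content: str
    publisher: RegisteredPublisher  # every ad carries its producer

def attribution_link(ad: Ad) -> str:
    """The icon on each ad would resolve to the producer's registered page,
    where a user can see what else that advertiser is saying and to whom."""
    return ad.publisher.page_url
```

The key design point is that the publisher reference is required, not optional: an ad cannot exist in this model without a verified, linkable identity behind it.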
2) Line-item Opt-in
Social networks have the right to require your data for you to use their services. The network just has to be clear about it. The users have to have the right to walk away with their data at any time. Under the European Union's new GDPR (General Data Protection Regulation), providers have to give users an explicit point of opt-in before collecting their information.
This is the right idea, but given the depth of what can be collected and the user propensity to skip reading the terms of service, we need more than that. There should be an explicit Line-item Opt-in. For example, a box should be required for each of name, email address, age, gender, race, religion, medical conditions, interests, conversations, and so on. This way, users consciously decide what they are willing to turn over.
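A line-item opt-in amounts to a per-category consent record that defaults to "no" for everything. The following is a minimal sketch of that idea; the category names come from the list above, but the function names and structure are hypothetical.

```python
# Hypothetical consent record: each data category is its own opt-in,
# defaulting to False so nothing is collected without an explicit choice.
CATEGORIES = ["name", "email", "age", "gender", "race",
              "religion", "medical_conditions", "interests", "conversations"]

def new_consent() -> dict:
    """A fresh user starts with every box unticked."""
    return {category: False for category in CATEGORIES}

def collectable(consent: dict) -> list:
    """Only categories the user has explicitly opted into may be collected."""
    return [c for c, opted_in in consent.items() if opted_in]

consent = new_consent()
consent["email"] = True      # user ticks exactly the boxes they choose
consent["interests"] = True
```

The point of the default-to-False design is that silence never counts as consent: skipping the terms of service leaves every category off.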
3) Always Optional Opt-out
GDPR also requires a service provider to offer an Opt-out, sometimes referred to as "delete everything you have on me" or a "right to be forgotten." But again, just having this is not explicit enough. People simply forget what they have signed up for, so there should be not just a mechanism to opt out, but one that is explicitly always available. To ensure this, the rules should be that the service provider has to provide an annual opt-in renewal to users, unless the user explicitly agrees to automatic renewal and gets an annual reminder that they have done so.
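The annual-renewal rule above reduces to a simple check: a year after the last explicit opt-in, either re-ask the user, or, if they chose automatic renewal, send a reminder instead. This is a hypothetical sketch; the status names and 365-day period are assumptions.

```python
from datetime import date, timedelta

RENEWAL_PERIOD = timedelta(days=365)  # assumed annual cycle

def consent_status(last_opt_in: date, auto_renew: bool, today: date) -> str:
    """Within a year of the last explicit opt-in, consent is active.
    After that, require a fresh opt-in, unless the user chose automatic
    renewal, in which case send the annual reminder instead."""
    if today - last_opt_in < RENEWAL_PERIOD:
        return "active"
    return "send_reminder" if auto_renew else "renewal_required"
```

Under this rule, consent can never silently persist forever: every year the user either re-confirms or is at least reminded that they previously agreed to automatic renewal.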
4) Accountability
Here is where policies and government come into play. The social networks must create the tools and set up the policies as we have described. They must also empower their users to help monitor for abuse and provide them with ways to easily report it. Then the social networks have to be accountable for reviewing such reports and escalating illegal activity to government agencies.
We don’t want the government to specifically regulate what a company can collect or how long it can keep it, or whom it can target. This leads to negative unintended consequences. However, the government should require the social networks to have the proper transparency, tools, and policies, including monitoring and reporting illegal actions by third parties and users.
Ultimately the most important and most critical accountability need is for us — all of us who use these services — to go beyond just passively receiving information and believing it, to leveraging the power of these platforms to better review, question, understand, and make judgements about the information. This applies to both the information we receive and the information we hand over about ourselves.
5) Education
We need to educate ourselves, especially our children, with the knowledge and skills for this new world. It has to be considered as important as teaching our children basic social skills. Keep in mind that they live in this world from the moment you let them touch an iPhone or iPad. In schools, this means a required curriculum for critical thinking, personal information management, and high-velocity content review and navigation. The social networks must help by creating education and training materials.
Every action by the social networks, and every recommendation we are making, will go nowhere if as a society we don’t also hold ourselves accountable for stepping up with education, motivation, and fundamental acceptance of responsibility for the experience we have.
Yes — in the digital age, privacy is dead and has been for many years. But we can move beyond that to better lives by ushering in an era of proactive and responsible management of the personal information we give and the non-stop information we receive.