A showdown between an ad tech firm and the FTC will test the limits of U.S. privacy law

September 08, 2022

 

Who gets to know where you are? The question is at the heart of a new lawsuit by the U.S. Federal Trade Commission against a little-known Idaho-based ad tech company, Kochava, which regulators accuse of unfairly selling the timestamped location information of millions of Americans. But before government lawyers could file suit last week, the company sued the FTC, calling the agency’s structure unconstitutional and claiming the company broke no laws. The back-and-forth has set up a showdown that could shape the future of our location privacy.

 


 

Kochava helps marketers measure ad performance and runs a data exchange that sells location data to other ad tech companies. The FTC, which is responsible for protecting consumers, says the data, easily de-anonymized, could allow unscrupulous actors to track people’s mobile phones to places of worship, military bases, and other sensitive locations like “therapists’ offices, addiction recovery centers, medical facilities, and women’s reproductive health clinics.” The agency says that until recently it didn’t take much effort to obtain a data sample from Kochava containing more than 61 million unique mobile identifiers. Regulators say this data could put people at risk of “stigma, discrimination, physical violence, emotional distress, and other harms,” and ask that the company stop selling the data and delete what it still has.

The lawsuit, filed on August 29, is part of a broader push for stronger limits on personal data under FTC Chair Lina Khan. In July, the agency said it would crack down on the illegal sharing of health data in the wake of the Supreme Court’s overturning of Roe v. Wade and an executive order by President Biden to protect reproductive rights. Last month it began seeking public comment on a set of new rules on “commercial surveillance.” (The agency will hold a public livestreamed forum on the issue today.) The rules could ultimately be superseded by a new bipartisan federal privacy bill, the American Data Privacy and Protection Act (ADPPA), if Congress were to pass it, but Khan has called the rulemaking an important step if that doesn’t happen.

“The growing digitization of our economy—coupled with business models that can incentivize endless hoovering up of sensitive user data and a vast expansion of how this data is used—means that potentially unlawful practices may be prevalent,” she said in a statement last month.

 

A successful case against Kochava could establish a legal basis for new privacy rules, but the suit has already sent a shot across the bow of the location data industry. Kochava is only one of hundreds of players in the data ecosystem, including phone carriers, tech companies, marketers and data brokers. Location data has been used to arrest and prosecute people—including for getting abortions—and to stalk people, reveal their sexual orientation, illegally surveil them, or even target them with drones.

Operating in a legal gray area and largely in secrecy, U.S. government agencies have also been buying up piles of Americans’ location data, mostly from the same data ecosystem where Kochava and other ad tech firms operate. An investigation by the Electronic Frontier Foundation, published last week, showed how a company called Fog Data Science is also selling this data to local police, allowing law enforcement in effect to circumvent constitutional protections against unlawful searches.

 

In its complaint against the FTC, Kochava denies any wrongdoing, but it also goes a step further, accusing the agency itself of violating the Constitution. In a challenge filed last month in U.S. District Court, Kochava says that, in the absence of a federal privacy law, the FTC is trying to enforce vague rules beyond its purview.

“The FTC’s hope was to get a small, bootstrapped company to agree to a settlement—with the effect of setting precedent across the ad tech industry and using that precedent to usurp the established process of Congress creating law,” a spokesperson for Kochava said in an email.

Notably, while the company denies any wrongdoing, it doesn’t substantially dispute the FTC’s basic facts about its sale of geodata. Indeed, in its marketing materials and its own legal filing, it largely corroborates its location tracking capabilities, like its lucrative business of estimating for advertisers whether someone has seen a real-world billboard.

 

“Kochava isn’t particularly unique or egregious, but fairly representative of the ad tech industry,” says Serge Egelman, a researcher at UC Berkeley’s International Computer Science Institute and CTO of AppCensus.

What Kochava is doing with Americans’ data—and what it says it’s doing

How does this location data market work? Once you start using an app or website that contains trackers from one of Kochava’s partners—and once you agree to provide your location data, so you can use the site’s map function for instance—you might reasonably infer that the data is being used to provide that functionality. But in the data ecosystem where Kochava makes its money, there are few guarantees about what will actually happen with your personal information.

Ad tech firms design and sell software development kits, or SDKs, that track and analyze data about an app’s users. If you’re an app maker, an ad tech company may pay you to use its SDK in order to get access to your users’ location and other data, which it can then sell. Kochava runs Kochava Collective, an exchange where it and other firms sell data about app and web users, which it calls the “largest independent data marketplace for connected devices.” The FTC said that as part of its premium license, Kochava was selling precise location data, and in some cases giving it away. (Kochava has an SDK of its own but emphasizes that its location data is sourced from unnamed “third party partners.”)
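To make the mechanics concrete, here is a minimal sketch, in Python, of the kind of timestamped record an analytics SDK might generate and a data marketplace might resell. The field names and the send_to_exchange function are illustrative assumptions, not Kochava’s actual schema or API.

```python
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class LocationPing:
    # Hypothetical fields; real ad tech schemas vary by vendor.
    maid: str        # mobile advertising ID (IDFA/GAID), a persistent device identifier
    latitude: float  # precise coordinates, often given to 5+ decimal places (~1 meter)
    longitude: float
    timestamp: int   # Unix epoch seconds
    app_id: str      # the app that embedded the SDK

def send_to_exchange(ping: LocationPing) -> None:
    """Illustrative stand-in for an SDK uploading a ping to a data marketplace."""
    payload = json.dumps(asdict(ping))
    print("POST /ingest", payload)  # a real SDK would send this over HTTPS

# Example: a single ping generated while a user has an app open.
send_to_exchange(LocationPing(
    maid="38400000-8cf0-11bd-b23e-10b96e40000d",
    latitude=43.61871,
    longitude=-116.21462,
    timestamp=int(time.time()),
    app_id="com.example.weather",
))
```

The point of the sketch is how little each record needs to contain: a persistent device ID, precise coordinates, and a timestamp are enough for a buyer to reconstruct a person’s movements over time.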

 

“While many consumers may happily offer their location data in exchange for real-time crowd-sourced advice on the fastest route home,” the FTC wrote in a blog post this year, “they likely think differently about having their thinly-disguised online identity associated with the frequency of their visits to a therapist or cancer doctor.”

Kochava’s arguments in its own defense also echo longstanding privacy-washing claims by data brokers, says Egelman. “There are two lies that they continue to tell, which are that data is collected with consumer consent and that it is anonymous,” he says. “And the latter is contradicted by the company’s public-facing marketing materials.”

Still, in its lawsuit, Kochava argues that it’s done nothing wrong, because it contends the data is collected consensually, and because it’s “anonymized.” Since the company doesn’t identify the name of the consumer or of the location associated with the latitude and longitude, its lawsuit explains, “Kochava does not collect, then subsequently sell data compilation that allows one to track a specific individual to a specific location.”

 

But these are dubious arguments. While anonymization has a clear meaning in certain contexts like health data, the idea is far more nebulous, and often meaningless, in the world of commercial data. Security researchers have repeatedly shown that supposedly anonymized data can be re-identified: stripping out names or scrambling identifiers does not stop records from being linked back to individuals once they are combined with other data.

Location data, even without personal identities attached, is especially easy to deanonymize because movement patterns are so specific: where a device spends its nights and workdays typically points to a single person. In its suit, the FTC explains that it was able to use Kochava’s data to trace a mobile device from a women’s reproductive health clinic to a single-family residence, likely making it possible to identify its owner.

For that reason, any given set of a person’s location data, even without identifiers and without the geo-coordinates of a doctor’s clinic or a therapist’s office, could be considered sensitive. Once it’s combined with other personal information, like data from health apps, location data, says the FTC, “creates a new frontier of potential harm to consumers.”
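To see why, consider a minimal sketch, using a handful of made-up pings, of how a buyer could infer a home address from “anonymous” records: the most common overnight location for a device usually points to where its owner sleeps. The data and the likely_home helper below are hypothetical, not a description of Kochava’s or the FTC’s actual analysis.

```python
from collections import Counter
from datetime import datetime, timezone

# Hypothetical pings for one "anonymous" device: (unix_timestamp, lat, lon).
pings = [
    (1661781600, 43.61871, -116.21462),  # midday, near a clinic
    (1661803200, 43.64903, -116.23412),  # evening
    (1661824800, 43.64905, -116.23410),  # overnight
    (1661828400, 43.64901, -116.23415),  # overnight
    (1661913000, 43.64904, -116.23411),  # overnight, two days later
]

def likely_home(pings, night_hours=range(0, 6)):
    """Guess a device's home: the most common coarse location during overnight hours."""
    overnight = Counter()
    for ts, lat, lon in pings:
        hour = datetime.fromtimestamp(ts, tz=timezone.utc).hour
        if hour in night_hours:
            # Round to ~3 decimal places (roughly 100 meters) to group nearby pings.
            overnight[(round(lat, 3), round(lon, 3))] += 1
    return overnight.most_common(1)[0][0] if overnight else None

print(likely_home(pings))  # -> (43.649, -116.234): a residential block, with no name needed
```

Cross-referencing that coordinate with property records or other data would then typically yield a household, which is roughly the kind of linkage the FTC describes in its complaint.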

 

Additionally, Kochava argues that “even if an injury to the consumer did indeed occur,” it would be consensual, because users agreed to the “terms of service” of its partners’ apps. The company says it requires its partners to inform users that they collect and may sell personal information, geolocation information, and inferences drawn from that data.

“In other words, the consumer agreed to share its location data with an app developer,” says Kochava in its complaint. “As such, the consumer should reasonably expect that this data will contain the consumer’s locations, even locations which the consumer deems is sensitive.”

But this is hardly reasonable. Consider the privacy notice at the bottom of Kochava’s own website, which asks for permission to do things like “use precise geolocation data,” “actively scan device characteristics for identification,” “match and combine offline data sources,” and “link different devices”—all done, it says repeatedly and uselessly, “in support of one or more of purposes.” But it’s impossible to consent to those purposes when we don’t even know what they are.

 

Finally, the company contends that even though it sells highly precise location data, it did nothing illegal, because the FTC has not provided specific definitions of what “unfair” or “deceptive” means when it comes to “sensitive” data.

“FTC still has yet to issue any rule or statement with legal force and effect describing the specific geolocation data practices it believes [the FTC Act] prohibits or permits,” the suit says.

Still, even as it contended it did nothing wrong, Kochava also said it was taking action to limit the use of certain location data. In recent weeks the company said it was implementing a “privacy protecting” feature that seeks to avoid connecting sensitive location data to other individual data, seemingly acknowledging how sensitive the data is.

 

What precedent says

The FTC has been taking a more assertive approach to privacy violations it considers unfair or deceptive. In 2021 it reached settlements with the period-tracking app Flo and the ad platform OpenX over violating user privacy; in April the agency reached a settlement with CafePress, the online T-shirt maker, over its poor data security practices. In March the FTC reached a settlement with weight loss company WW International (formerly Weight Watchers) that requires the company to pay a $1.5 million penalty, delete the personal data of children under the age of 13 that was allegedly obtained unlawfully, and delete any algorithms or models derived from that data. (Children’s personal information is protected by the Children’s Online Privacy Protection Act, or COPPA.) The FTC first demanded algorithm destruction in a January 2021 settlement with EverAlbum, which, NBC News reported in 2019, used users’ photos to develop its facial recognition software without their consent or knowledge.

Still, it’s not clear that the FTC even has the authority to impose data privacy rules without a federal law that specifies that power. (COPPA does, but it applies only to data collected about children under 13.) The recent Supreme Court decision in West Virginia v. Environmental Protection Agency, which limited the government’s ability to regulate power plant CO2 emissions, held that regulators could not issue rules on “major questions” affecting “a significant portion of the American economy” without “clear congressional authorization.”

 

Companies sanctioned by the FTC have also successfully challenged its authority in recent years. In 2018, an appeals court held that the FTC’s order mandating an overhaul of data security at lab testing company LabMD was too broad to be enforceable. Last year’s Supreme Court ruling in AMG Capital Management v. FTC limited the agency’s ability to seek equitable monetary relief in Section 5 cases. Commissioner Rebecca Kelly Slaughter, the acting chairwoman at the time, said the Court had “deprived the FTC of the strongest tool we had to help consumers,” in effect benefitting “scam artists and dishonest corporations.”

Kochava: The FTC is violating the Constitution

The thrust of Kochava’s suit is that federal courts should review constitutional questions about FTC enforcement actions before the agency’s administrative proceedings run their course; those proceedings, the company says, involve “years of protracted litigation” and injunctive relief, like ordering the deletion of data or prohibiting its sale, causing it “irreparable and significant harm.” In effect, the company’s lawyers contend that the FTC is violating the company’s right to due process.

Kochava is challenging the FTC’s enforcement authority on constitutional grounds. In its complaint, it cites a pending Supreme Court case, Axon Enterprise v. FTC, brought by the police supplier Axon, which is seeking to have lower courts review the company’s own questions about the FTC’s constitutionality before the agency’s enforcement against the company continues.

 

A win for Kochava could put a damper on location privacy and on the FTC’s future enforcement actions. But it could also put more pressure on Congress to pass new federal regulations under the proposed American Data Privacy and Protection Act, which would give the FTC expanded regulatory powers.

The FTC faces an uphill battle, in part because it must establish actual harm or the likelihood of substantial harm, not just speculative injury, says Megan Gray, CEO of GrayMatters Law and Policy. “But an unsuccessful case could also push Congress to pass a privacy law,” she says.

Whatever the outcome, Kochava’s gambit and the FTC’s suit show that, despite efforts by companies and reporters and lawmakers—and even after you ask apps not to track you—an industry that promises to respect your privacy is still collecting as much data from us as it possibly can. And as they argue against the rule-making authority of regulators, companies like Kochava show why some rules are so badly needed.
