The Sad Truth of the FTC's Location Data Privacy Settlement

The FTC forced a data broker to stop selling “sensitive location data.” But most companies can avoid such scrutiny by doing the bare minimum, exposing the lack of protections Americans truly have.

Wired

The US Federal Trade Commission (FTC) reached a settlement last week with an American data broker known to sell location data gathered from hundreds of phone apps to the US government, among others. According to the agency, the company ignored in some cases the requests of consumers not to do so, and more broadly failed to ensure that users were notified of how their harvested data would be used.

News that the settlement requires the company, formerly known as X-Mode, to stop selling people’s “sensitive location data” was met with praise from politicians calling the outcome “historic” and reporters who deemed the settlement a “landmark” win for the American consumer. This “major privacy win,” as one outlet put it, will further require the company, rebranded as Outlogic after its activities were exposed, to delete all of the data it has illicitly gathered so far.

Outlogic, for its part, offered a drastically different take, denying any wrongdoing and vowing that the FTC order would “not require any significant changes” to its practices or products. While the company is potentially downplaying the cost to its business, it is certainly true that any ripples from the settlement will be imperceptible to consumers and Outlogic’s industry at large—one which profits by selling Americans’ secrets to spy agencies, police, and the US military, helping the government to dodge the supervision of the courts and all its pesky warrant requirements.

The FTC’s crackdown on X-Mode’s activities may indeed be historic, but from a consumer standpoint, it’s for all the wrong reasons. First, it’s important to understand that the order concerns what the FTC is calling “sensitive location data,” a term of art that manages to be misleading and redundant at the same time. Any data that exhaustively chronicles a person’s physical presence—every moment of every day—is inherently sensitive.

There is no question that persistently tracking people’s whereabouts reveals political, religious, and even sexual associations. The act of collecting this data is a sweeping form of surveillance no matter the target. While it is easier, perhaps, to imagine how guests of “medical and reproductive health clinics, places of religious worship and domestic abuse shelters” are especially vulnerable to commercial forms of stalking, there are myriad ways in which people’s whereabouts, once exposed, can endanger or ruin their lives.

Location data is inherently sensitive—so says society, an overwhelming consensus of privacy experts, and the highest court in the land.

One need only look to Congress to understand the level of fear that this precise form of surveillance inspires in those who’ve never been battered, stalked, or unhoused. Members of the House Intelligence Committee—most of whom lack an internal reproductive system—are pushing at this very moment to shield federal lawmakers alone from this invasive and omnipresent tracking.

Given the current political climate, it’s not hard to imagine why politicians are afraid of surrendering their location data, leaving it accessible to virtually anyone on the cheap. But they are relatively few in number, and hardly any of them fall into the category of “most at-risk” for violence or discrimination. Unlike those who do, members of Congress have the unique power to change the law and protect themselves. Given the opportunity, that’s precisely what many have opted to do—just as they did a year earlier for federal judges.

Protection against persistent tracking is something we can definitively say is being sought after by a privileged few. But when it comes to protecting anyone other than themselves, lawmakers cite law enforcement needs and nebulous national security concerns, insisting the matter demands more discussion, more caution, and more delay.

If America is creating a new protected class to shield federal lawmakers alone from surveillance, then it must accept that, unlike the vast majority of citizens who’ve never been accused of a crime, this class would be historically replete with criminals. US lawmakers have faced charges and convictions for bribery, perjury, and felony theft, not to mention obstruction of justice, withholding evidence of secret arms sales, and the possession of child sexual abuse material.

The FTC’s victory in the Outlogic case is even more piddling in light of the commission today being led by one of the most vocal privacy protection advocates it has ever had, Chair Lina Khan. That the company is still in business and touting how little the settlement matters underscores how truly underpowered and ill-equipped the FTC is for the job it’s been asked to perform. It is not actually armed to protect Americans from being relentlessly surveilled, nor is that even technically its mission. Under the current regime, the agency can at best hold surveillants to a standard of “notice and choice”—a standard that is illusory, and one that enables corporations (and the US government by proxy) to track Americans’ whereabouts at any time, all of the time, with no real legal concern.

The FTC’s principal regulatory weapon in privacy cases is known as Section Five, and it largely targets companies for lying. The case against Outlogic, for example, is based in part on its failure to disclose how people’s location data would ultimately be used. It did not, the FTC says, notify users that their data might be sold to government agencies for “national security purposes.” Here, the FTC’s job is to ensure consumers receive “notice” of how their data is being used. But the job is a sham, and everyone who’s ever clicked “I agree” on a privacy policy knows it.

Before the Outlogic case, only a small number of Americans even knew the company existed. An even smaller number had ever glanced at its terms of service, and even fewer—we’re now approaching zero—could be said to have actually read and understood what it said. Commissioners at the agency have long acknowledged as much.

“It is no wonder,” Commissioner Rebecca Slaughter said in 2019, that “98 percent” of people in one study clicked “I agree” to privacy policies that “disclosed sharing with the NSA and paying for the service by signing away your first-born child.”

The same study, out of York University, labeled the “notice” aspect of privacy regulation “the biggest lie on the internet,” detailing how research showed over 75 percent of users opted to skip reading privacy policies altogether. Those who did read them spent, on average, roughly a minute skimming policies that would’ve taken an ordinary person 15 to 17 minutes to actually read, let alone comprehend.

Famously, a decade ago, researchers out of Carnegie Mellon determined that it would take an average person roughly 76 working days to consume all of the privacy policies they encounter in a year. “Gotcha clauses” have also repeatedly demonstrated that people do not read the terms of service. To wit, in July 2010, it was reported that more than 7,500 people had unknowingly sold their souls to a video game company.

These facts demonstrate unequivocally that the crime of which Outlogic is accused—failing to give consumers notice of how their data would be used—is not one of greed or opportunity, but sheer stupidity. The law is all but performative, so companies have nothing to lose by complying. Users can be counted on to agree to basically anything (up to and including being enslaved for all eternity).

People cannot “consent” to something they do not understand, just as consent cannot really be called “consent” when it involves a metaphorical gun to the head.

The truth is that even if users were strapped to a chair and forced to spend the 600 hours a year it would take to devour all of the purposely dense legal terms they agree to, many of those agreements are patently coerced, as the Supreme Court has said. So ubiquitous now is the technology tracking our movements, in “no meaningful sense” are people voluntarily accepting the risks of allowing companies to access and share that information. It is a matter of participating in society or not; of being employed or not.

It is also the FTC’s job to protect consumers from harm. But the definition of “harm”—like “notify” and “consent”—is watered down to the point of creating a serious gap between what the law means by it and what most people understand it to mean. Companies are effectively free to harm people anyway, so long as the harm is deemed “reasonably” avoidable by the agency and beneficial to competition on the whole.

Should the FTC fail to act, consumers have little recourse but to sit and suffer. It is nearly impossible for an individual to obtain standing in federal court to sue a corporation after their privacy is violated, assuming they can even afford to try. Merely exposing personal information is not enough. A test created by two recent Supreme Court rulings requires victims to demonstrate “concrete harm” to bring a case, even though tying any one company to an act of identity theft, harassment, or financial loss is, in nearly every instance, a fantastical objective.

Congress’s last big push to pass a comprehensive privacy bill—the American Data Privacy and Protection Act of 2022—ironically would have exempted Outlogic’s activities altogether. Fearful of fighting a war on two fronts, its authors chose to purposefully avoid going after companies contracting with intelligence and law enforcement agencies. But there is one bill floating in Congress right now aimed at ending the government’s ongoing practice of buying location data from Outlogic-like companies and circumventing the courts.

The Fourth Amendment Is Not For Sale Act was introduced and passed by the House Judiciary Committee last month as part of a bigger package aiming to reauthorize and reform the problematic US intelligence program known as Section 702. The bill is slated to be taken up again this year, likely in February, sparking one of the fiercest fights over US privacy law Americans have seen in years.

Outlawing a pervasive practice that helped transform domestic surveillance into a windfall industry is a move that would be actually historic—for all the right reasons.
