from the another-day,-another-problem-we’ll-do-nothing-about dept
We’ve noted a few times now that while Facebook gets most of the heat for its privacy scandals, the stuff going on in the telecom, app, and adtech markets regarding location data makes many of Facebook’s privacy issues seem like a grade school picnic.
That was well highlighted by the recent Securus, LocationSmart, and numerous T-Mobile scandals, which perfectly showcased how cellular carriers, app makers, and location data brokers routinely buy and sell your daily movement records, making only a fleeting effort to ensure that subsequent buyers and sellers of that data adhere to basic privacy and security standards.
The end result has been an absolute parade of scandals, but little more than a few pinky swears by impacted companies, wrist slaps by regulators, and apathy from Congress. As a result, it just keeps happening over, and over, and over again.
The latest case in point: the Wall Street Journal has discovered that the Grindr dating app has also been collecting highly detailed user location data and selling it to a wide variety of middlemen since around 2017:
The commercial availability of the personal information, which hasn’t been previously reported, illustrates the thriving market for at-times intimate details about users that can be harvested from mobile devices. A U.S. Catholic official last year was outed as a Grindr user in a high-profile incident that involved analysis of similar data.
The data in question was made available through the online ad network MoPub (previously owned by Twitter) and then sold through its partner company UberMedia (recently renamed UM). Researchers had been warning about this problem for a while, and were largely ignored.
As usual, Grindr executives claim this wasn’t a big deal because the data was “anonymized” and didn’t include personal names. But as we’ve noted countless times, “anonymization” doesn’t actually mean anything, given that these users can be identified with just a few additional datasets. The data was also “detailed enough to infer things like romantic encounters between specific users based on their device’s proximity to one another,” the Journal notes.
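To see why “anonymized” offers so little protection, here’s a minimal sketch of a linkage attack: location pings keyed only by a random advertising ID get joined against a second dataset that ties coordinates to names. Every record, name, and coordinate here is fabricated for illustration; real attacks work the same way with commercial address or voter-file data.

```python
# Illustrative linkage attack on "anonymized" location pings.
# All data below is fabricated for demonstration purposes.

# Pings keyed only by a random advertising ID -- no names attached.
pings = [
    {"ad_id": "a1f3", "lat": 40.7359, "lon": -73.9911, "hour": 2},   # overnight
    {"ad_id": "a1f3", "lat": 40.7484, "lon": -73.9857, "hour": 10},  # daytime
]

# A second dataset tying coordinates to people (e.g. address records).
address_book = [
    {"name": "J. Doe", "lat": 40.7359, "lon": -73.9911},
]

def near(a, b, tol=0.0005):
    """Rough proximity check (~50 meters at this latitude)."""
    return abs(a["lat"] - b["lat"]) < tol and abs(a["lon"] - b["lon"]) < tol

# Whoever's home address matches the overnight pings is very likely
# the owner of that advertising ID -- no names in the pings required.
identified = {
    p["ad_id"]: rec["name"]
    for p in pings if p["hour"] < 5
    for rec in address_book if near(p, rec)
}
print(identified)  # {'a1f3': 'J. Doe'}
```

The point is that the advertising ID itself never needs to be “de-anonymized”; the pattern of locations over time is the identifier.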
Grindr states that it cut off sales of this data two years ago but, as usual with this kind of stuff, there’s no independent way to test or confirm that claim without the aid of whistleblowers and competent privacy regulators. An insider tells the Journal that Grindr didn’t do anything about this until 2020 because it didn’t see the harm (and it continues to downplay the harm in the story).
Last year, a Catholic news outlet called the Pillar obtained the Grindr location data of Monsignor Jeffrey Burrill, executive officer of the United States Conference of Catholic Bishops, forcing him to resign due to “serial sexual misconduct.” The company also came under fire for sharing some users’ HIV status with certain companies. The app also suffered from several vulnerabilities that exposed user data.
Grindr’s problems due to rampant over-collection and sale of location (and other) data were bad enough that the report notes the app was used as an example in a presentation to multiple government agencies about the intelligence risks posed by over-abundant data collection and sale:
National-security officials have also indicated concern about the issue: The Grindr data were used as part of a demonstration for various U.S. government agencies about the intelligence risks from commercially available information, according to a person who was involved in the presentation.
Grindr’s Chinese owner Beijing Kunlun was forced to sell the app in 2020 due to national security concerns. But while DC loves to superficially hyperventilate about Chinese-owned companies and data collection specifically (see: the whole TikTok fracas), rampant data collection remains a problem with American-owned companies too, in part because that data still winds up widely available.
The reality is that the wild west approach to data collection and monetization causes an incalculable level of potential harm. Yet we don’t meaningfully address it because the sale of such data is simply too profitable for too many different industries (marketing, telecom, healthcare, insurance, banks, app makers), all simultaneously lobbying Congress to do either nothing, or the wrong thing.
As a result, we keep stumbling through the same stories week after week as if stuck in a bizarre Groundhog-Day-esque purgatory, with the key difference being that nobody in the U.S. seems to be learning anything from the experience.
Filed Under: anonymized, china, consumers, dating apps, location data, national security, privacy, tracking