A new investigation revealed that the gay dating app has been selling off the locations of its queer clientele via brokers for years.
Photo: Leon Neal
It is an unfortunate truth universally acknowledged that an app on your phone must be in want of profit, and that its maker will often rely on ads to earn that money. It’s also true that these ads rely on rat-king-esque networks of partnerships to make that digital cash appear. At best, this mess means that billions of dollars vanish from companies’ ad budgets each year. At worst, you find out that the world’s most popular queer dating app was unwittingly passing off location data on its clientele for years, as users of Grindr discovered Monday via The Wall Street Journal.
Citing two people familiar with the matter, the Journal reported that the locations of countless Grindr users—a group that includes millions of gay, bi, and trans people across the world—have been available for purchase since “at least 2017.”
According to the Journal’s sources, one of the company’s old ad partners, MoPub (which was sold off by Twitter earlier this year), was freely passing off location data from the tens of thousands of apps that use place-based information to monetize. At one time, this included Grindr. Once the data was in MoPub’s hands, the Journal alleges, it was sold off, in bulk, to other partners, like Near (formerly known as UM, and formerly formerly known as UberMedia). And Near offered up that data to just about anyone. Because data privacy laws in the U.S. are vague and chaotic where they exist at all, Near can pawn off data from its upstream partners out in the open. You, dear reader, could buy it yourself.
“Grindr has shared less information with ad partners than any of the big tech platforms and most of our competitors, restricting the information we share to IP address, advertising ID, and the basic information necessary to support ad delivery,” Grindr spokesperson Patrick Lenihan noted in a public statement.
With all respect to Lenihan, that bar is extremely low. So-called “anonymous” data points like an ad ID or IP address can easily be tied back to a specific device, and to the person who owns it. With “anonymous” data like this, advertisers can accurately surmise your workout routine, your favorite tunes, your immigration status, and much, much more.
While offering location data to ad partners is an icky, albeit common, practice, the stakes with Grindr are particularly high; about one year ago, reports emerged that location data gleaned from the app was used to out a Catholic priest. The priest resigned, and Catholic news writers wrung their hands over the ill-gotten data source.
Grindr denied any wrongdoing at the time, and pointed out in a statement to Gizmodo that the company had closed off access to its users’ location data since 2020. But the Journal’s report, and the laundry list of ad partners Grindr has used to monetize over the years, add to the growing scrutiny facing the company.
Even the data used to out the priest was anonymized, legally speaking. But the middlemen were able to tie that Grindr-using device to a certain Grindr-using priest because the device was seen frequenting the priest’s residence and lake house.
Did those data points come from Near? From MoPub? From some affiliated party? It’s literally impossible to say; ad networks are notoriously dense and opaque, even in states like California, which has the strongest data privacy law in the U.S. today. Again, it’s a pretty low bar. As a Near spokesperson told The Journal, “every single entity in the advertising ecosystem has access to the information shared by Grindr and every other app that uses the real-time bidding system.” That’s the norm in the adtech world.
Does the blame in this case lie with Grindr? Absolutely. But it also lies with a system that handles your anonymity without care. Right now, if you have enough cash, you can buy location data from cell towers, satellites, retailers and countless apps that might, inadvertently, surface someone’s sexuality. And until the LGBT+ community stops being seen as a juicy market for ad targeting, people will keep buying that data, and they’ll keep doing whatever they want with it, legally. And that means nobody, queer or otherwise, is safe.