Meta plans to remove thousands of sensitive ad-targeting categories.
SAN FRANCISCO — Meta, the social media company formerly known as Facebook, said on Tuesday that it planned to eliminate advertisers’ ability to target people with promotions based on their interactions with content related to health, race and ethnicity, political affiliation, religion, sexual orientation and thousands of other topics.
The move, which takes effect on Jan. 19, affects advertisers on Meta’s apps such as Facebook, Instagram and Messenger and the company’s audience network, which places ads in third-party apps. The Silicon Valley company said it was making the changes to limit the way that its targeting tools can be abused. In the past, these features have been used to discriminate against people or to spam them with unwanted messaging.
“We’ve heard concerns from experts that targeting options like these could be used in ways that lead to negative experiences for people in underrepresented groups,” said Graham Mudd, a vice president of product marketing for Meta.
Meta relies on targeted advertising for the bulk of its $86 billion in annual revenue. The company has excelled at giving advertisers a place to personalize promotions, with brands often able to aim their ads at Facebook, Instagram and Messenger users who are interested in topics as specific as L.G.B.T.Q. culture or Catholicism. Such tailored ads often have a better chance of sparking a sale or prompting users to join a particular Facebook group or support an online organization than more generalized ads.
But Meta has also faced a litany of complaints about advertisers abusing these targeting abilities.
Before the Jan. 6 storming of the U.S. Capitol, for example, advertisers used targeting tools to direct promotions for body armor, gun holsters and rifle enhancements at far-right militia groups on Facebook. In 2020, auditors concluded that Facebook had not done enough to protect people who use its service from discriminatory posts and ads.
In 2019, the Department of Housing and Urban Development sued Facebook for allowing landlords and home sellers to unfairly restrict who could see ads for their properties on the platform based on characteristics like race, religion and national origin. And in 2017, ProPublica found that Facebook’s algorithms had generated ad categories for users interested in topics such as “Jew hater” and “how to burn jews.”
In response to the abuse, the social network has tweaked its ad-targeting tools over time. In 2018, it removed 5,000 ad-targeting classifications to keep advertisers from excluding certain users. Facebook also disabled the anti-Semitic ad categories after the ProPublica report.
But Meta’s latest changes may be unpopular with the millions of organizations that rely on the company’s tools to expand their audiences and build their businesses. Advertising on Facebook, Instagram and Messenger that is finely tuned to people’s interests is often more affordable and effective than advertising on broadcast television and other media.
Those organizations include political groups and advocacy groups, many of which rely on the platform for fund-raising. Last year, political campaigns and nongovernmental organizations criticized Facebook when it temporarily removed political advertising from its sites around the presidential election; the restriction was lifted in March. Some campaigns said the move had benefited incumbents and larger organizations that didn’t count on small donations through Facebook.
Republicans and Democrats blasted Meta’s changes on Tuesday. Reid Vineis, a vice president of Majority Strategies, a digital ad-buying firm that works with Republicans, said in an emailed statement that the social network had gone from being “the gold standard for political advertising” to throwing roadblocks between campaigns and voters.
“This decision is harmful to nonprofit and public affairs advertisers across the board and will result in fewer charitable donations, limited public debate and a less informed public,” he said.
Mr. Mudd said that the new policies would be unpopular with some, but that the company had decided that moving forward was the best course.
“Like many of our decisions, this was not a simple choice and required a balance of competing interests where there was advocacy in both directions,” he said. He added that some of the ad changes had been under discussion since 2016.
Augustine Fou, an independent ad fraud researcher, said advertising on Facebook and its other apps had long worked “better than any other display ads elsewhere because Facebook has years of people volunteering information, and it’s pretty accurate.” He added that personalized advertising outside the platform often relied on guesswork that was “so wildly inaccurate that when you try to target based on that, you’re worse off than trying to spray and pray.”
Yet Meta has often struggled with how to take advantage of consumer data without abusing it.
“Of course, Facebook can deduce that you’re gay, or that you’re African American, but then the question becomes whether it is ethical to use those categories for targeting,” Mr. Fou said.
The new changes do not mean Meta is getting out of ad targeting. The company will still allow it for tens of thousands of other categories, which some critics said advertisers could use to achieve targeting similar to what the removed topics gave them. Meta added that it would continue to use tools such as location targeting.
The company also said it would let users, who can already limit their exposure to ads about topics such as politics and alcohol, start blocking promotions related to gambling and weight loss early next year.
“We continue to believe strongly in personalized advertising, and frankly personalized experiences overall are core to who we are and what we do,” Mr. Mudd said.