Meta Agrees to Alter Ad Technology in Settlement With U.S.

SAN FRANCISCO — Meta on Tuesday agreed to alter its ad technology and pay a penalty of $115,054, in a settlement with the Justice Department over claims that the company’s ad systems had discriminated against Facebook users by restricting who was able to see housing ads on the platform based on their race, gender and ZIP code.

Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computer-assisted method that aims to regularly check whether those who are targeted and eligible to receive housing ads are, in fact, seeing those ads. The new method, which is referred to as a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering ads related to housing to specific protected classes of people.

“Meta will — for the first time — change its ad delivery system to address algorithmic discrimination,” Damian Williams, the U.S. attorney for the Southern District of New York, said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”

Facebook, which became a business colossus by collecting its users’ data and letting advertisers target ads based on the characteristics of an audience, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems have allowed marketers to choose who saw their ads by using thousands of different characteristics, which have also let those advertisers exclude people who fall into a number of protected categories, such as race, gender and age.

The Justice Department filed both its suit and the settlement against Meta on Tuesday. In its suit, the agency said it had concluded that “Facebook could achieve its interests in maximizing its revenue and providing relevant ads to users through less discriminatory means.”

While the settlement pertains specifically to housing ads, Meta said it also planned to apply its new system to check the targeting of ads related to employment and credit. The company has previously faced blowback for allowing bias against women in job ads and excluding certain groups of people from seeing credit card ads.

The issue of biased ad targeting has been especially debated in housing ads. In 2016, Facebook’s potential for ad discrimination was revealed in an investigation by ProPublica, which showed that the company’s technology made it simple for marketers to exclude specific ethnic groups for advertising purposes.

In 2018, Ben Carson, who was the secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even if an advertiser wanted the ad to be seen broadly.

“Facebook is discriminating against people based upon who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

The Justice Department’s lawsuit and settlement are based partly on HUD’s 2019 investigation and discrimination charge against Facebook.

In its own tests related to the issue, the U.S. Attorney’s Office for the Southern District of New York found that Meta’s ad systems directed housing ads away from certain categories of people, even when advertisers were not aiming to do so. The ads were steered “disproportionately to white users and away from Black users, and vice versa,” according to the Justice Department’s complaint.

Many housing ads in neighborhoods where most of the people were white were also directed primarily to white users, while housing ads in areas that were largely Black were shown mainly to Black users, the complaint added. As a result, the complaint said, Facebook’s algorithms “actually and predictably reinforce or perpetuate segregated housing patterns because of race.”

In recent years, civil rights groups have also been pushing back against the vast and complicated advertising systems that underpin some of the largest internet platforms. The groups have argued that those systems have inherent biases built into them, and that tech companies like Meta, Google and others should do more to combat those biases.

The area of study, known as “algorithmic fairness,” has been a significant topic of interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have sounded the alarm about such biases for years.

In the years since, Facebook has clamped down on the types of categories that marketers could choose from when purchasing housing ads, cutting the number down to the hundreds and eliminating options to target based on race, age and ZIP code.

Chancela Al-Mansour, executive director of the Housing Rights Center in Los Angeles, said it was “essential” that “fair housing laws be aggressively enforced.”

“Housing ads had become tools for unlawful behavior, including segregation and discrimination in housing, employment and credit,” she said. “Most users had no idea they were either targeted for or denied housing ads based on their race and other characteristics.”

Meta’s new ad technology, which is still in development, will occasionally check on who is being served ads for housing, employment and credit, and make sure those audiences match up with the people marketers want to target. If the ads being served begin to skew heavily toward white men in their 20s, for example, the new system will theoretically recognize this and shift the ads to be served more equitably among broader and more varied audiences.

“We’re going to be occasionally taking a snapshot of marketers’ audiences, seeing who they target, and removing as much variance as we can from that audience,” Roy L. Austin, Meta’s vice president of civil rights and a deputy general counsel, said in an interview. He called it “a significant technological advancement for how machine learning is used to deliver personalized ads.”
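Meta has not published the implementation details of the variance reduction system, but the behavior Mr. Austin describes (take a snapshot of the delivered audience, compare it with the eligible audience, and reduce the variance between the two) can be sketched in rough form. The Python below is an illustrative toy under stated assumptions, not Meta’s code: the demographic buckets, the function names and the use of total variation distance as the gap measure are all hypothetical.

```python
from collections import Counter

def distribution(audience):
    """Turn a list of demographic labels into a probability distribution."""
    counts = Counter(audience)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def variance_gap(delivered, eligible):
    """Total variation distance between the delivered and eligible distributions.

    0.0 means delivery mirrors the eligible audience; 1.0 is maximal skew.
    """
    groups = set(delivered) | set(eligible)
    return 0.5 * sum(abs(delivered.get(g, 0.0) - eligible.get(g, 0.0)) for g in groups)

def corrective_weights(delivered, eligible):
    """Boost under-served groups and dampen over-served ones."""
    groups = set(delivered) | set(eligible)
    return {g: eligible.get(g, 0.0) / max(delivered.get(g, 0.0), 1e-9) for g in groups}

# Hypothetical snapshot: who actually saw the ad vs. who was eligible to see it.
delivered = distribution(["white"] * 80 + ["Black"] * 20)
eligible = distribution(["white"] * 55 + ["Black"] * 45)

gap = variance_gap(delivered, eligible)
if gap > 0.05:  # illustrative threshold, not a published figure
    weights = corrective_weights(delivered, eligible)
    print(f"gap={gap:.2f}, corrective weights={weights}")
```

In a production ad server, the corrective weights would presumably feed back into delivery or ranking decisions over time; here they are simply printed to show the direction of the adjustment.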

Meta said it would work with HUD over the coming months to incorporate the technology into Meta’s ad targeting systems, and agreed to a third-party audit of the new system’s effectiveness.

The company also said it would no longer use a feature called “special ad audiences,” a tool it had developed to help advertisers expand the groups of people their ads would reach. The Justice Department said the tool also engaged in discriminatory practices. Meta said the tool was an early effort to fight against biases, and that its new methods would be more effective.

The $115,054 penalty that Meta agreed to pay in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.

“The public should know the latest abuse by Facebook was worth the same amount of money Meta makes in about 20 seconds,” said Jason Kint, chief executive of Digital Content Next, an association for premium publishers.

As part of the settlement, Meta did not admit to any wrongdoing.
