Federal housing officials have accused Facebook of allowing housing discrimination on its platform, the latest fallout from a saga that began with a 2016 investigation into the social media giant’s advertising arm.

The U.S. Department of Housing and Urban Development filed a formal complaint against the social network this week, claiming Facebook allows landlords and home sellers to hide housing ads from people based on their race, color, religion, sex, familial status, national origin, disability and ZIP code. Such behavior allows landlords and home sellers to limit housing options for those groups under the guise of targeted advertising, according to the HUD complaint.

“The Fair Housing Act prohibits housing discrimination including those who might limit or deny housing options with a click of a mouse,” Anna María Farías, HUD’s assistant secretary for fair housing and equal opportunity, wrote in a Friday news release. “When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it’s the same as slamming the door in someone’s face.”

Facebook will respond to the complaint in court and will continue working with HUD to address its concerns, a company representative wrote in an emailed statement.

“There is no place for discrimination on Facebook; it’s strictly prohibited in our policies,” the representative wrote. “Over the past year we’ve strengthened our systems to further protect against misuse.”

Advertisers have the option to prevent Facebook users from seeing housing ads if they have expressed interest in accessibility issues, including assistance dogs, mobility scooters and deaf culture; have young children or have expressed interest in family issues, including child care or parenting; have expressed interest in geographic regions, including Latin America, Southeast Asia, Honduras or Somalia; or have expressed interest in religions such as Sikhism, Hinduism or the Bible, according to the HUD complaint. Facebook also allows advertisers to draw a “red line” around ZIP codes and hide housing ads from users who live within the line, according to the complaint.

The complaint follows a 2016 investigation by the nonprofit news organization ProPublica, which found that Facebook allowed advertisers to exclude groups based on their “ethnic affinities.” The news organization purchased a housing ad on Facebook’s platform that excluded people who had African American, Asian American or Hispanic ethnic affinities.

The month after that investigation was published, a Facebook executive announced the company was building new tools to detect and automatically disable ethnic affinity marketing for housing, employment or credit ads. But in November, more than a year after its initial investigation, ProPublica found Facebook still allowed advertisers to hide housing ads from users based on those affinities.

Facebook’s advertising issues highlight a broader, growing problem in the tech industry, said Samuel Woolley, research director of the Digital Intelligence Lab at the Institute for the Future, a Palo Alto-based organization that publishes a toolkit intended to get tech innovators to think about the ethical implications of their products. Social media companies including Facebook, Twitter and YouTube have faced recent accusations that they fail to adequately police their platforms for misleading or harmful content.

“Oftentimes technologists and engineers build products with the bottom line in mind; they build products with quick turnaround in mind,” Woolley said. “But it’s become really clear that social media companies have not in the past built products with society in mind, and with ethics in mind.”