Listen to Is AI Too Powerful? - United States v. Meta Platforms | Episode 27
In this episode, Cassie, Ali, and Danielle discuss the case of United States v. Meta Platforms.
Meta’s housing advertising system was alleged to discriminate on the basis of race, color, religion, sex, disability, familial status, and national origin, in violation of the Fair Housing Act, because of certain AI features. Tune in to learn more about this case and what Meta did to correct it!
Do you have a particular topic you want us to discuss? Reach out on Instagram @agentsunfiltered
ABOUT US - 🎙 Agents Unfiltered SEASON 2 is a deep dive into some wild real estate cases, with our take! Get ready for relatable and relevant information about “The Do’s, the Don’ts & the What The Fuck’s of Real Estate.”
Hey everyone, and welcome to Agents Unfiltered, where we talk about the do’s, the don’ts, and the what the fucks of real estate. I’m Ali. This is Danielle. And I’m Cassie Day. Join the three of us every week as we deep dive into some wild real estate cases and give our take. We are not attorneys and want to make it clear that this podcast or any linked materials should not be construed as legal advice, nor is this information a substitute for professional expertise.
We are not lawyers, financial advisors, doctors, or mortgage lenders, but we are your new BFFs.
Ali: All right, so I am going to be telling you guys about Meta, which is Facebook. They settled a fair housing violation allegation brought by the US DOJ, the Department of Justice.
Cassie: Also, is it Meta? Like who sued who?
Ali: USA sued Meta. So, Meta, formerly known as Facebook recently entered into a settlement agreement with the US Department of Justice, the DOJ, over claims that their advertisement delivery algorithms violated the Fair Housing Act. I don’t know if any of you have ever set up a Facebook ad.
Ali: Facebook helps you: they say just enter things about your target audience, and they’ll use their algorithm to get the ad in front of the right people based on what you tell them. So, basically, at the core of this, the Department of Justice was saying that the analytics Facebook was using for its advertisements go against the protected classes under civil rights law.
Cassie: When did this happen? I thought they didn’t let you specifically target audiences.
Ali: December 28th, 2022.
Danielle: Oh, wild. Because I know how frustrating it is trying to set up ads because just the age thing, you know, you can’t. It’s 18 and over, and that can be frustrating when you live in a college town and you’re trying to maybe not really market to college students because that’s probably not what they are looking for, but age is a protected class.
Ali: I know.
Danielle: That’s so interesting because that’s been a thing for so long, so I am curious about what specifically this case brought out.
Ali: Yeah, some of the biggest bullet points and takeaways from this whole thing are that:
Advertisements should never indicate a preference or limitation based on fair housing-protected characteristics such as race, color, religion, sex, disability, familial status, and national origin.
There’s a lot to read through, including a separate document with the original complaint, but some of the things they pointed out I wouldn’t even have thought about, because Meta and AI are just so smart now. Certain people are in certain Facebook groups; people with families are in groups that other people aren’t, and all of that feeds into the algorithm: families looking for homes, this age range looking for homes. There are so many intricate, personalized things you can do on Facebook, and the AI is smart enough to fold all of that into its data and steer your ads. It goes deeper than we can probably even imagine, because no matter what targeting we choose, the system also has all this information we don’t see that it can make sense of. It went super deep and was a little over my head, but essentially the AI can say, oh, they’re in certain Facebook groups, so they belong to this group of people, because that’s all it is to an AI. It’s an algorithm and data. It’s not the personal considerations we think of.
Danielle: So, was the lawsuit more so based on the intended audience or the ads themselves?
Ali: Basically, they reached a settlement that includes no longer using this special ad audience, previously called the lookalike audience, because it mirrored the kind of people it thought should be seeing the ads. Not Facebook deciding, but the AI. That’s all just data to the AI, so it doesn’t think about fair housing the way we do; it doesn’t see protected classes or right or wrong. It’s strictly data.
Cassie: I thought the special ads category though, was there for that purpose where you couldn’t discriminate.
Ali: Maybe within the past year?
Cassie: No, but even years before, there was like a special feature.
Danielle: Correct me if I’m wrong, but what I’m gathering from what you’re saying is that it’s not so much the special ad category that we’ve been selecting.
Cassie: The one that you have to if it’s housing.
Danielle: So I don’t think it’s so much the category; it’s what the AI did with the ad after it was submitted. You submit it, and obviously it can’t have this, it can’t have that, you can’t narrow your audience, whatever. That part is human-based. But then when it went out into the Twitterverse, the AI was directing it to certain audiences and avoiding others based on their interests. Is that what you’re saying?
Cassie: It’s kind of over my head because AI is insane.
Ali: It’s not in regard to the wording that agents put in the post.
Danielle: It’s like you pick this category, you pick your target audience, which you can’t really do with a special ad, but even with that, I think about all the times we’ve submitted an ad for review, because you select the special ad category, then it goes to review, and how many times it gets kicked back. I’m like, what the hell? There’s nothing wrong with this ad. It would be one little thing, whatever Facebook, Meta now, would kick it back for.
Ali: I’m just gonna read this straight from the paragraph saying the three main things wrong with it because maybe it made sense in my head but didn’t make sense coming out of my mouth, but maybe what came out of my mouth will make sense when I read you this.
Danielle: Let’s decipher this.
Ali: The DOJ’s complaint alleged that Meta’s advertising services violated the Fair Housing Act in three distinct ways:
- The platform allowed housing advertisers to use Facebook-created categories based on FHA-protected characteristics to target or exclude the audience receiving housing ads.
- Facebook had a lookalike audience tool, which used FHA-protected characteristics to identify additional users with characteristics similar to an advertiser’s ideal audience. (That’s where the AI comes into it, because that’s an additional thing they offer: okay, you’ve got the audience you’re looking for, and the AI finds more people that fit that interest.)
Ali: You can’t see my quotes cause this is a podcast, but like quote unquote “fit inside the target audience that you are looking for or like trying to stay away from”, so AI takes it to a different level, which is helpful unless you have certain rules you need to follow. Like if we were selling something like jelly beans, it’d be a little bit safer because it would just push our stuff out to all the like-minded people, but when you’re going off people who interact with these ads, and then finding people just like them everywhere else, you’re finding a specific crowd.
Danielle: That makes sense.
Ali: Lastly, Facebook’s ad delivery algorithms used FHA-protected characteristics to determine which consumers would see an advertisement.
Danielle: So that’s exactly what it was. It’s not so much the special ad category where you entered it, it goes into their algorithm and it only pushes to certain audiences. I think this is what it is saying. You could have an ad that would go out to an entire community, but then certain people within that community wouldn’t see the ad because it’s targeted more towards the group that AI came up with, so you have a whole group that’s being left out of even seeing the ad.
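The lookalike mechanism described above can be sketched as a simple similarity search. This is a hypothetical, simplified illustration, not Meta’s actual system: take a “seed” audience, average its feature vectors, and rank every other user by cosine similarity to that average.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def lookalike_audience(seed_users, all_users, top_k=2):
    """Rank non-seed users by similarity to the average seed profile.

    seed_users / all_users map user id -> feature vector (e.g. group
    memberships, page interactions, encoded as numbers).
    """
    dims = len(next(iter(seed_users.values())))
    centroid = [sum(v[i] for v in seed_users.values()) / len(seed_users)
                for i in range(dims)]
    candidates = [u for u in all_users if u not in seed_users]
    ranked = sorted(candidates,
                    key=lambda u: cosine(all_users[u], centroid),
                    reverse=True)
    return ranked[:top_k]
```

The concern at issue in the case follows directly from this kind of design: if the seed audience’s features correlate with protected characteristics, the expanded audience inherits that skew even though nobody explicitly selected those traits.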
Ali: It’s trying to narrow things down for your specific target audience, which for something other than real estate would be super helpful.
Cassie: I wonder how they determined all of this. Like, how did they come to discover this?
Danielle: I think about all of the random ads I see that I do not care about, and I’m like, I don’t even know why they pop up. It’s just so interesting, how do I know that no one else sees that ad? Or vice versa. This is way outside of my pay grade, but like how do you know what AI sends things to?
Cassie: Who checks into fair housing?
Ali: Department of Justice.
Cassie: I wonder if they’ve been monitoring somehow where these protected class ads are going in front of.
Danielle: I think it is very smart, because it’s something that could obviously get really out of control. I think about it most in terms of gentrification.
Cassie: Oh yeah.
Danielle: What if that actual community isn’t seeing homes coming up for sale that they could purchase, and instead the ads only go to people outside the area? That could really affect the makeup of a neighborhood or community. So, this happened in December of 2022. What was the ruling?
Ali: They came to a settlement. Meta agreed to make significant changes to its housing ad delivery methods, including no longer using the special ad audience feature. It took that out because, I mean, it just goes off data itself. They can’t change the fact that it’s going off just like facts on who clicks on things, but that also doesn’t mean just because a certain age demographic isn’t clicking on a certain thing that they shouldn’t be able to see.
Cassie: I guess I haven’t done an ad recently. How are they making sure that agents are staying accountable if there’s no special ad category?
Ali: Because there is now. There’s a special ad category for it, but you have to say that it’s real estate. You have to declare that the post is for real estate, and through that you can’t choose as many targeting options, and the special ad audience, previously called the lookalike audience, doesn’t get used with your ads.
Cassie: So they changed how they’re delivered.
Ali: The special ad category has been there for housing for a long time, but the way it works probably changed.
Ali: They have to develop a new system for housing ads to address disparities between advertisers’ targeted audiences and the group of Facebook users who actually receive the ads. In addition, Meta will pay a civil penalty of $115,054, the maximum available under the Fair Housing Act, and a third-party reviewer will be appointed to monitor Meta’s compliance with the terms of the agreement.
Danielle: Okay. Wow.
Cassie: I mean, I feel like cost-wise, that’s not that much. I thought it was gonna be like $115 million, and even that’s probably nothing.
Ali: I feel like they probably were able to get that low of a penalty because they’re actively working with the government to make sure they’re doing it right, and I’m sure it’s going to cost a lot of money too.
Cassie: I mean if they choose to make something that can work with real estate agents.
Danielle: It’s interesting cause I just got this email the other day and I haven’t even looked at it because where’s the time, but it was from Meta talking about new services for real estate agents. I should probably look at it. I think they’re trying to somehow work with MLSs and stuff, so it was really interesting. I just haven’t had a chance to really dive into it yet, so I wonder if that is part of this whole new thing.
Ali: Like it’s not bad to be monitoring what’s on the internet at all. My brain cannot really comprehend the logistics of it.
Cassie: No. It just goes over my head. I wonder how long they have to have a third party reviewing them, like watching, is that like now a permanent practice?
Ali: It’s a law office in New York.
Cassie: And is it Meta paying them?
Danielle: I’m sure. So any future changes to targeting options must be approved by the DOJ prior to implementation, and the US District Court for the Southern District of New York will retain jurisdiction for purposes of resolving disputes and entering any modification orders. So, I mean, it’s staying within the Department of Justice.
Ali: That’s kind of wild. I think advertising in general is just so difficult for real estate. It made me think about Google AdWords and SEO; it’s gotta be so tricky. When I’ve worked with SEO before, search engine optimization, for anyone who doesn’t know what that is, you create your ads, Google AdWords is one way to do it, and you have to pick all these tags. I’ve done that in the past for work a little bit, but never anything real estate related, so that would be so hard. It’s all about getting your ads in front of your target audience, and if you have to be so broad, how are you really hitting a target audience? Except by location.
Danielle: When we used to do special ad categories before with Facebook, that’s exactly what it was, it was like you could only do basically location, but you couldn’t target gender, age, or anything like that.
Cassie: Or interest.
Danielle: It’s just really interesting. But that’s what’s on the horizon. That’s where we see 99% of advertising at this point online.
Ali: I’ve been seeing all these things lately where AI, if you type in something you want a paper written on, will write full 20-page essays, and it’s amazing. Absolutely amazing what it writes, and it sounds like a person wrote it. It’s just crazy, the advancements that have been made. We can implement rules on it right now, but soon, are we really going to be able to…
Cassie: I know. It like takes the human element out of it. If there are systems that can write a full essay, are you learning anything?
Ali: I saw this thing on TikTok the other day and it’s this girl who, she’s super business savvy, talks about finances a lot, but was also talking about upcoming side hustles or careers you can do for yourself that not a lot of people know of right now, but are gonna be something that you’re like, man, I wish I got in on that. A lot of it is AI does all of your content writing, so you can charge people like 200 bucks an ad, but really you spend two minutes because AI will just write everything for you and it sounds better than a human at this.
Danielle: I don’t know how I feel about that. It feels so gross.
Cassie: It does. I understand that everybody needs to make a living, and I get that, but it’s like integrity and taking advantage of someone, like, I’m gonna charge you an arm and a leg for something that I haven’t even done, or really don’t have any skin in the game on.
Danielle: I really struggle with that whole mentality, and I just am so curious to see where it heads from here. It’s a little disturbing to me. Not a little, it’s a lot disturbing to me.
Cassie: It just takes the human element out of it, and I’m like, what good are we if we’re not doing anything ourselves? What are we all gonna do with ourselves if we have nothing to do?
Ali: Scientists are calling us like, the fuck around and find out generation because everyone fucked around and now we’re finding out as we’re living right now.
Danielle: It’s real. It’s terrifying. You think about it, and I feel like I’ve listened to and read so many things about this. Our generation, millennials, didn’t grow up with technology from when we were super tiny. It may have been around, but it wasn’t in our pockets, so it’s been really interesting to watch; we still know how to do all of these things.
Ali: We learned with it as it grew.
Danielle: Exactly, and now the first generation that grew up with it, with them all the time, is adults, and I’ve seen so many articles on basically the de-evolution, or however you say that, of our generations. It’s advancing in certain ways, immensely, and it’s amazing, and then there’s this other part of it.
Ali: No one actually knows anything anymore because you can have the answer in your pocket. Why retain it? I notice that with myself. I don’t hold on to certain information just because I know I can find it anytime.
Danielle: It’s really interesting. I actually deleted TikTok off my phone because I just was like, this is mindless. It’s interesting and entertaining and I do learn things there, but then it becomes habit-forming. Just scroll, scroll, scroll. I gotta stop this shit because my kids are watching and I can’t be a bad example to them, but it’s all around them all the time. It’s crazy. I didn’t get my first smartphone until I was an adult, and I had a flip phone all through high school. Elementary schoolers have smartphones now; mine don’t. We decided high school is when my kids could have smartphones, but even then, I’m monitoring what she’s using, and I don’t want her having apps like TikTok, which are just mind-numbing.
Cassie: The problem is, and this is what concerns me as a human being, is that then you get people who’ve grown up and they can’t survive without being in front of a screen, and there’s no more enjoyment of what’s out there. There’s no adventure. It’s just living through a screen and posting. I don’t know if you guys saw this, but I saw an article on this one gal who’s an influencer, I don’t know, TikTok, wherever, somewhere on social media, and she did a photo shoot in IKEA and told people she was in Bali. It was kind of like a joke, like not a joke, but just like an experiment to see what people would buy, like if they would buy it or not, and they did. So many people were like, oh my gosh, it looks so amazing, and she did this whole photo shoot in IKEA and the only things that she left in there to give it away for people who were smart enough to catch on were little Easter egg decorations that IKEA has, but she just did it in different rooms. Think about that. People are just living their lives through this social media experience instead of actually living their lives.
Danielle: Oh my gosh. This is a whole conversation.
Cassie: Weird. It scares me.
Danielle: Well, yay for Meta. Thank you for listening to our podcast. Make sure to follow us on Instagram and TikTok. We are @agentsunfiltered. Have a lovely day.