Unilever CMO Threatens To Pull Ads From Facebook & Google
In one of the hottest digital news stories right now, Unilever CMO Keith Weed has threatened to pull the brand’s $9.8 billion ad spend from ‘toxic’ platforms such as Facebook and Google. Lee, Chris and Jamie discuss their thoughts.
See below for the full video transcription.
Chris: So the next topic we’re going to talk about is the recent news regarding Unilever and their CMO basically threatening to pull all their advertising from Facebook and Google. So Unilever are pretty much a consumer goods giant. They own Marmite and other brands like that. And they’re accusing Facebook and Google of being swamped with poor content, basically, and they don’t want their brand showing next to this toxic content. I think it’s the same sort of thing as what happened, I think it was last year, with the brand protection issue on YouTube with L’Oreal and M&S, where some of their content was shown next to extremist content. There is a bit of a transparency issue here. There are a lot of people trying to develop transparency tools in the digital world now to make sure that brands feel secure and comfortable that their content’s being placed next to trustworthy content, and content that basically isn’t going to damage the brand in some way. But obviously this is a massive decision on the part of Unilever’s CMO, because I think they spend in the region of around 10 billion a year on advertising, which is an absolutely enormous sum of money, so I imagine Facebook and Google are going to be shaking in their boots at this statement, if I’m being honest with you. But I think it is a threat at the right time, really. I think Facebook and Google need to take accountability for the type of content that advertisers are showing next to. So yeah, it just needs tidying up, really, and I think it’ll be the start of bigger things.
Lee: I agree with what he’s doing, Keith Weed. But if he pulled all his social ad spend, it’d do too much damage. It’s not going to happen.
Chris: It would do, but I know what you’re saying there, and it’s unlikely that they will, if I’m being honest with you. They rely on that traffic too much. But there is a real transparency and visibility issue in the market at the minute, and everybody’s racing to develop the perfect technology that’s going to eradicate all this content, like the extremist content that we saw last year across YouTube. It is a massive issue, and I don’t think advertisers up until this point have really realized where their ad content is actually being served and what it’s being served next to. And just from a general brand equity standpoint, if somebody sees your brand showing next to extremist content, it’s going to have a detrimental effect on your brand, no doubt.
Jamie: Don’t you think consumers are wising up to this now? It’s not a human that’s putting these things next to one another, is it? It’s an algorithm. I think consumers are wising up to that now and can understand that if these things happen, it’s not for…it’s no one’s fault. I see what you’re saying, and it can be damaging to your brand. I think more contextually awkward situations can arise than this. I mean, I don’t know exactly what he’s saying is the bad content that’s getting placed next to his, but the extremist content and stuff, wasn’t there more context with that?
Chris: No, that was just YouTube. That was exclusive to YouTube, so it wasn’t necessarily an attack. Well, it was an attack on Google, because Google own YouTube. Yeah, it was specifically YouTube that was the victim of that, really. And they’ve done things between then and now to enforce more protection and more security against that. So they have put protocols in place to improve it. But obviously Facebook have never really borne any of the brunt up until now. But yeah, you’re right in what you’re saying. I think consumers are becoming more aware. It’s been publicized so much over the last year, getting rid of this online content, that I don’t think anybody’s unaware of it. It’s just that the people at the top of these businesses like Unilever are massively protective of the brand equity, and they do want something done about it.
Jamie: It’s just an old-school way of looking at it, I always think, when people bring up topics like this. It’s almost like, say you had a two-page spread in a magazine and your advert was there, and there was something contextually awkward or offensive next to it, obviously a human put that there in Illustrator or whatever graphics package they’re using. Same with TV ads: if you’re showing a documentary on something awful, you wouldn’t put something contextually awkward in the middle, would you? Because someone’s there to make sure that doesn’t happen. But online, consumers have got to start realizing that it’s an algorithm doing this. No one’s there to vet it most of the time, and now people are expecting the platforms to do that.
Chris: I think they do, but I think advertisers would like the control to decide whether to show next to that content or not, which I don’t think is an unfair judgement. The lack of control is the problem here. You can’t choose whether your ad is shown next to it or not, which is a bit of a problem. There are tons of different viewability tools and technologies coming out now which are actually helping you get slightly closer to that transparency metric, but it’s still a little bit of an unknown. One thing they’re working towards, and I know we’ve discussed it a little bit before, is blockchain technology, which is basically a methodical way of making sure that you’ve got access to the whole history of the ad being served. And I know Unilever are working closely with IBM to try and develop that technology, so it might be that in another couple of years we’ve got a completely new system that’s transparent and trustworthy and gives people control, and then there’ll be no ambiguity about whether you’re shown next to extremist content, or just generally content you don’t want to be affiliated with.
Jamie: Yeah. I think at the end of the day, whoever’s got the money in their pocket, obviously… How much did you say this guy’s got to spend or his budget is?
Lee: Ten billion a year.
Jamie: You’ve got ten billion a year in your pocket and you want some sort of feature added to this advertising platform. It’s going to happen, isn’t it? Who cares? It’s going to happen.
Chris: In a way, I like what he’s done, really. It’s helping all advertisers, because unless a big dog with deep pockets and a load of cash makes a big threat like he’s done, what are the small fry going to do? They’re not going to have any impact on whether things change. These guys might disrupt it enough for Google and Facebook to go, “Right, guys, we need to up it here.” And then when the smaller guys are pitching advertising services and they get asked an awkward question like, “How can you guarantee viewability and transparency, and how can we gauge whether we’re shown next to good content or bad content?”, they’ve got the confidence to say to the brand they’re advertising on behalf of, “Yes, I can guarantee that you’re not going to be shown next to X, Y, Z content.”
Lee: I think the platforms have 100% got the responsibility to develop it and get it to that stage, but I still think that, although it’s a bold move and it’s good that he’s bringing it to the media and the attention of the industry, he’s still going to put his money where the attention of the consumer is, as a consumer brand. Fundamentally, he’s never going to pull his ads from a platform where his potential customers are.
Chris: No, and I don’t think he will, in all honesty. I think it’s just a little bit of a, “Come on, guys. Step it up.” And I think he’s done the right thing. I think he’s done it for everybody’s benefit, because if it has the effect that M&S, BMW and L’Oreal had on YouTube the other year, when YouTube went on to implement new protocols and new levels of security, and Facebook go and do the same, it’s going to make every advertiser’s life easier. Even I get asked awkward questions about where we’re being shown, where your ad’s been placed, especially on algorithmic placements and optimizations. If you choose automatic placement in Facebook, for instance, or Instagram, you’re just leaving it to…
Chris: Yeah, you’re just leaving it to the platform to decide. People don’t want that. Brands want more trust and control over where the content’s being served. But basically, in a nutshell, I think it’s good, I think it’s kicked up a bit of dust, and hopefully Facebook and Google react positively. I’m sure they will.
Jamie: Can we just go back to this guy’s name?
Lee: Keith Weed.
Jamie: Yes, that’s a great name, that. Sounds like a Mighty Boosh character.
Lee: Great name, Keith.