Ex-CIA officer and former Facebook elections integrity head warns industry on social media advertising models
Social media algorithms that intentionally addict users to deliver engagement for advertisers and brands are at the core of a fundamental problem with social media business models, says a former White House special advisor, CIA officer and global head of elections integrity ops in Facebook's business integrity organisation. Eroding civic discourse and rising mental health issues are also flashpoints that place growing accountability on the $1 trillion-plus global marketing and advertising sector.
Yael Eisenstat arrived in Australia last week loaded with weighty world issues, centred on the way the tech platforms' advertising business models are driving social fractures and increasing public health concerns. Media, perhaps surprisingly, gets a slightly longer leash in Eisenstat's worldview, although it is not without blemish.
Keynoting the CSIRO's annual D61+Live tech and innovation summit, attended by 3,000 delegates last week, Eisenstat joined a growing chorus of high-profile tech commentators pushing alternative solutions to fix polarising social discourse and highlighting its causal links to the advertiser-driven attention and engagement economy.
Social media platforms are "intentionally keeping us addicted", says Eisenstat, arguing it is more dangerous than the media arms race to attract and monetise public attention.
"Let's just be really clear - I'm no techno-pessimist," she says. "I just have an issue with this particular business model of trying to ensure that you're addicted to their technology so that they can target you with ads - it's causing so many issues. There's mental health issues, but for me civil discourse is what I'm passionate about. Because in order to keep you on their screens, they have to serve you up more and more extreme content. I don't think it's the engineers ... the algorithm has figured it out because the algorithm was trained with the intent of keeping your eyes on the screen," she says.
"If it just shows you slightly more extreme content than the last one you saw, you will click."
And this is where it gets challenging for the advertising industry and brands.
Salesforce CEO, Marc Benioff, has already labelled Facebook the "new cigarettes, it's addictive, it's not good for you ... government needs to step in". But Eisenstat, now a fellow at New York-based Cornell Tech, has a detailed take on what government and industry should do next.
"Advertising in and of itself is not necessarily the problem," she says. "It's the fact that these companies have taken all of the advertising away from essentially regulated markets ... They know exactly how to target us with exactly what we want to see. Why is that a problem? Because in order to do so, the number one metric they have to care about is user engagement. They have to keep you on their platform."
Regulate digital curators
Eisenstat met with Communications Minister Paul Fletcher's office during her visit and has two key policy proposals which she says would start to address some of the issues with the business models of the tech platforms.
The first is to change legislation drafted in the 1990s known as Section 230 of the US Communications Decency Act, which protects "platforms" from responsibility for the content they carry. "It's basically a subsidy that says internet platforms will not be responsible for the content they host. If you've ever heard the fight about whether Facebook is a publisher or neutral platform, this is why that fight is happening. It's all about Section 230," says Eisenstat.
She's not in the camp that wants Facebook regulated like a publisher - a position widely held among media companies across Europe, the US and Australia. "What they're doing is more dangerous," she argues.
"They're curating our content. They are deciding what we do and do not see. They are deciding which rabbit holes to take us down and which bubbles to put us in. I would call them a digital curator, whatever term you want to call it, and regulate that way because you cannot tell me they bear no responsibility for the content that they put out there - if in fact they are curating your content."
Eisenstat believes that is the one regulation Facebook will "fight tooth and nail" to leave as is.
"If they actually [have to] bear responsibility for the content on their platform, they claim it will destroy their business. I say it just destroys your business model and you should figure out a different way to operate Facebook. I use Facebook. It's the reason I stay connected to people all over the world. This isn't about hating Facebook. It's about accepting the fact they don't bear responsibility."
It's why one of the solutions she favours is to force transparency around the algorithms used in recommendation engines. Eisenstat is also a policy advisor at the Center for Humane Technology, co-founded by Google's former lead design ethicist, Tristan Harris. She cites a former YouTube staffer who worked on its recommendation engines and is now also at the Center for Humane Technology.
"There are examples of teenage girls, they go on YouTube looking up some video, maybe about swimsuits for the next season and within a few videos they're being shown anorexia videos," she says.
"One of the statistics I heard from him was that on the day the Mueller Report dropped this year [into Russian interference in the 2016 US presidential election], the number one recommended source of news, not by your friends, not by what you searched for but by the recommendation engine, was Russia Today."
Russia Today, or RT as it is now branded, is a news service funded by the Russian government. It is regarded as a state propaganda vehicle.
"So this is the point; in order to keep us engaged, these recommendation engines are bringing us down more and more extreme paths," says Eisenstat. "It is wrecking our ability to even see that we're being manipulated like this. Government's job is to protect its citizens and if our democracy is being harmed, if young girls are falling more and more into teenage depression because of Instagram, then it's government's job to step up."
Reveal the algorithm
Beyond making platforms legally responsible for the content they carry - which, critically, means proactive content takedowns rather than relying on others to identify and request such action - Eisenstat says much greater transparency around recommendation algorithms is required.
"There are backdoors into how to force responsibility on these companies," she says. "I'm a big fan of actually making recommendation engines transparent. If platforms actually had to show that 97 per cent of what people watched was recommended by YouTube's engine, then these companies would be held accountable. It would show that they do bear responsibility for what they're showing us."
Monetising different realities
Eisenstat has her grievances with traditional media although she argues there is a fundamental difference with platforms and their content.
"Media don't need to suck up every ounce of your data and use your human behavioural data against you in order to manipulate you to keep you on their screens," she says. "This is what is happening in the social media industry because every single person is seeing a different reality. They are serving up ads that match what they think you are going to get sucked into - and that is a very big difference."
"I do think we are going to see a shift. Some of it will be legislated so they won't have a choice. I do think there is an appetite in the US to tell tech companies to slow down."
That includes a new Silicon Valley stock exchange, the Long-Term Stock Exchange, designed to encourage companies and investors to focus beyond the short term.
"I don't know if it will succeed but the only way you can actually build and think about potential, unintended consequences is if you're able to think beyond quarterly returns and quarterly metrics," Eisenstat says. "You have some Silicon Valley tech leaders advocating for national privacy laws, who want to level the playing field because they're all suffering the downstream effects of this lack of trust because of what's happened with Google and Facebook."
And finally, what actually happened during Eisenstat's short six-month stint last year as the global head of Facebook's Elections Integrity Ops, part of the company's business integrity unit?
"On day two they took away my title," she says. "They took away my ability to hire the team they said I would get and they cut me off from attending any high-level meetings. I learned a lot.
"It was interesting but I was just not empowered to do what I was actually brought there to do. One thing that was never, ever discussed while I was there, or I heard discussed, was the fundamental problem, which is the business model.
"We talked about all the whack-a-mole approaches, all the pages you might take down, all the different ways to build in disclosure around political advertising. But never what was the actual underlying systemic problem driving all this."