News Plus 1 Sep 2021 - 3 min read

Attack of the sex bots: brands and AFL teams hire automated comment ‘security’ to stop trolls and porn bots destroying sales

By Sam Buckingham-Jones - Senior Writer

A wave of sex bots is now competing with trolls to cripple brands' social pages – just as Australian firms pile into social commerce, behind only China and the US. Facebook and Twitter don't want to own the problem, but some tech firms have stepped into the void with technology that neuters the sex bots and leaves the trolls shouting only to themselves, while reportedly boosting ad spend returns. 

 

What you need to know:

  • Brands all over the world saw a wave of sex bots posting "soft porn" links on their social posts about a month ago, causing havoc for social media teams and hurting sales.
  • Sexually explicit material on social posts – as well as overly negative or abusive comments – can have a direct impact on sales, reputation and individuals. 
  • But as Australian brands pile into social commerce, they will find Facebook and Twitter refuse to take ownership of the problem.
  • Some tech firms are stepping into the void, working with big brands, retailers and AFL teams to silence the trolls and neuter the sex bots – trolls do not even know that nobody else can see their comments.
  • One such tool has also reportedly helped improve brand return on ad spend (ROAS) by 34 per cent.

Social security

In a shopping centre, a person yelling racial slurs, abusing staff, or displaying sexually explicit images is rapidly tackled by police or security. Social media is a different story.

Yet social commerce is where the big platforms are heading in a race to catch Amazon. And where the money goes, the sex bots follow.

A recent report from eMarketer found e-commerce sales in Australia grew by 53 per cent year on year. More than 30 per cent of Australians have made a purchase through social media – the third-highest rate in the world, after China and the US.

“Brands can control what happens in their brick-and-mortar stores, but what happens in their digital store – social media has been the Wild West. Two billion people can say whatever they want about your product, about your brand, about your service, your customer service, you name it,” said Erik Swain, president of moderation platform Respondology.

And while most brands say they welcome negative feedback – an open discussion with customers about bad experiences can be important in identifying issues – people feel far more comfortable sharing vitriol online than offline, and that vitriol often has a direct impact on sales.

“The way some clients have fed it back to me is they view social comments as the new product review section, like Amazon review comments,” per Swain. But trolls and bots require increasingly sophisticated solutions.

Respondology has developed The Mod, a platform that filters comments against pre-determined keyword lists – mild swearing, severe swearing, sexual references and LGBTQ references, for example. There are many similar products. But Respondology also employs a team of more than 1,000 human moderators – known as Responders – who decide whether an unflagged comment is appropriate. Those who leave abusive or offensive comments can still see them, but the broader public cannot. The troll rarely realises they have been filtered, Swain said – nobody can hear their screams, and they don't even know it.
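As a rough illustration only – not Respondology's actual implementation – a keyword-list filter with a human-review queue and comments hidden from everyone except their author might be sketched like this in Python (all list contents, names and categories here are placeholder assumptions):

```python
# Illustrative sketch, not Respondology's code. It shows the general idea described above:
# keyword-list filtering, a human-review queue for unflagged comments, and "shadow hiding"
# where a hidden comment stays visible to its author but not to the wider public.
from dataclasses import dataclass

BLOCKLISTS = {
    "severe_swearing": {"exampleword1"},      # placeholder terms – real lists are brand-configured
    "sexual_references": {"exampleword2"},
}

@dataclass
class Comment:
    author_id: str
    text: str
    hidden: bool = False              # hidden from the public, still visible to the author
    flagged_for_review: bool = False  # queued for a human moderator

def moderate(comment: Comment) -> Comment:
    words = set(comment.text.lower().split())
    for category, terms in BLOCKLISTS.items():
        if words & terms:
            comment.hidden = True          # automatic hide on keyword match
            return comment
    comment.flagged_for_review = True      # anything unflagged goes to human moderators
    return comment

def visible_to(viewer_id: str, comment: Comment) -> bool:
    # The author still sees their own comment even when it is hidden from everyone else.
    return (not comment.hidden) or viewer_id == comment.author_id
```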

The firm works with multinational FMCGs, major consumer retailers, and global financial services companies, but the nature of comment moderation means those brands prefer to remain anonymous. Camera company GoPro and blender maker BlendJet, both highly dependent on reviews, are two brands unafraid to be identified. BlendJet reported a 34 per cent increase in return on ad spend (ROAS) after using The Mod in an A/B test on Facebook. The firm also surveyed customers and found 68 per cent read comments before buying on Instagram and Facebook.

Respondology also works with sports teams, including one Australian AFL club, to hide abusive comments from fans and exercise greater control over the team’s brand. Likewise, it works with high-profile individuals to improve their social media experience and prevent abuse from reaching them – a massive problem that has seen some stars quit social media altogether.

Mods and knockers

The field is not static – trolls adapt and the social platforms keep changing – so The Mod has had to evolve in line with the new tactics of online bots and trolls.

About a month ago, Respondology noticed a “new wave of sex bots” hitting clients, all of which commented within a few seconds of an organic post being shared. “Most people don't comment that quickly,” Swain said.

“They looked pretty innocuous, but if you clicked on them, they went out to pretty much soft porn.” So it added a tool to hide comments posted within the first few seconds of a post going live.

“It wiped out their sex bots the next day. And it was a real popular feature.”
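A time-based rule of that kind is simple to sketch. The threshold, function name and timestamps below are illustrative assumptions, not the product's actual logic:

```python
from datetime import datetime, timedelta

# Illustrative sketch only: treat any comment that arrives suspiciously soon after the
# post goes live as a probable bot, since organic commenters rarely respond within seconds.
EARLY_COMMENT_WINDOW = timedelta(seconds=5)   # assumed threshold, configurable per brand

def is_probable_bot(post_published_at: datetime, comment_created_at: datetime) -> bool:
    return comment_created_at - post_published_at <= EARLY_COMMENT_WINDOW

# Example: a comment two seconds after publication would be auto-hidden.
post_time = datetime(2021, 8, 1, 9, 0, 0)
comment_time = datetime(2021, 8, 1, 9, 0, 2)
print(is_probable_bot(post_time, comment_time))  # True
```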

The social platforms could develop these tools on their own, Swain said, but aren’t interested.

“From Mark Zuckerberg down, and Jack Dorsey with Twitter, philosophically, they don't believe that they own the content of their platforms. 'They're not a publisher. They're a platform'. They connect people. What people say is not really their accountability. That’s a debatable point, but that is what Mark Zuckerberg or Jack Dorsey will tell you,” he said.

“They're dealing with political interference. Facebook's dealing with people posting violent videos like live murders, so they have some issues to deal with before they even get to the comments.”

Luckily, others are filling the void, leaving the trolls and the sex bots with just each other for company.
