News Plus 4 Feb 2025 - 6 min read

Clickbait hate vs consequences: Feds move to hold digital giants accountable as ad-driven algorithms fuel harms while sextortion and deepfakes soar

By Andrew Birmingham - Martech | Ecom | CX Editor

Tech giants in the frame for multibillion-dollar fines as the incidence of online harms soars.

A landmark report by Delia Rickard puts Big Tech squarely in the crosshairs, demanding platforms be held accountable for the online harms they facilitate – and rein in the algorithms that pump ad takings by driving users towards increasingly extreme content. The government agrees and plans to treat the platforms much as occupational health and safety law treats employers. That means they will bear a legal duty of care and be forced to prioritise safety by design, conduct real risk assessments and, if they fail, face financial penalties even the trillion-dollar titans of Silicon Valley will baulk at. With harsher penalties, increased transparency, and oversight mechanisms that actually bite, the era of digital platforms being treated as exceptions and operating with impunity might finally be ending.

What you need to know:

  • A report by senior public servant Delia Rickard outlines 67 recommendations to enhance online safety laws, including a new Digital Duty of Care for platforms.
  • According to the review, current Australian laws fail to adequately protect individuals from online harms, prompting the Federal government to propose shifting responsibility onto online platforms.
  • The proposed changes include stricter penalties for non-compliance, with a suggested increase in maximum penalties to the greater of five per cent of global annual turnover or $50 million.
  • Current maximum penalties are dismissed as "parking tickets", with a comparison made to large tech companies' market capitalisations exceeding a trillion dollars.
  • The report identifies significant issues related to online harm, such as child exploitation, bullying, and the promotion of harmful content, and while it does not accuse platforms of profiting from hate speech, it still suggests ways to mitigate the risk of them doing so "inadvertently".
  • Recommendations also focus on the need for transparency in digital advertising practices and alignment with international safety measures.
  • A proposed Digital Platforms Ombudsman scheme aims to address consumer issues related to digital advertising and ensure compliance from major platforms operating in Australia.

Stronger maximum penalties are needed to create a persuasive deterrent, especially for those online services that are among the richest global corporations in the world. Should new obligations be placed on services under a duty of care, appropriate and persuasive penalties must be in place.

Delia Rickard, Author, Report of the Statutory Review of the Online Safety Act 2021

There's nothing exceptional about Big Tech except perhaps the scale of harm its services have visited upon the community, according to a Federal Government report into online safety, which suggests online harms should be treated like any other occupational health and safety issue – except with potentially billion-dollar fines.

Current laws cannot protect vulnerable Australians from online harms, and the Feds are recommending the burden of responsibility be shifted from individuals to online platforms. That includes the adoption of a singular, overarching duty of care that takes into account due diligence, safety by design principles, risk assessments, mitigation, and measurement to help prevent foreseeable harm.

The report is authored by Delia Rickard, a senior public servant, and is a statutory review of the Online Safety Act 2021.

The report proposes 67 recommendations to enhance Australia’s online safety laws, including reforms to existing complaint schemes for victims of online harm, greater transparency obligations for digital platforms, adjustments to the governance structure of the Office of the eSafety Commissioner, and stricter penalties for non-compliant online services.

According to Michelle Rowland, the Minister for Communications, who has been sitting on the report since October, the Government has already committed to legislating a Digital Duty of Care. This approach will put the legal onus on platforms to keep users safe and help prevent online harms, she said.

"Our Government has been proactive in ensuring our legislative framework remains fit for purpose. That’s why we’ve wasted no time in committing to legislate a Digital Duty of Care to place the onus on online services to keep their users safe.

"We are committed to strengthening our online safety laws to protect Australians – particularly young Australians.

Serious penalties

The proposed penalties are much harsher than the current regime, which Rickard describes as relatively mild.

"Stronger maximum penalties are needed to create a persuasive deterrent, especially for those online services that are among the richest global corporations in the world. Should new obligations be placed on services under a duty of care, appropriate and persuasive penalties must be in place."

Under existing law, the maximum penalty is $782,500. 

"Technology companies like Apple, Microsoft, Amazon, Alphabet (Google), and Meta all have market capitalisations in excess of $1 trillion USD – and over $3 trillion in Apple and Microsoft’s case. By way of comparison, Australia’s reported GDP in April 2024 was US$1.7 trillion. It is easy to see that for these and many other companies regulated by the Act, a penalty of $782,500 would barely be considered a 'parking ticket' – and potentially even far less of a deterrent than one."

Rickard recommends that the maximum penalty be increased to the greater of five per cent of global annual turnover or $50 million. For example, for Meta, based on its 2024 results, that would mean a maximum potential penalty of US$8bn.
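To put that formula in perspective: assuming the widely reported figure of roughly US$164 billion for Meta's full-year 2024 revenue, five per cent works out at a little over US$8 billion – well clear of the alternative $50 million floor, which is why the turnover-based limb would be the one that bites for companies of that scale.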

The days of treating digital services as exceptions are over if Rickard's review prevails, and she says the online duty of care should be thought of in just the same way as other work health and safety legislation, which has stood the test of time.

Enormous problems

While the digital industry may not be exceptional in Rickard's eyes, the harm it brings certainly is. According to her report, the online world has "brought enormous problems that are causing our society and individuals huge harms. These include the proliferation of child sexual exploitation and abuse material and the bullying and abuse of individuals and groups."

Rickard flags issues such as "the promotion of terrorism, ever-increasing misogyny, people withdrawing from public life for fear of abuse, and the constant promotion of beauty standards that are unattainable for most of us with resulting disordered eating."

"Mental health issues are on the rise, many are falling prey to the addictive features of services, image-based abuse and deepfakes are proliferating and matters that are outside of the scope of this review, such as disinformation, threats to our democracy and the proliferation of scams, are wreaking havoc globally."

The review contrasts starkly with the U-turns now being plotted by some of the biggest tech platforms towards a more aggressively libertarian approach to online content. Meta, for instance, recently made significant changes to its fact-checking program, following the template set by Twitter, now called X, on cost-cutting and reduced moderation.

Advertising impact

There are some significant implications for advertisers.

In keeping with broader privacy and online safety reforms internationally in recent years, the review calls for greater transparency regarding digital platforms’ use of algorithms and profiling to target advertisements.

It also notes the misalignment between platforms’ financial incentives (driving user engagement for advertising revenue) and the need for content moderation, particularly for sensational or harmful content that attracts attention.

"Unfortunately, the natural incentives of online platforms and other industry participants don’t always align with safety. They make their money by keeping people online and exposed to advertising. Disappointingly, it is often the sensational and extreme content that drives attention and keeps people online so the incentives for content moderation and limiting the time spent online just aren’t strong enough."

"This needs to change," she says.

Among the report's other ad-related recommendations:

  • Platforms should incorporate safety by design principles into their advertising systems, including risk assessments for how ads may contribute to online harm (e.g. promotion of harmful practices or targeting vulnerable groups).
  • Entities with significant reach or risk should publish annual transparency reports, which should include insights into ad practices, content moderation, and enforcement of terms of service.
  • While not directly accusing any platform of intentionally monetising hate speech, the report notes that platform design can influence the nature of online communications by "favouring incendiary or extreme content" and changes in the law should help mitigate the monetisation of hate speech and other harmful content through advertising.
  • Prohibiting harmful content in advertising – which can remain online for too long under existing laws – including ads promoting harmful practices such as disordered eating or self-harm, with proactive moderation required.
  • Encouraging Australia to align its advertising-related safety measures with international frameworks like the EU’s Digital Services Act, which includes provisions for transparency and accountability in ad targeting.
  • Creating an Ombudsman scheme to cover disputes related to digital platforms’ advertising systems, ensuring advertisers and affected individuals can seek redress for harmful ad practices.

"If Australia is to set up a Digital Platforms Ombuds scheme it should cover all the types of consumer issues dealt with in the ACCC’s work on digital platforms (noting though that the Government is proposing that scams be dealt with by the Australian Financial Complaints Authority so that the respective roles of telcos, platforms and financial services can be considered together) as well as those covered in this report.

At a minimum, she says, the scheme should require membership of the platforms with the highest reach in Australia. "Given that these platforms are almost all based overseas, a licensing requirement would better ensure that services complied with this requirement."

