Enforcement, class actions incoming: $32bn REA Group hedges against privacy reform twists as industry undercooks twin threat

REA privacy lead Andrea Farrell sees two rather large problems about to hit: "Enforcement via the Office of the Australian Information Commissioner, but also, law firms could start identifying holes and launch class actions."
The media industry has learned to live with fear, uncertainty and doubt sown by Google rug pulls, regulatory half-measures and a belief that lawmakers won’t follow through with hard privacy rhetoric. But REA Group is taking no chances – and reckons the ACCC, OAIC and tooled-up advocacy groups are on the money when it comes to consumer privacy concerns. Whether or not Australia’s lawmakers go the whole hog on privacy post-election, the Murdoch-controlled firm is bidding for clean-skin status, or as close as commercially possible – for two good reasons – and its efforts have not gone unnoticed by Privacy Commissioner Carly Kind.
What you need to know:
- Amid industry scepticism that lawmakers will follow through with the hardest-hitting Privacy Act reforms, REA Group is taking no chances.
- The Murdoch-controlled group is hedging against regulators testing the boundaries of existing powers and enforcing them more vigorously – and against tooled-up consumer-lawyer class actions.
- REA has rewritten its privacy policy so that an 11-year-old can understand what’s being done with their data, how it is being used for personalisation, and how its automated decisioning works.
- The group claims it has made it really easy to delete everything – which removes the kind of risk that wiped $1.8bn off Medibank’s value overnight.
- REA's proactive approach has not gone unnoticed by Privacy Commissioner Carly Kind, who's now intent on seeing just how far existing Privacy Act legislation can be applied.
Long view
Channelling Sigourney Weaver in James Cameron’s Aliens (“I say we take off and nuke the entire site from orbit. It's the only way to be sure”), $32bn property platform REA Group has rewritten its privacy policies so that the average year seven student can understand what is being done with their data and how the stuff they subsequently see is personalised – and it has gone maxi on data retention minimisation.
It's basically aiming for privacy gold standard by default, regardless of how far the current or next government decides to take the raft of proposals tabled over the last few years as Australia belatedly updates its Privacy Act for the digital age.
REA therefore stands a far better chance of proving that user consent, which happens up front when users hit the site, is informed – i.e. that people know what they are agreeing to – per head of product, personalisation & privacy, Andrea Farrell.
Informed consent is critical: unless people understand what they are agreeing to – and have it explained to them in very simple terms – companies cannot legitimately claim users have knowingly agreed to have their personal information used for many of the practices marketers now take for granted.
"We’re making sure that we are capturing the right metadata and consent signals, which then flow through that whole personalisation experience," said Farrell.
Which means everything after that should be compliant under a beefed-up Privacy Act. Or at least defensible.
Meanwhile, it means REA’s millions of users feel a little more at ease about the data and intent signals they are sharing.
"Transparency builds trust,” said Farrell. “We want to be very transparent with users about the information that we capture and provide them with a strong value exchange."
Hacked off
That trust is easily lost: Farrell has been heading personalisation and privacy at REA for three years. A few months into the gig, Medibank, an insurer with sensitive data on 9.7 million Australians, got hacked and held to ransom.
A year earlier, consumer research had underlined that more than 90 per cent of Australians were uncomfortable with how their data was collected and shared en masse by brands and the digital economy – a finding that has remained consistent.
"Despite regulation being slow, community expectations were high," said Farrell.
The 2022 breach concentrated corporate minds. Medibank’s share price tanked nearly 20 per cent overnight, a $1.8bn hit. It also provided a sharp wake-up call to consumers conditioned to handing over personal data without much thought: having been uncomfortable with data collection practices but clicking ‘accept’ anyway, they scrambled to put the genie back in the bottle.
"When the Medibank data breach occurred, the number of data deletion requests that we received just skyrocketed, whether that was app deletion or requests to delete their personal information, maybe in a rental profile or something like that," Farrell told a privacy forum hosted by customer engagement platform, Braze.
Farrell said REA immediately reviewed its data privacy policies – and hasn’t looked back. While much of the industry remains sceptical that Australia’s lawmakers will follow through with the more stringent aspects of the proposed Privacy Act reforms – and even Privacy Commissioner Carly Kind has previously expressed dismay at the watered-down first tranche legislation laid before parliament last September – REA is preparing for tranche two. Or for the Commissioner to start pushing the boundaries under existing powers, which Kind has indicated she is now preparing to do.
Which for REA has meant broadening data privacy work beyond the compliance department.
"We created a centre of excellence where we centralised the ownership of the platform, the people, the process and the technology to ensure that we were getting the most out of the platform and to encourage others to join, so that it was easier to convince them to join and deprecate other products," per Farrell.
For consumers, “It clearly explains our AI, what our algorithms do, and automated decision-making,” she added, while cross-departmental ownership also reduced the risk of siloed, non-compliant data practices lurking in the shadows. "One of the updates to the privacy law changes is transparency around automated decision-making... We believe we’ve done that."
Since then, and with AI use surging, REA has also developed robust governance for the third-party AI tools being plugged into the platform. In short, said Farrell, “When we use third-party tools, we avoid sharing PII [personally identifiable information]... There’s a very strict due diligence process around any adoption."
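Farrell didn’t go into the mechanics, but the pattern she describes – keeping PII out of anything sent to an external tool – might look roughly like the hypothetical TypeScript sketch below. The field names and redaction rules are assumptions for illustration, not REA’s actual due diligence process.

```typescript
// Hypothetical enquiry record containing PII alongside usable context.
interface Enquiry {
  name: string;
  email: string;
  phone: string;
  message: string;
  suburb: string;
}

// What actually leaves the platform: context only, no direct identifiers.
interface RedactedEnquiry {
  message: string;
  suburb: string;
}

// Drop identifier fields and mask obvious PII patterns in free text
// before the payload is sent to any external tool.
function redactForThirdParty(enquiry: Enquiry): RedactedEnquiry {
  const masked = enquiry.message
    .replace(/[\w.+-]+@[\w-]+\.[\w.]+/g, "[email]")  // email addresses
    .replace(/\+?\d[\d\s-]{7,}\d/g, "[phone]");      // phone-like numbers
  return { message: masked, suburb: enquiry.suburb };
}
```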
Delete > deleterious
If people want full erasure, no problem, added Farrell.
"We want users to feel that they can delete the app, delete their data, and associated data will be deleted at the same time."
Which appears to be a win for customers, and a win for REA’s compliance team.
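REA hasn’t detailed how that cascade works under the hood, but the idea Farrell describes – one request wiping the account and everything hanging off it – can be sketched hypothetically. The data stores below are purely illustrative.

```typescript
// Hypothetical in-memory stores keyed by user ID; in production these
// would be separate services and databases, not Maps.
const accounts = new Map<string, object>();
const searchHistory = new Map<string, object[]>();
const rentalProfiles = new Map<string, object>();
const savedProperties = new Map<string, string[]>();

// One deletion request removes the account and every associated record,
// leaving nothing behind to breach.
function deleteEverything(userId: string): void {
  accounts.delete(userId);
  searchHistory.delete(userId);
  rentalProfiles.delete(userId);
  savedProperties.delete(userId);
}
```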
“If you’re not holding it – it’s not a risk … [Regulators] are now drilling down into the existing privacy principles, and they will definitely use their enforcement powers to target, not only social media platforms, but Australian businesses in general,” said Farrell.
“The word on the street is that [the second tranche of Privacy Act reforms] may never happen, which probably means that the Privacy Commissioner is going to go even harder on enforcement with the powers that they [already] have.”
But there are other risks within what has already been laid before parliament.
“There are now two prongs: there's enforcement via the Office of the Australian Information Commissioner, but also, law firms could start identifying holes and launching class actions. So there's two problems," said Farrell.
While many firms are carrying on as normal, seeking regulatory clarity before changing their data collection and trading approach, REA's proactive hedging has been noted.
"It is actually encouraging to hear that from REA," said Privacy Commissioner Kind. "I think that entities are informed by a lot of different factors. One of those was Privacy Act reform, but others continue to be data breaches, public sentiment and public attitudes, and I would hope the actions of us as a regulator as well."