Trust Is the New AI Battleground
Australia rejects free AI training data and signals a global shift toward provenance, consent and cultural integrity.
Australia rejects unlicensed AI training, marking a global pivot toward accountability.
Creators and First Nations leaders warn of cultural and economic exploitation.
Trust and data provenance are becoming the core competitive edge in AI.
The Real AI Contest Is Trust vs Exploitation
Australia has drawn a line.
The Albanese Government has refused to create a copyright exemption for AI training, ending the era of free unlicensed data for model builders. The message is clear. The future of AI belongs to companies that can prove the integrity of their datasets.
This story is not about compute. It is about confidence. It is about cultural custodianship. And it is about something older than technology: reciprocity.
In New Zealand, when someone does the right thing even when it is difficult, you give them a chocolate fish. A small symbol of earned trust.
Trust is the real terrain of modern AI.
The Mirror Moment
Imagine an AI company quietly mining your internal systems: strategy decks, customer analytics, roadmaps, private Slack.
They release a model that mirrors your language, anticipates your decisions and erodes your edge.
You would not call it innovation.
You would call your legal counsel.
Now replace corporate data with creative data. Music, books, photography, journalism and Indigenous cultural material.
That is what has been happening at scale.
And Australia finally said no.
The Government Says No
In October 2025, Attorney-General Michelle Rowland confirmed:
“…we have ruled out a text and data mining exception, and that’s to provide certainty for Australian creators.”
A short line. A global signal.
Creative work is not free input for model training. It is owned, valuable and protected.
Culture and Compliance
At the Senate hearing on 30 September 2025, creators spoke with striking clarity about what is at stake.
Musician and writer Adam Briggs asked the question at the heart of the debate:
“Why is it a radical notion that artists should be compensated for their work?”
“It would be an issue for them if someone wanted to steal their intellectual property. The hypocrisy is hard to fathom.”
Songwriter Holly Rankin exposed the logic behind demands for AI training exemptions:
“It’s like saying it’s an issue that I have to pay rent… It’s saying: ‘Your value is an issue to us.’”
She also issued a warning that has global relevance:
“A text and data mining exception would apply to all First Nations culture… You would be giving free, unfettered access… It’s an abominable proposition.”
This is not resistance to technology.
It is a call for consent, parity and respect.
The Theft We Pretend Not To See
Committee chair Senator Sarah Hanson-Young drew attention to the ongoing pattern of cultural appropriation:
“We know there’s been a long history of stealing Aboriginal generated work… souvenirs that claim to be made by Indigenous communities that are made in China or Indonesia.”
If that is unacceptable in a gift shop, it is unacceptable in a dataset.
This is the mirror moment.
If companies would not tolerate their internal data being scraped to train an AI model, why should creative workers tolerate it for theirs?
A Billion Dollar Warning
In the United States, the consequences have already arrived.
Anthropic agreed to pay US$1.5 billion to settle claims that it trained models on roughly 500,000 pirated books. Authors will receive about US$3,000 per work.
Reuters reported the outcome:
Anthropic must “destroy downloaded copies” of the copyrighted books.
And the lawyers representing authors issued a message, reported by Reuters:
“This settlement sends a powerful message… that taking copyrighted works from these pirate websites is wrong.”
This is not a creative quarrel.
It is supply chain governance.
It is data integrity.
Authenticity Is the Only Real Edge
AI can mimic tone. It can approximate style. But, as Briggs reminded the Senate, it cannot reproduce lived experience.
“AI [doesn’t] understand what a lounge room in Shepparton… smells like. It’s the innate, human quality of the art and the authenticity that we strive to create.”
On what is lost when AI imitates creators:
“Big tech can’t replicate that… you miss out on inspiring a whole new generation of young artists.”
Authenticity is not nostalgic.
It is a competitive advantage that cannot be stolen or scraped.
Cultural Custodianship Is Data Integrity
Leah Flanagan, Director of the National Aboriginal and Torres Strait Islander Music Office (NATSIMO), explained why Indigenous Cultural and Intellectual Property presents unique vulnerabilities:
“We have recognised that Indigenous cultural and intellectual property has a value… What are we actually doing to ensure that there is some form of universality?”
She also warned about how language can be misused:
“When that word translates over to a copyright perspective, it takes on the meaning of public domain and royalty free.”
Creative data is not neutral.
It carries identity, context and meaning. AI systems do not understand these concepts, but they can erase them.
From Chaos to Architecture
The future of AI is not endless conflict.
It is infrastructure.
Australia already operates collective licensing systems through APRA AMCOS, Screenrights and the Copyright Agency. Extending these frameworks to AI training could turn uncertainty into clarity and conflict into contract.
Predictable licensing.
Micro royalties.
Traceable provenance.
A supply chain of trust.
Boards are already asking vendors to prove training data origins. Soon this will be mandatory.
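What a "supply chain of trust" might look like in practice can be sketched as a machine-readable provenance manifest. This is a minimal illustration, not an existing standard: the record fields (work_id, rights_holder, licence, consent_recorded) and the example entries are all hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProvenanceRecord:
    """One entry in a hypothetical training-data provenance manifest."""
    work_id: str           # identifier for the creative work
    rights_holder: str     # who owns or administers the work
    licence: str           # licence permitting use in model training
    consent_recorded: bool # was explicit consent obtained?

def unlicensed_works(manifest: list[ProvenanceRecord]) -> list[str]:
    """Return the IDs of works the vendor cannot account for."""
    return [r.work_id for r in manifest
            if not r.licence or not r.consent_recorded]

# Illustrative manifest: one licensed track, one scraped work
# with no consent trail.
manifest = [
    ProvenanceRecord("track-001", "APRA AMCOS member",
                     "AI-training licence", True),
    ProvenanceRecord("book-042", "unknown", "", False),
]
print(unlicensed_works(manifest))  # → ['book-042']
```

A manifest like this is what a board-level "prove your training data origins" request ultimately reduces to: every work either has a licence and a consent trail, or it is flagged.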
What Leaders Should Do
Ask for verification.
Require training data provenance from every AI vendor.
Price creativity correctly.
Treat licensed data the same way you treat cloud usage and compute.
Rewrite contracts.
Demand warranties and indemnities for unlicensed materials.
Build defensible systems.
Assume your datasets will be examined by regulators, courts or customers.
Trust is becoming a market advantage.
Perspective
AI does not just learn from us. It learns about us.
Every dataset reveals what we value and what we are willing to take without permission.
Australia’s policy shift and the Anthropic settlement mark the beginning of a global transition from extraction to accountability.
In that world, the winners will not be the largest or the fastest.
They will be those who earn trust.
Just like the chocolate fish.
It is never free. It is given when someone does the right thing.
References
Brittain, B. (2025, September 5). Anthropic agrees to pay $1.5 billion to settle author class action. Reuters. https://www.reuters.com/sustainability/boards-policy-regulation/anthropic-agrees-pay-15-billion-settle-author-class-action-2025-09-05/
Brittain, B. (2025, September 9). Anthropic’s $1.5 billion copyright settlement faces judge’s scrutiny. Reuters. https://www.reuters.com/legal/government/anthropics-15-billion-copyright-settlement-faces-judges-scrutiny-2025-09-09/
Rowland, M. (2025, October 27). Questions without notice – Artificial intelligence [Transcript]. Attorney-General's portfolio, Australian Government. https://ministers.ag.gov.au/media-centre/transcripts/questions-without-notice-artificial-intelligence-27-10-2025
Albanese Government. (2025, October 26). Albanese Government to ensure Australia is prepared for future copyright challenges emerging from AI [Media release]. https://ministers.ag.gov.au/media-centre/albanese-government-ensure-australia-prepared-future-copyright-challenges-emerging-ai-26-10-2025
Sadler, D. (2025, October 27). Govt rules out AI copyright exemption for tech giants. Information Age. https://ia.acs.org.au/article/2025/govt-rules-out-ai-copyright-exemption-for-tech-giants.html
Truu, M. (2025, October 26). Federal government rules out changing copyright law to give AI companies free rein. ABC News. https://www.abc.net.au/news/2025-10-27/labor-rules-out-ai-training-copyright-exceptions/105935740