What began as a taskforce in 2015 focused on protecting children from cyberbullying has evolved into the world’s first government agency dedicated to online safety.
Led by Julie Inman Grant, the eSafety Commissioner is Australia’s independent regulator for online harm.
Its role is to implement and enforce online safety laws, investigate complaints and hold digital platforms accountable for the content they host.
The agency can compel companies to remove harmful or illegal material, issue formal notices and monitor whether platforms are meeting their obligations. It can also enforce the law through financial penalties. For example, last year the eSafety Commissioner fined Telegram almost $1 million for missing the deadline to respond to a reporting notice.
The eSafety Commissioner also introduced world-first industry standards requiring digital platforms to take proactive steps to reduce the spread of illegal content and age-restricted material.
Most recently, the regulator made headlines for its involvement in creating world-first social media laws.
PROTECTING FUTURE GENERATIONS
In 2025, Australia introduced laws requiring social media users to be at least 16 years old.
The legislation was designed to prevent underage users from accessing platforms such as TikTok, Instagram and Snapchat, placing responsibility on companies to enforce age limits.
The policy aims to protect younger users from potentially harmful social media content and limit the impact of notifications or alerts on sleep, stress levels and attention.
Credit: Adem AY via Unsplash
The move made global headlines. Other governments are now considering implementing similar laws.
The eSafety Commissioner helped guide policymakers in creating the laws, though the decision was not taken lightly. The agency played an advisory role, drawing on its research capabilities and ongoing consultation to help inform the policy's design.
AI ON THE AGENDA
Artificial intelligence is next under the spotlight.
The eSafety Commissioner has raised concerns about AI-driven companions and chatbots, with research finding that some systems failed to protect children from sexually explicit content and did not refer users expressing thoughts of self-harm or suicide to appropriate support services.
There are also concerns about how AI might affect vulnerable groups, including older Australians, as these systems become more human-like and persuasive.
As the capabilities of digital technology like AI continue to advance, so do the risks.
The eSafety Commissioner’s role is only expected to grow, keeping Australians safe in an evolving online world.