Social media has a social responsibility ... that's why we are making big changes to hold platforms to account for user safety,
This is a landmark reform. We know some kids will find workarounds, but we're sending a message to social media companies to clean up their act,
This is about protecting young people, not punishing or isolating them, and letting parents know that we’re in their corner when it comes to supporting their children’s health and wellbeing,
For too many young Australians, social media can be harmful. Almost two-thirds of 14- to 17-year-old Australians have viewed extremely harmful content online including drug abuse, suicide or self-harm as well as violent material. One quarter have been exposed to content promoting unsafe eating habits,
We are not saying risks don’t exist on messaging apps or online gaming. While users can still be exposed to harmful content by other users, they do not face the same algorithmic curation of content and psychological manipulation to encourage near-endless engagement,
This bill seeks to set a new normative value in society that accessing social media is not the defining feature of growing up in Australia,
There is wide acknowledgement that something must be done in the immediate term to help prevent young teens and children from being exposed to streams of content unfiltered and infinite,
Further, the inclusion of messaging apps could have wider consequences, such as making communication within families harder,
We are very prepared to go through having a process of criteria and seeing how this fits against it,
What people are less keen on is having to go through ID check and verifications to access the internet generally or to do things online generally,
This is not about government mandating any form of technology or demanding any personal information be handed over to social media companies,
I think if people understand the risk and the check is carried out close to that risk, then I think people generally are OK ... We don’t want our children to be exposed to extreme violent video games or to pornography or to suicide material or to things that are going to cause them problems with their mental development such as body dysmorphia and weight loss and stuff like that,
None of these methods is 100% accurate,
The move effectively bumped up the severity of the intimate image abuse sharing offence within the Online Safety Act, so platforms have to be proactive in removing the content and prevent it from appearing in the first place,
What I want to do is look at the evidence,
There are assumptions about the impact [social media] has on children and young people, but there is no firm, peer-reviewed evidence,
They had to proactively demonstrate to our regulator Ofcom that the algorithms would prevent that material going on in the first place. And if an image did appear online it needed to be taken down as fast as reasonably could be expected after being alerted,
The opposition is the only party arguing that people should upload 100 points of ID and give it to TikTok,
It’s one area where you can see that harm is being prevented, rather than actually getting out into society and then us dealing with it afterwards — which is what was happening before,
The legislation places the onus on social media platforms, not parents or children, to ensure protections are in place,