A founding member of the Advisory Group for the Alannah & Madeline Foundation’s eSmart Media Literacy Lab also sits on an independent committee overseeing a voluntary code of conduct for social media and technology companies aimed at reducing online misinformation and disinformation.
Dr Anne Kruger is the Asia Pacific director for First Draft, a global not-for-profit whose mission is to protect communities from harmful misinformation.
She first became involved in the creation of the voluntary Australian Code of Practice on Disinformation and Misinformation as an external advisor to DIGI – a not-for-profit industry association advocating for the digital sector in Australia.
The code has been adopted by Facebook, Google, Twitter, Apple, Microsoft, Adobe, TikTok and Redbubble. It requires signatories to tell users what measures they have in place to stop the spread of misinformation and disinformation on their services, and to provide annual transparency reports detailing their efforts.
DIGI has appointed an independent complaints sub-committee to resolve public complaints about possible breaches of the code.
Dr Kruger is a member, alongside Victoria Rubensohn AM, consumer director at Communications Compliance, and consumer advocate Christopher Zinn.
She first became involved when she wrote a background paper to the code, giving the Australian context to the key issues of misinformation and disinformation.
“As part of this, I worked with experts who understood human rights, the types of minority groups that are often targeted, and academic researchers who understand data, journalism and algorithms,” Dr Kruger said.
The code was developed in consultation with the Australian Communications and Media Authority (ACMA), and with the public through a public consultation process.
“The next stage was the governance and what the code encompassed, which is the independent oversight for the public to report breaches by signatories of their code commitments – in other words, a public complaints facility,” she said.
Any complaints from the public must first be reviewed by DIGI, then by Dr Kruger and her colleagues.
If it’s decided that a platform hasn’t complied with the code, consequences range from public statements and investigations to withdrawing a company from the code.
“However, it’s in the platforms’ best interest to show they’re working with stakeholders on solutions to keep society safe – and they can do this through making the most of the code,” Dr Kruger said.
Under the code, misinformation is defined as false or misleading information that is likely to cause harm, while disinformation is false or misleading information distributed by users through spam, bots or other manipulative bulk behaviours.
The Foundation’s Media Literacy Lab is a way to help young people learn about misinformation and disinformation, Dr Kruger said.
“I don’t know if you can fully eradicate misinformation or disinformation, but you can get faster at reacting to it.
“It’s a long process to educate about media literacy, which is why I’m loving the Media Literacy Lab. This current generation is essential to bringing about change.”
Misinformation is damaging communities around the world, she said.
“The Media Literacy Lab is a crucial component to empowering our young people with knowledge and tools to build resilience to harmful, false or misleading information.
“This regulatory code is not enough. We’re always going to need a multi-pronged approach to ensuring our children and generations after can decipher what are the facts and how to find legitimate sources.”