Charlotte Wilkes

Misinformation threatens upcoming election: lessons from the Voice Referendum

As Australia prepares for next year’s federal election, fact-checking groups are bracing for a surge of misinformation similar to that seen during the 2023 Voice referendum, which was defeated one year ago today.


A social media post flagged as "False Information" by Meta. Photo credit: Wachiwit - stock.adobe.com

The referendum, which sought to enshrine an Indigenous Voice to Parliament in the Constitution, was marred by a surge of disinformation, much of which targeted both the potential impact of the Voice and the electoral process itself.


Experts warn that the upcoming election could face similar, if not greater, challenges.


Sushi Das, Associate Director of RMIT FactLab, a leading fact-checking organisation in Australia, described the wave of misinformation encountered during the referendum as unprecedented in its scale and impact.


"Last year, we fact-checked mostly about the referendum. We saw a significant spike in false misinformation,” said Das.


"When we were looking at the Voice misinformation, it fell into two categories: disinformation about the impact the Voice would have, and disinformation about the electoral process", she said.


According to Das, misinformation about the Voice referendum included false claims that the Voice would grant special rights to Indigenous people, that it would introduce a third chamber of Parliament for Indigenous peoples, and that the Uluru Statement from the Heart was 26 pages long and contained hidden policies.


These claims were false yet spread widely, fuelled by social media and other platforms.


The disinformation was not only misleading but also damaging “to democratic processes,” Das said.


"It erodes trust in the voting system. It was harmful to social cohesiveness because people were pitted against each other based on inaccurate information, fuelling a polarised national debate", she said.


The disinformation was not always aimed at swaying voters in one direction but was often designed to create confusion and fear.


"Disinformation doesn't always aim to make people behave in a certain way. Sometimes, it simply aims to cause uncertainty, confusion, and chaos," said Das.


"This fear and uncertainty feed into the idea of a post-truth world, where people rely more on what they emotionally feel rather than on factual, evidence-based information," she said.


With the federal election fast approaching, the fact-checking community is anticipating similar tactics, especially in the realm of electoral disinformation.


Gordon Farrer, RMIT University Lecturer and Chief Investigator for the RMIT ABC Fact Check unit, highlighted this concern: "We expect to see more about electoral processes. In the last election, we saw misinformation spread about voting, and the Australian Electoral Commission (AEC) had to tackle it in real time."


False claims during the Voice referendum, such as the suggestion that not voting would automatically count as a "yes" vote, or that the referendum itself was constitutionally illegal, contributed to widespread confusion and distrust.


More extreme theories, like the baseless claim that American Dominion voting machines would be used to rig the results, reflected a broader global trend of electoral disinformation.

Farrer believes that this kind of misinformation poses a direct threat to democracy.


"It helps people become more fearful or less trustful of the voting system itself and of democracy in general," he said.


He also expressed concern over the role of social media in amplifying disinformation, noting that Meta's platforms can inadvertently contribute to the spread of harmful content.


RMIT FactLab has been working in partnership with Meta since 2022, focusing on identifying and fact-checking false information circulating on social media.


Das explained that the process involves a combination of traditional journalistic skills and digital verification tools.


Once a piece of content is fact-checked, Meta uses its technology to label the content as false and algorithmically down-rank it, making it less likely to be seen by users.


"Meta might blur out the post, but they do not remove it," she said.


While these measures help reduce the spread of false information, Farrer believes that more needs to be done.


"Even if Meta down-ranks it, it doesn't necessarily mean the correction will be seen by the people who initially saw the misinformation. And even if it is seen, many people stick to their original beliefs," he said.


Despite these challenges, RMIT FactLab and other fact-checking bodies remain committed to their work.


They are already preparing for the “onslaught” of misinformation expected in the lead-up to the federal election.


Farrer emphasised the need for fact-checkers to continue putting accurate information into the public domain while encouraging journalists to investigate and expose the networks behind disinformation campaigns.


"There is a big role for journalists to explain the nature of the ecosystem of misinformation and disinformation — how it works, who's behind it," Farrer said.


"This helps inoculate people from being sucked in by it", he said.


As Australia moves toward another major electoral event, the battle against misinformation will undoubtedly intensify.


Fact-checkers, journalists, and responsible media outlets will play a crucial role in defending the truth in an era increasingly defined by disinformation.

 



