Meta to Combat Misinformation Ahead of Australia's Federal Elections

  • Writer: paolo bibat
  • Mar 19
  • 2 min read

Meta Platforms, the parent company of Facebook and Instagram, has announced new measures to combat misinformation and deepfakes ahead of Australia's federal election, set to take place by May 2025.


The company aims to address concerns about election integrity by bolstering its independent fact-checking program in the country and implementing strict policies to curb misleading content.


In a blog post, Meta outlined its strategy to remove content that poses risks of violence, physical harm, or interference with voting processes. Cheryl Seeto, Meta's Head of Policy in Australia, emphasized that flagged content will be labeled with warnings and have its visibility reduced across feeds to limit its reach.


"When content is debunked by fact-checkers, we attach warning labels to it and reduce its distribution in Feed and Explore so it is less likely to be seen," Seeto said.


Meta has partnered with Agence France-Presse (AFP) and the Australian Associated Press (AAP) to review content for accuracy. These organizations will play a critical role in identifying false information and ensuring compliance with Meta's misinformation policies.


In addition to tackling fake news, Meta is intensifying efforts to combat deepfakes—AI-generated videos, images, or audio designed to mimic real content. Deepfake material that violates Meta's guidelines will either be removed or labeled as "altered" before being ranked lower in users' feeds.


Users sharing AI-generated content will also be prompted to disclose its origins. "For content that doesn't violate our policies, we still believe it's important for people to know when photorealistic content they're seeing has been created using AI," Seeto added.


The measures come as opinion polls show a tightly contested election in Australia, with the opposition Liberal-National coalition holding a narrow lead over the ruling Labor Party. Meta’s approach in Australia mirrors its efforts during recent elections in India, the United Kingdom, and the United States, where it implemented similar safeguards against misinformation.


However, these actions are taking place against a backdrop of regulatory challenges for Meta in Australia. The government is considering imposing a levy on major tech companies like Meta to compensate local news publishers for the advertising revenue generated when their content is shared on the platforms.


Additionally, social media platforms will be required to enforce a ban on users under 16 by the end of 2025, with consultations ongoing on how these restrictions will be implemented.


Meta’s renewed focus on misinformation in Australia follows its controversial decision earlier this year to end third-party fact-checking programs in the United States. Instead, the company introduced a user-driven "Community Notes" system inspired by X (formerly Twitter).


While this move was framed as an effort to promote free expression and reduce censorship, critics have raised concerns about its potential impact on election integrity globally.


As Australia prepares for its upcoming election, Meta's enhanced safeguards aim to strike a balance between free expression and accountability.


Whether these measures will effectively curb misinformation remains to be seen, but they highlight the growing role of social media platforms in shaping public discourse during critical democratic events.