Warning over fictitious Centrelink payments
Artificial intelligence is being used to generate false notices about ‘bonuses’ from the government agency.
Facebook boss sacks fact-checkers
Meta, the company that owns Facebook, Instagram, Threads, and WhatsApp, is to eliminate fact-checking in favour of a “community notes” system.
Company chief executive Mark Zuckerberg said on 7 January 2025 that the company would stop using independent fact-checkers, beginning in the United States.
Citing “free speech” as a motivation, Zuckerberg said the fact checkers had become “too politically biased” and had “destroyed more trust than they’ve created, especially in the US”.
Instead, Facebook will follow the example of Elon Musk’s X (formerly Twitter) and use “community notes”, where individuals can fact-check or contextualise posts. These will be attached to contentious posts if enough users find them helpful.
Everybody should exercise caution online. Malicious people may use social media, websites, podcasts, and video platforms to spread misinformation, while other users may post or repost false information believing (or wishing) it to be true.
A lot of what appears on social media is opinion rather than fact. As individuals, we need to be discerning and evaluate who we can trust and who may be trying to mislead us.
NSA always strives to be a reliable source of information for older Australians.
If something sounds too good to be true, it often is. Yet unscrupulous publishers are now using artificial intelligence (AI) to trick people into thinking they are going to receive pension bonuses, even modest ones.
At this time of cost-of-living pressure, people are keen to hear about anything that might help their household budgets. Unfortunately, some people are taking advantage of this by spreading misleading information online.
Services Australia warns that there are unofficial websites and social media accounts promoting false information about a “one off payment”, “Centrelink cash relief payment”, or “bonus payment” relating to the cost of living, the Age Pension, or having a concession card. The size of these fabricated payments varies, but amounts mentioned include $750 and $1,800.
This issue has been raised by National Seniors members, and it appears that these fabricated pages are using AI to generate the misleading articles.
These articles aren’t technically scams because they aren’t seeking your money or personal information. Instead, it seems the business model is to create these articles very cheaply and then to make money from advertising.
We have chosen not to link to the websites in question, as doing so could make it more likely that they will appear prominently in search engines.
Much of the focus has been on AI's ability to make up convincing images and video, but how it produces text is just as important.
AI can be “trained” on large amounts of text, including websites. In a similar way to phones suggesting potential next words when we are typing a text message, AI models use statistics to model how words and phrases relate to each other, based on vast amounts of information.
But just because AI has a vast amount of information at its disposal doesn’t mean the text it produces is correct or current.
In one concerning situation in the United States, a lawyer used popular AI chatbot ChatGPT to help write filings to a court. Unfortunately for them, several of the cases cited in their brief didn’t actually exist.
AI can also combine originally true pieces of information in ways that make the result incorrect. For questions such as when certain payments are made, all it takes is for the AI to change a couple of numbers in a date to invent a payment.
For instance, if articles about the $750 COVID-era bonus and the January 2025 indexation of some Centrelink payments are fed into an AI, it could generate an article saying people are getting a $750 bonus in January 2025. If no one checks the information, or no one cares, it can then mislead people online.
AI is also seemingly being used to make misleading videos about Centrelink bonus payments that do not exist. These videos commonly feature polished, almost-lifelike images and advanced editing techniques such as panning and zooming. That is a far cry from the stilted movement, robotic voiceovers, and odd speech patterns, mispronunciations, and phrasing we used to associate with AI.
Search engines are increasingly implementing and supporting AI. In fact, research has indicated that search engines may rank lower-quality AI-generated articles, which could include misinformation, above authoritative sources.
Results vary by search engine. For instance, the top result on Google for “age pension bonus payment January 2025” is the Department of Social Services indexation rates for January 2025, which is an authoritative source.
But the same search on Bing returns as its first result a misleading article claiming that Age Pension rates increased on 1 January 2025, even though Age Pension rates are indexed in March and September, not January.
The website claims that the Age Pension rate for a single person is “$941.10, which includes a base rate of $927.00 and an Energy Supplement of $14.10”, but these are the current transitional rates, not the normal rates that are relevant to most people.
People may not even need to click on a link to be shown misleading information. Bing, for instance, will sometimes show fake information about bonus Centrelink payments in the search results, depending on the search term.
For example, a search on Bing for “age pension bonus payment 2024” currently highlights information about a false $400 pension bonus payment, which Bing places above the websites of Services Australia.
While search engines appear to have somewhat improved the way they display such information, and YouTube has taken down some channels hosting these misleading videos, the situation could worsen as AI technology becomes more powerful.
As with many things, Experience Matters. It is important to be wary of scams and misinformation and stay informed using reputable sources.
Related reading: BBC, Webis.de, The Conversation