Alert: Scammers Pose as ChatGPT in New Phishing Scam
By Waqas

This phishing scam exploits the popularity of the AI-based ChatGPT chatbot to steal funds and harvest the personal and financial details of users.
**The latest ChatGPT phishing scam is currently targeting users in Ireland, Australia, Germany, Denmark, and the Netherlands.**
Since its launch in November 2022, OpenAI’s AI-based chatbot, ChatGPT, has gripped the world due to its text-generating capabilities. Naturally, threat actors are looking for ways to exploit the immense stardom it has garnered in just four months. The latest discovery indicates one such attempt.
It is worth noting that cybercriminals have been eager to exploit ChatGPT to develop malware and carry out other malicious attacks. At the same time, several fake ChatGPT apps are carrying out invasive data harvesting against Android and iOS users.
If you are interested in using ChatGPT on your Android device, Hackread.com has tested the new “DialogueAI AI Chat Bot App.” This app uses OpenAI’s official ChatGPT API and does not collect any data from the user’s device.
Financial Phishing Scam Featuring Fake Version of ChatGPT
Researchers at cybersecurity vendor Bitdefender have discovered a new phishing scam in which cybercriminals redirect unsuspecting users to a fake version of ChatGPT. The primary targets of this financial scam were found in the following countries:
- Ireland
- Australia
- Germany
- Denmark
- Netherlands
The scam’s modus operandi is straightforward: users receive a scam email containing a link to a version of ChatGPT which is, of course, fake.
The fake and copycat ChatGPT used in the phishing scam (Image: Bitdefender)
According to Bitdefender, this is a “highly sophisticated financial scam” with an innovative lure, since threat actors have previously relied mostly on weekly or monthly subscription-based lures.
How Does the Attack Work?
Bitdefender’s Antispam Labs reports that, in this new scam, the attackers send an unsolicited email with any of the following subject lines (a simple filter sketch follows the list):
- ChatGPT: New AI bot has everyone going crazy about it
- ChatGPT: New AI bot has everyone in shock from it
- New ChatGPTchatbot is making everyone crazy now – but it’ll very soon be as mundane a tool as Google
- Why is all people panic about the ChatGPT bot?
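For readers who filter their own mail, the lure subjects above are straightforward to match on. Below is a minimal sketch, assuming a simple exact-match rule; the subject strings come from the list above, while the function names and normalization step are illustrative assumptions, not part of Bitdefender's tooling.

```python
# Minimal sketch: flag messages whose subject matches one of the known lures.
# Subject strings are quoted from the article; everything else is illustrative.

SCAM_SUBJECTS = [
    "ChatGPT: New AI bot has everyone going crazy about it",
    "ChatGPT: New AI bot has everyone in shock from it",
    "New ChatGPTchatbot is making everyone crazy now – but it’ll very soon be as mundane a tool as Google",
    "Why is all people panic about the ChatGPT bot?",
]

def normalize(subject: str) -> str:
    """Lower-case and collapse whitespace so trivial edits don't evade the match."""
    return " ".join(subject.lower().split())

KNOWN_LURES = {normalize(s) for s in SCAM_SUBJECTS}

def looks_like_chatgpt_lure(subject: str) -> bool:
    """Return True if the subject line matches a known scam lure."""
    return normalize(subject) in KNOWN_LURES

if __name__ == "__main__":
    print(looks_like_chatgpt_lure("ChatGPT: New AI bot has everyone going crazy about it"))  # True
    print(looks_like_chatgpt_lure("Quarterly report attached"))  # False
```

An exact-match rule like this only catches these specific lures; real spam filters combine such signatures with sender-reputation and link-analysis checks.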
The email itself offers few details, and the recipient has to click an embedded link for more information. This link leads to a fake version of the ChatGPT chatbot, where victims are asked to invest at least €250 and enter their banking card details, email address, ID credentials, and phone number.
Victims are then presented with a copycat version of ChatGPT that, unlike the original chatbot, offers only a few pre-determined answers to queries. This chatbot was accessible only through a blacklisted domain (timegaeacom).
What Happens When Victims Access Fake ChatGPT?
Researchers at Bitdefender, which is headquartered in Bucharest, noticed that the fake ChatGPT first offered victims a brief introduction to how it could help them become successful investors. It then asked for their email address for instant verification and their phone number, supposedly to set up a WhatsApp account for the promised investment.
Afterwards, victims receive a call in which scammers claim to represent a London-based firm called Import Capital. Speaking in Romanian, the representative asks victims to invest in crypto and international stocks.
The scammers then press for sensitive financial information, asking for permission to calculate the victims’ and their family members’ median daily salary, their sources of passive income, how many hours they work each day, and whether they are satisfied with their current income.
Victims are then asked to invest €250 and provide the last six digits of a valid ID card. While digging deeper into the scam, Bitdefender researchers requested that a link to the investment portal be sent via email; they received a form, entered a false code, and the investment was declined.
It is worth noting that the domain of Import Capital was never authorized for business in the UK, according to an alert from the Financial Conduct Authority (FCA).
This campaign is quickly expanding to other regions, so users must stay alert. That’s why it is necessary to use ChatGPT only through its official website.
Scammers acting as ChatGPT to offer financial advice (Image: Bitdefender)
Remember, ChatGPT itself does not engage in phishing scams. However, scammers are using the ChatGPT name to try to trick people into revealing personal information or clicking on malicious links. Here are some tips to stay safe:
- Be cautious of unsolicited emails or messages claiming to be from ChatGPT that ask for personal information, such as passwords or bank account numbers. ChatGPT will never ask for this information.
- Check the email address or website URL carefully to make sure they are legitimate. Scammers may create fake email addresses or websites that are similar to ChatGPT’s official ones (a minimal URL check is sketched after this list).
- Do not click on links or download attachments from suspicious emails or messages. They may contain malware or viruses.
- Use anti-virus software and keep it up to date to protect your device from malware and viruses.
- If you suspect that you have received a phishing email or message, do not respond to it or click on any links. Instead, report it to the appropriate authorities, such as your company’s IT department or the Anti-Phishing Working Group.
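As a minimal illustration of the “check the URL carefully” tip above, the sketch below accepts a link only if its hostname belongs to an assumed allow-list of official OpenAI domains; the OFFICIAL_DOMAINS set and the function name are assumptions for illustration, not an authoritative list.

```python
# Minimal sketch: treat a link as trustworthy only if its host is an official
# OpenAI domain (or a subdomain of one). OFFICIAL_DOMAINS is an assumption.
from urllib.parse import urlparse

OFFICIAL_DOMAINS = {"openai.com", "chat.openai.com"}  # assumed official hosts

def is_official_chatgpt_link(url: str) -> bool:
    """Return True only for hosts on the allow-list or their subdomains."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in OFFICIAL_DOMAINS)

if __name__ == "__main__":
    print(is_official_chatgpt_link("https://chat.openai.com/"))          # True
    print(is_official_chatgpt_link("https://chat-openai.example.com/"))  # False: lookalike host
```

Note that lookalike domains can differ by a single character, so an allow-list check like this is safer than trying to spot fakes by eye.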