ChatGPT scams are multiplying: how can you recognize them?

As soon as a technology gains momentum, malicious actors attempt to misuse it. While malicious uses were already highlighted during the emergence of NFTs and the metaverse, it is now artificial intelligence, and in particular ChatGPT, that serves as a privileged playground for scammers. Here is an overview of the various scams related to ChatGPT.

Artificial intelligence used for malicious purposes

Phishing campaigns generated using ChatGPT

Cybersecurity researchers at Norton recently published a report on the cyber risks related to ChatGPT. The experts note the use of the chatbot in phishing campaigns: because the AI can write well-structured text in many languages, it can be used to polish the content of phishing emails, which are currently easy to recognize to a trained eye.

Code capabilities used for malware creation

According to Norton researchers, ChatGPT’s coding skills could also be used to create more sophisticated malware:

Some programming languages are rarely used to create malware, and it’s easy to use ChatGPT to “translate” source code from one language to a less common one.

This ability of the chatbot to transfer code from one language to another could then make it possible to bypass certain layers of antivirus protection.

Combining ChatGPT with other AIs

Text from ChatGPT can be combined with other AI-powered tools to produce particularly persuasive misleading content. Recently, videos made with AI video generators (such as Synthesia), featuring realistic avatars, have been posted on YouTube. These videos, presented as tutorials for cracking paid software (including the Adobe suite), actually linked to malware. While this practice is not new, the use of artificial intelligence makes it harder to detect.

Premiere Pro Crack Tutorial
These fake tutorials feature an avatar whose lip movements are synchronized with the text. © BDM screenshot

ChatGPT name impersonation

Fake apps and extensions

Many scams also take advantage of ChatGPT’s strong popularity by impersonating its name. A fake Chrome extension recently made headlines: named Quick access to ChatGPT, it promised a shortcut to the chatbot but was actually used to hijack Facebook accounts. It then allowed hackers to run advertisements promoting the extension itself in order to spread, sometimes using the ad credits of compromised company accounts. The extension has since been taken down, after reaching over 2,000 downloads per day. In addition, a multitude of applications using the name ChatGPT can be found on the app stores, even though OpenAI has not yet released any official app.

ChatGPT Extensions
None of these applications is affiliated with OpenAI. © BDM montage

Cryptocurrency scams

Over the past few months, hundreds of cryptocurrency projects using the name ChatGPT – or Bing ChatGPT – have appeared on the Binance, Ethereum, and Arbitrum blockchains to fool investors wanting to ride the trend. Even more elaborate: a fake chatbot posing as ChatGPT has also appeared, dangling before users the prospect of AI-powered cryptocurrency investments. The goal of the operation? To gain access to the target’s bank account.

How to spot scams using ChatGPT?

In the future, it may become increasingly difficult to spot AI-powered scams, which is why we will need to redouble our vigilance. If you have doubts about a piece of text, tools already exist to detect content generated by ChatGPT, and others could emerge to help detect AI-generated images or deepfakes.

As for tools using the name ChatGPT, be sure to verify with a quick search that they are actually developed by OpenAI, and avoid using them if they are not.