New Phishing Attacks Using ChatGPT to Develop Sophisticated Campaigns

Phishing has long been one of the greatest threats to organizations, growing year after year. Phishing attacks have contributed to 90% of data breaches in recent years, and cybercriminals continue to adapt their techniques, making their attacks increasingly successful.

Zscaler has published a report indicating a 47.2% increase in global phishing attacks. These include smishing (SMS), vishing (VoIP), email, adversary-in-the-middle (AiTM, used to bypass multi-factor authentication), and Phishing-as-a-Service (PhaaS)-based attacks.

Since the COVID-19 pandemic, businesses have adapted to remote working, giving threat actors a much larger attack surface to conduct their criminal activities.

For business purposes, organizations rely on several communication channels, such as email, SMS, and voice calls.

However, cybercriminals target and exploit each of these channels, often resulting in ransomware attacks or data breaches. According to the report, the most targeted industries are:

  1. Education (25.1%)
  2. Finance and insurance (16.6%)
  3. Government (13.8%)
  4. Other (10.5%)
  5. Health Care (8.9%)
  6. Manufacturing (8.8%)
  7. Retail and wholesale (6.4%)
  8. Services (5.7%)
  9. Technology and communications (4.1%)

Attacks on the education industry have increased by a massive 576% compared to 2022, whereas attacks on retail and wholesale have dropped by 67% compared to 2021.

Zscaler stated that its findings are based on the analysis of 280 billion daily transactions and 8 billion blocked attacks.

The most frequently imitated brands include Microsoft (41.4%), OneDrive (23.4%), SharePoint (5.1%), Binance (a crypto exchange, 23.4%), and illegal streaming services (6.7%).

The report also stated that these threat actors have used phishing kits and AI chatbot tools like ChatGPT. Such AI tools are being manipulated into generating sophisticated phishing campaigns, which cybercriminals use to bypass several security measures.
