ChatGPT and other mainstream LLMs sparked a revolution in generative AI this year. But their built-in safeguards against misuse left an opening for alternative LLMs designed specifically to facilitate cyberattacks. Tools like WormGPT and FraudGPT emerged on the dark web, offering AI-powered capabilities to automate phishing, gather intelligence on victims, and generate malware. By producing persuasive phishing emails and custom malicious code on demand, these tools lower the barrier for unsophisticated hackers to launch attacks. Here's my article on SecurityIntelligence with everything you need to know.