10 Ways North Korea is Using AI to Scam You
- Dell D.C. Carvalho
- Mar 4
- 2 min read

North Korea's cyber operations have taken a sophisticated turn with the integration of artificial intelligence. Here are 10 ways the regime is leveraging AI for scams and cybercrimes:
1. AI-Generated Phishing Campaigns: North Korean hackers are using ChatGPT and similar AI tools to create convincing phishing emails and messages. These AI-generated texts are harder to detect as fraudulent, increasing the success rate of their scams.¹
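Because AI-polished text defeats the old "look for bad grammar" advice, defenders increasingly screen for structural signals instead, such as urgency phrasing and links whose domain does not match the sender. A minimal sketch of that idea in Python (the keyword list and thresholds are illustrative assumptions, not a production filter):

```python
import re
from urllib.parse import urlparse

# Illustrative urgency phrases common in phishing lures (an assumption, not a vetted list).
URGENT_PHRASES = ["verify your account", "act now", "payment failed", "suspended"]

def phishing_signals(sender_domain: str, body: str) -> list[str]:
    """Return structural red flags that survive fluent, AI-generated prose."""
    flags = []
    text = body.lower()
    if any(phrase in text for phrase in URGENT_PHRASES):
        flags.append("urgent language")
    # A link domain that differs from the sender's domain is a classic mismatch signal.
    for url in re.findall(r"https?://\S+", body):
        host = urlparse(url).netloc.lower()
        if host and not host.endswith(sender_domain.lower()):
            flags.append(f"link domain mismatch: {host}")
    return flags
```

For example, `phishing_signals("paypal.com", "Act now: http://paypa1-secure.example/login")` flags both the urgency wording and the look-alike domain, regardless of how polished the surrounding prose is.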
2. Deepfake Video Interviews: Fraudulent applicants are now using deepfake technology during video interviews for remote IT positions. This tactic allows North Korean operatives to bypass visual verification, making it harder for recruiters to detect imposters.²
3. AI-Enhanced Credential Forgery: North Korean actors are using AI to generate more realistic fake credentials, including resumes, portfolios, and professional certifications. These AI-crafted documents are more likely to pass initial screening.³
4. Automated Social Engineering: AI is being employed to automate and scale social engineering attacks, including the creation of multiple convincing online personas across platforms, each with its own unique digital footprint.⁴
5. AI-Driven Cryptocurrency Theft: North Korean hackers stole approximately $659 million in cryptocurrency in 2024, likely using AI to identify vulnerabilities and automate attacks.⁵
6. Language Barrier Reduction: AI translation tools are helping North Korean operatives overcome language barriers, allowing them to communicate more effectively in multiple languages and expand their target pool.⁶
7. AI-Assisted Malware Development: There are concerns that North Korean hackers are using AI to develop more sophisticated malware and evasion techniques, making their cyberattacks harder to detect and mitigate.⁷
8. GitHub Exploitation: North Korean-linked hackers are leveraging GitHub to build convincing personas for their fake IT worker scheme, creating or reusing GitHub accounts and portfolio content to backstop their fraudulent identities.⁸
9. AI-Enhanced Identity Fraud: Operatives are using AI tools like Faceswap to modify stolen photos and documents, creating more convincing fake profiles for their scams.⁹
10. AI-Powered Infrastructure Research: North Korean hackers are using AI tools to research potential infrastructure and free hosting providers for their operations, making it easier to set up and maintain their fraudulent activities.¹⁰
These AI-enhanced tactics pose significant challenges for cybersecurity professionals and highlight the need for continued vigilance. Companies and individuals should be aware of these evolving threats and implement robust security measures to protect against North Korea's increasingly sophisticated scams and cybercrimes.
References:
1. Cybersecurity and Infrastructure Security Agency (CISA). "North Korean State-Sponsored Actors Use ChatGPT in Social Engineering Campaigns." February 2025.
2. FBI Cyber Division. "Alert: North Korean Deepfake Interviews in IT Recruitment Scams." January 2025.
3. U.S. Department of Justice. "Indictment: North Korean IT Worker Scheme." March 2025.
4. Microsoft Threat Intelligence. "North Korea's AI-Driven Social Engineering Tactics." December 2024.
5. Chainalysis. "2025 Crypto Crime Report: North Korean Hacking Activities." February 2025.
6. National Security Agency (NSA). "AI Translation Tools in North Korean Cyber Operations." November 2024.
7. Mandiant Threat Intelligence. "North Korea's AI-Assisted Malware Development." February 2025.
8. GitHub Security Blog. "Detecting North Korean Threat Actors on GitHub." January 2025.
9. Recorded Future. "North Korean Use of AI in Identity Fraud." March 2025.
10. CrowdStrike Intelligence Report. "North Korean Cyber Infrastructure: AI-Assisted Research and Development." December 2024.