AI-Generated Code Risks: 'Package Hallucinations' Endanger Software Security with Fake Dependencies
April 29, 2025
A recent study finds that AI-generated code poses significant security risks to the software supply chain, chiefly through 'package hallucinations': dependency references to third-party libraries that do not exist.
Researchers analyzed 576,000 code samples generated by 16 popular large language models and found that nearly 20% of the package dependencies they referenced, 440,445 in all, were fabricated.
Open source models hallucinated packages far more often, at roughly 22% of dependencies versus about 5% for commercial models, meaning code generated with open source models carries a correspondingly higher risk.
The study also found that 43% of hallucinated package names recurred across repeated queries, making them predictable, and therefore attractive, names for attackers to register.
These hallucinations make dependency confusion attacks far easier to mount: an attacker publishes a malicious package under a name the software is expected to pull in, and the package manager installs it as if it were legitimate.
An attacker can exploit AI-generated code simply by registering malware under the fabricated names a model repeatedly suggests; users who install those dependencies without checking them pull the attacker's code into their projects.
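Where that unverified install happens is easy to see in practice. The sketch below is illustrative only, not the researchers' tooling: it parses a generated snippet (the snippet and the package name fastparse_utils are hypothetical) and flags third-party imports that do not resolve in the current environment, exactly the names a developer might otherwise hand straight to pip install.

```python
# Illustrative sketch: flag imports in LLM-generated code that do not resolve
# locally, instead of reflexively installing whatever name the model invented.
import ast
import importlib.util
import sys

GENERATED_CODE = """
import requests
import fastparse_utils   # hypothetical name, possibly hallucinated
from flask import Flask
"""

def top_level_imports(source: str) -> set[str]:
    """Collect the top-level module names a piece of source code imports."""
    names = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            names.add(node.module.split(".")[0])
    return names

for name in sorted(top_level_imports(GENERATED_CODE)):
    if name in sys.stdlib_module_names:
        continue  # standard-library module, nothing to install
    if importlib.util.find_spec(name) is None:
        # Unresolvable import: do not blindly `pip install` this name --
        # it may be a hallucination an attacker has already registered.
        print(f"unresolved third-party import: {name} (verify before installing)")
```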
Joseph Spracklen, a lead researcher from the University of Texas at San Antonio, emphasized the critical need for user verification before installing packages suggested by language models.
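A minimal sketch of that kind of check, assuming only the public PyPI JSON API (https://pypi.org/pypi/&lt;name&gt;/json) and using a hypothetical hallucinated name, could look like this:

```python
# Illustrative sketch: ask PyPI whether a suggested package even exists, and
# surface basic metadata a human can use to judge whether it looks legitimate.
import json
import urllib.error
import urllib.request

def pypi_summary(package: str) -> None:
    url = f"https://pypi.org/pypi/{package}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.load(resp)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            print(f"{package}: not on PyPI -- possibly a hallucinated name")
            return
        raise
    info = data["info"]
    print(f"{package}: {info.get('summary') or '(no summary)'}")
    print(f"  maintainer: {info.get('author') or info.get('maintainer') or 'unknown'}")
    print(f"  project links: {info.get('project_urls') or info.get('home_page')}")
    print(f"  releases on record: {len(data.get('releases', {}))}")

pypi_summary("requests")           # long-established package
pypi_summary("fastparse-utils")    # hypothetical name an LLM might invent
```

Even when a name does resolve, a package that exists but has a single release and no project links deserves scrutiny before it goes anywhere near a build.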
Dependency confusion was first demonstrated in 2021, when a proof-of-concept attack executed counterfeit code on networks belonging to Apple, Microsoft, Tesla, and other major companies, underscoring the real-world stakes of these vulnerabilities.
Dependencies are the external components a piece of software relies on to function, so a fabricated entry in a dependency list can translate directly into a serious vulnerability.
Spracklen noted that attackers need only publish malicious code under those fabricated names and wait for unsuspecting developers to install it.
Overall, the findings underscore the importance of rigorous verification processes in software development to mitigate the risks associated with AI-generated code.
Summary based on 2 sources
Sources

Ars Technica • Apr 28, 2025
AI-generated code could be a disaster for the software supply chain. Here’s why.
Slashdot • Apr 29, 2025
AI-Generated Code Creates Major Security Risk Through 'Package Hallucinations'