AI Package Hallucination – Spreading Malicious Packages Using Generative AI

Bar Lanyado

Revolutionary research exposes a new attack technique using ChatGPT! Discover how attackers could exploit its hallucinations to spread malicious packages, posing a grave threat to developers and production systems.

This session will delve into my recent research into an attack technique that leverages ChatGPT to spread malicious packages. The presentation will cover the motivation behind the study, the research methodology employed, and a demonstration of a Proof of Concept (PoC).

The concept of AI Package Hallucination introduces a novel approach to spreading malicious packages by exploiting LLM hallucinations. The technique involves querying ChatGPT (or other LLM applications) with various code-related questions and requests for package recommendations to address them. The attacker then identifies non-existent (hallucinated) packages among the recommendations, and develops and publishes a malicious package under the same name. Subsequently, when a developer asks a similar question, ChatGPT responds with the previously hallucinated package name, which now exists but carries malicious intent.

In the research, we showed that 30% of ChatGPT responses included at least one hallucinated package. It is important to note that the hallucinations are repetitive: I received the same package names when asking questions similar to those I had posed in the role of the attacker. In addition, I repeated the queries from different users and IP addresses to verify that the answers are not biased toward a particular user. This means that a threat actor who applies this technique at scale can discover many candidate packages to publish, increasing the probability of infecting developers or production systems with malware.
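To make the detection step concrete, the following is a minimal sketch (not the exact tooling used in the research) of how a researcher could collect package recommendations from an LLM and flag names that are not registered on PyPI. It assumes the official OpenAI Python client and PyPI's public JSON API; the model name, the example question, and the name-extraction heuristic are placeholders.

```python
import re
import requests
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def suggest_packages(question: str) -> list[str]:
    """Ask the model a code-related question and extract the pip package names it recommends."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{
            "role": "user",
            "content": f"{question}\nList the pip packages I should install, one per line.",
        }],
    )
    text = response.choices[0].message.content or ""
    # Rough heuristic: keep lines that look like bare package names.
    return [m.group(0) for m in re.finditer(r"^[A-Za-z0-9][A-Za-z0-9._-]+$", text, re.MULTILINE)]


def exists_on_pypi(name: str) -> bool:
    """Check whether a package name is actually registered on PyPI."""
    resp = requests.get(f"https://pypi.org/pypi/{name}/json", timeout=10)
    return resp.status_code == 200


if __name__ == "__main__":
    question = "How do I parse EDI X12 files in Python?"  # placeholder question
    for pkg in suggest_packages(question):
        status = "exists" if exists_on_pypi(pkg) else "HALLUCINATED (unregistered)"
        print(f"{pkg}: {status}")
```

The same registry check is equally useful on the defensive side: teams can flag unregistered package names that an LLM keeps recommending before anyone has a chance to squat on them.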

Speaker's Bio

Bar is a security researcher at Lasso Security. For the past 6 years, he has worked as a penetration tester and security researcher. During his career, Bar has tested and researched areas such as mobile and web applications, reverse engineering, and supply chain attacks, among others.