AI Dynamics

Global AI News Aggregator

Slopsquatting: AI Hallucinations Enable New Supply Chain Attack

URGENT PSA – A new supply chain attack vector that I found WILD: AI LLMs hallucinate package names roughly 18-21% of the time. Hackers have started pre-registering those hallucinated names on PyPI and npm with malicious payloads; they call it "slopsquatting." You can only imagine what's next.

Community note: The "slopsquatting" attack vector was documented as early as April 2025 and is not newly discovered. The cited 18-21% package hallucination rate applies to open-source LLMs; commercial models average 5.2%, according to the referenced study, which used pre-2025 models. socket.dev/blog/slopsquat… arxiv.org/pdf/2406.10279

→ View original post on X — @alexjc, 2026-04-02 12:13 UTC
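The practical defense against slopsquatting is to treat every AI-suggested dependency as unverified until it is checked against a vetted allowlist, and to treat near-misses of popular names as likely typosquats. A minimal sketch in Python using only the standard library; the allowlist and package names below are hypothetical, not from the post:

```python
from difflib import get_close_matches

# Hypothetical allowlist: dependencies already vetted for this project.
KNOWN_GOOD = {"requests", "numpy", "pandas", "flask"}

def flag_suggested_packages(suggested):
    """Classify AI-suggested package names as vetted or suspect.

    Anything not on the allowlist is suspect; a suspect name that closely
    resembles a vetted package is an extra typosquat warning sign.
    """
    report = {}
    for name in suggested:
        if name in KNOWN_GOOD:
            report[name] = "vetted"
        else:
            # cutoff=0.8 keeps only strong lexical matches (SequenceMatcher ratio).
            near = get_close_matches(name, KNOWN_GOOD, n=1, cutoff=0.8)
            report[name] = f"suspect (resembles {near[0]})" if near else "suspect (unknown)"
    return report

if __name__ == "__main__":
    for pkg, verdict in flag_suggested_packages(["requests", "requets", "fastparse-ai"]).items():
        print(f"{pkg}: {verdict}")
```

In practice the allowlist check would be backed by a registry lookup (e.g. the PyPI or npm package index) or a lockfile, but the gate is the same: an unknown or near-duplicate name should block the install, not silently succeed.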
