AI Hallucination Exploit
An attack in which a threat actor registers domain names, software package names, or other resources that an AI model hallucinates, i.e. confidently recommends even though they do not exist. Once the attacker registers the fabricated resource, users who follow the AI's recommendation are directed to attacker-controlled code or infrastructure.
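One common defense is to verify that every AI-suggested dependency actually exists in the package registry before installing it. The sketch below is illustrative and not from this glossary: it checks suggested names against a hard-coded stand-in for a registry snapshot (a real check would query the package index itself, e.g. PyPI), and the package names used are hypothetical examples.

```python
# Illustrative sketch: flag AI-suggested dependencies that are not
# present in a known-package set. Unknown names are candidates for a
# hallucination exploit, since an attacker could register them and
# serve malicious code under that name.

# Assumed stand-in for a registry snapshot; a real implementation
# would query the package index instead of hard-coding names.
KNOWN_PACKAGES = {"requests", "numpy", "flask"}

def flag_hallucinated(suggested):
    """Return the suggested package names not found in KNOWN_PACKAGES."""
    return [name for name in suggested if name.lower() not in KNOWN_PACKAGES]

# "requests-oauth-pro" is a made-up name for illustration.
suspicious = flag_hallucinated(["requests", "requests-oauth-pro"])
```

Treating every unverified name as suspect, rather than trusting the model's confidence, is the key design choice here.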