← Back to Glossary
Prompt Injection
An attack against AI language models where malicious instructions are embedded in input to override the model intended behavior or extract sensitive data.
An attack against AI language models where malicious instructions are embedded in input to override the model intended behavior or extract sensitive data.