AI Jailbreaking

Techniques that bypass an AI system's safety restrictions and content policies, inducing it to produce outputs it is designed to refuse.