← Back to Glossary

AI Red Teaming

The practice of systematically testing AI systems for vulnerabilities, biases, and failure modes using adversarial techniques and creative probing.

Related Terms