Model Backdoor
A hidden behavior implanted in a machine learning model, typically by poisoning its training data or training process, that causes the model to produce an attacker-chosen output (such as a targeted misclassification) whenever a specific trigger pattern appears in the input, while behaving normally on clean inputs.
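One common way such a backdoor is planted is data poisoning: the attacker stamps a small trigger pattern onto a fraction of the training images and relabels them to a target class, so the trained model associates the trigger with that class. A minimal sketch of the poisoning step (all function and parameter names here are hypothetical, using NumPy only):

```python
import numpy as np

def poison_batch(images, labels, target_label, trigger_value=1.0, patch=3):
    """Stamp a trigger patch onto images and relabel them (illustrative sketch).

    Places a small bright square (the trigger) in the bottom-right corner
    of each image and sets its label to the attacker's target class. A model
    trained on a mix of clean and poisoned examples can learn to behave
    normally on clean inputs but predict `target_label` whenever the
    trigger is present.
    """
    poisoned = images.copy()
    poisoned[:, -patch:, -patch:] = trigger_value  # stamp trigger in corner
    new_labels = np.full_like(labels, target_label)
    return poisoned, new_labels

# Example: poison a batch of four random 8x8 grayscale "images"
rng = np.random.default_rng(0)
imgs = rng.random((4, 8, 8))
labs = np.array([0, 1, 2, 3])
p_imgs, p_labs = poison_batch(imgs, labs, target_label=7)
```

At inference time, any input carrying the same corner patch would then be steered toward class 7, regardless of its true content.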