Data Poisoning
An attack that compromises the integrity of a machine learning model by injecting malicious or misleading data into its training dataset.
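The effect of such an attack can be sketched with a toy example: a label-flipping poisoning attack against a simple classifier. The dataset, the nearest-centroid classifier, and the poison points below are all illustrative assumptions, not any specific real-world system.

```python
# Illustrative sketch of label-flipping data poisoning (hypothetical 1-D data,
# nearest-centroid classifier; not a real-world attack or library API).

def centroid(points):
    # Mean of a list of 1-D feature values.
    return sum(points) / len(points)

def train(data):
    # data: list of (x, label) pairs with labels 0/1; model = class centroids.
    c0 = centroid([x for x, y in data if y == 0])
    c1 = centroid([x for x, y in data if y == 1])
    return c0, c1

def predict(model, x):
    # Assign x to the class with the nearer centroid.
    c0, c1 = model
    return 0 if abs(x - c0) <= abs(x - c1) else 1

# Clean training set: class 0 clusters near 0-2, class 1 near 8-10.
clean = [(0.0, 0), (1.0, 0), (2.0, 0), (8.0, 1), (9.0, 1), (10.0, 1)]

# Attacker injects mislabeled points: large-x samples falsely labeled class 0,
# dragging the class-0 centroid toward class-1 territory.
poison = [(9.5, 0), (10.5, 0), (11.0, 0)]

clean_model = train(clean)
poisoned_model = train(clean + poison)

x = 7.0  # clearly closer to the class-1 cluster in the clean data
print(predict(clean_model, x))     # 1 (correct)
print(predict(poisoned_model, x))  # 0 (flipped by the poisoned training set)
```

Here a few mislabeled points shift the learned decision boundary enough to flip predictions near the class margin, which is the core integrity failure data poisoning exploits.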