Data poisoning is a cyberattack in which adversaries inject malicious or misleading data into an AI model's training dataset. The goal is to corrupt the model's behavior and elicit skewed, biased, or harmful results. A related danger is the creation of backdoors that enable malicious exploitation of AI/ML systems.
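To make the mechanism concrete, here is a minimal sketch of one common poisoning technique, label flipping, on a toy dataset. It assumes scikit-learn; the model choice, the 30% poison fraction, and the helper name `flip_labels` are illustrative, not drawn from any specific attack described above.

```python
# Minimal sketch of label-flipping data poisoning on a toy dataset.
# The poison fraction and model are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Generate a clean binary classification dataset and hold out a test set.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

def flip_labels(y, fraction, rng):
    """Simulate an attacker flipping the labels of a random subset of training points."""
    y_poisoned = y.copy()
    n_poison = int(len(y) * fraction)
    idx = rng.choice(len(y), size=n_poison, replace=False)
    y_poisoned[idx] = 1 - y_poisoned[idx]  # invert the binary labels
    return y_poisoned

# Train one model on clean labels and one on labels with 30% flipped.
clean_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
poisoned_model = LogisticRegression(max_iter=1000).fit(
    X_train, flip_labels(y_train, fraction=0.30, rng=rng)
)

print("clean accuracy:   ", accuracy_score(y_test, clean_model.predict(X_test)))
print("poisoned accuracy:", accuracy_score(y_test, poisoned_model.predict(X_test)))
```

Comparing the two accuracy figures shows how even a crude, untargeted injection of mislabeled samples degrades the trained model; targeted or backdoor attacks follow the same principle but poison the data more selectively.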
These attacks are a significant concern for developers and organizations deploying artificial intelligence.