But when you add something new to an ML pipeline, there are new ways for adversaries to wreak havoc. We introduce a new type of data-poisoning attack that exploits the dynamic nature of unlearning requests: a "camouflage" attack, which lies dormant until triggered by the adversary. 3/n
Camouflage Data Poisoning Attacks in ML Unlearning Systems