Users have the right to have their data deleted by third-party learned systems, as codified by recent legislation such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). Such data deletion can be achieved by full re-training, but this incurs a high computational cost for modern machine learning methods. To avoid this cost, many approximate deletion methods have been developed for supervised learning. Unsupervised learning, in contrast, remains a largely open problem when it comes to efficient approximate data deletion. In this paper, we introduce (1) an efficient method for approximate deletion in generative models, and (2) statistical tests for estimating whether training points have been deleted. We provide theoretical guarantees under various learner assumptions, and we empirically demonstrate our methods across a variety of generative models.