The Impact of Differential Privacy on Group Disparity Mitigation
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
The performance cost of differential privacy has, for some applications, been shown to be higher for minority groups; fairness, conversely, has been shown to disproportionately compromise the privacy of members of such groups. Most work in this area has been restricted to computer vision and risk assessment. In this paper, we evaluate the impact of differential privacy on fairness across four tasks, focusing on how attempts to mitigate privacy violations and between-group performance differences interact: does privacy inhibit attempts to ensure fairness? To this end, we train (ε, δ)-differentially private models with empirical risk minimization and group distributionally robust training objectives. Consistent with previous findings, we find that differential privacy increases between-group performance differences in the baseline setting; but, more interestingly, differential privacy reduces between-group performance differences in the robust setting. We explain this by reinterpreting differential privacy as regularization.
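The abstract pairs a differentially private optimizer with a group distributionally robust (group DRO) training objective. The sketch below is a minimal illustration of that combination, not the authors' implementation: it assumes PyTorch with Opacus for (ε, δ)-DP-SGD, and the dataset, model, number of groups, step size `eta`, and privacy budget are all placeholder choices.

```python
# Minimal sketch (not the authors' code): (epsilon, delta)-DP training combined
# with a group-DRO objective. Assumes PyTorch + Opacus; data, model, group
# count, and privacy budget are illustrative placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

NUM_GROUPS = 4  # hypothetical number of demographic groups


def group_dro_loss(per_example_loss, groups, group_weights, eta=0.01):
    """Group DRO: up-weight the average loss of the worst-performing groups."""
    group_losses = torch.stack([
        per_example_loss[groups == g].mean() if (groups == g).any()
        else per_example_loss.new_zeros(())
        for g in range(NUM_GROUPS)
    ])
    with torch.no_grad():  # exponentiated-gradient update of the group weights
        group_weights *= torch.exp(eta * group_losses)
        group_weights /= group_weights.sum()
    return (group_weights * group_losses).sum()


# Synthetic stand-ins for the real task data, group labels, and model.
X = torch.randn(512, 128)
y = torch.randint(0, 2, (512,))
g = torch.randint(0, NUM_GROUPS, (512,))
train_loader = DataLoader(TensorDataset(X, y, g), batch_size=64)

model = nn.Linear(128, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss(reduction="none")  # per-example losses
group_weights = torch.ones(NUM_GROUPS) / NUM_GROUPS

# Opacus wraps model, optimizer, and loader so that SGD satisfies (eps, delta)-DP.
privacy_engine = PrivacyEngine()
model, optimizer, train_loader = privacy_engine.make_private_with_epsilon(
    module=model,
    optimizer=optimizer,
    data_loader=train_loader,
    target_epsilon=8.0,   # illustrative budget
    target_delta=1e-5,
    epochs=1,
    max_grad_norm=1.0,
)

for x, y_b, g_b in train_loader:
    optimizer.zero_grad()
    loss = group_dro_loss(criterion(model(x), y_b), g_b, group_weights)
    loss.backward()
    optimizer.step()
```

Dropping the group re-weighting and simply averaging the per-example losses recovers the empirical risk minimization baseline, so the same DP wrapper covers both training objectives compared in the paper.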
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the Fourth Workshop on Privacy in Natural Language Processing |
| Number of pages | 14 |
| Publisher | Association for Computational Linguistics |
| Publication date | 2022 |
| DOIs | |
| Publication status | Published - 2022 |
| Event | 4th Workshop on Privacy in Natural Language Processing, Seattle, United States. Duration: 1 Jul 2022 → 1 Jul 2022 |
Conference

| Conference | 4th Workshop on Privacy in Natural Language Processing |
| --- | --- |
| Location | Seattle, United States |
| Country | United States |
| City | Seattle |
| Period | 01/07/2022 → 01/07/2022 |
ID: 341493148