The Impact of Differential Privacy on Group Disparity Mitigation

Research output: Article in proceedings (chapter in conference proceeding), peer-reviewed

Documents

  • Fulltext: Final published version, 724 KB, PDF document

  • Victor Petren Bach Hansen
  • Atula Tejaswi Neerkaje
  • Ramit Sawhney
  • Lucie Flek
  • Anders Søgaard

The performance cost of differential privacy has, for some applications, been shown to be higher for minority groups; fairness, conversely, has been shown to disproportionately compromise the privacy of members of such groups. Most work in this area has been restricted to computer vision and risk assessment. In this paper, we evaluate the impact of differential privacy on fairness across four tasks, focusing on how attempts to mitigate privacy violations and between-group performance differences interact: does privacy inhibit attempts to ensure fairness? To this end, we train (ε, δ)-differentially private models with empirical risk minimization and group distributionally robust training objectives. Consistent with previous findings, we find that differential privacy increases between-group performance differences in the baseline setting; more interestingly, however, differential privacy reduces between-group performance differences in the robust setting. We explain this by reinterpreting differential privacy as regularization.
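
The combination the abstract describes can be illustrated with a short sketch: group distributionally robust optimization (group DRO, in the style of Sagawa et al., 2020) trained under DP-SGD. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the Opacus library for (ε, δ)-DP training, and the model, data, group ids, budget values, and step size eta_q below are illustrative placeholders.

    # Sketch only: group-DRO objective trained with DP-SGD via Opacus.
    # Data, model, and hyperparameters are toy placeholders.
    import torch
    import torch.nn.functional as F
    from torch.utils.data import DataLoader, TensorDataset
    from opacus import PrivacyEngine

    n_groups, eta_q = 2, 0.01
    X = torch.randn(512, 16)                    # toy features
    y = torch.randint(0, 2, (512,))             # toy labels
    g = torch.randint(0, n_groups, (512,))      # protected-group id per example
    train_loader = DataLoader(TensorDataset(X, y, g), batch_size=64)

    model = torch.nn.Linear(16, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # Wrap model/optimizer/loader so per-sample gradients are clipped and
    # noised (DP-SGD), with noise calibrated to a target (eps, delta) budget.
    privacy_engine = PrivacyEngine()
    model, optimizer, train_loader = privacy_engine.make_private_with_epsilon(
        module=model, optimizer=optimizer, data_loader=train_loader,
        target_epsilon=8.0, target_delta=1e-5, epochs=3, max_grad_norm=1.0,
    )

    q = torch.ones(n_groups) / n_groups         # group weights, updated online

    for epoch in range(3):
        for xb, yb, gb in train_loader:
            optimizer.zero_grad()
            per_sample = F.cross_entropy(model(xb), yb, reduction="none")
            # Mean loss of each group present in this batch
            group_losses = torch.stack([
                per_sample[gb == k].mean() if (gb == k).any()
                else per_sample.sum() * 0.0
                for k in range(n_groups)
            ])
            # Exponentiated-gradient step on group weights: upweight the
            # currently worst-off group (the group-DRO objective).
            q = q * torch.exp(eta_q * group_losses.detach())
            q = q / q.sum()
            (q * group_losses).sum().backward() # Opacus hooks clip + add noise
            optimizer.step()

The baseline (ERM) setting in the paper corresponds to dropping the q-update and minimizing the plain mean loss; the contrast between the two objectives under the same DP budget is what the abstract's findings refer to.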
Original language: English
Title of host publication: Proceedings of the Fourth Workshop on Privacy in Natural Language Processing
Number of pages: 14
Publisher: Association for Computational Linguistics
Publication date: 2022
DOIs
Publication status: Published - 2022
Event: 4th Workshop on Privacy in Natural Language Processing - Seattle, United States
Duration: 1 Jul 2022 – 1 Jul 2022

Conference

Conference: 4th Workshop on Privacy in Natural Language Processing
Location: Seattle, United States
Country: United States
City: Seattle
Period: 01/07/2022 – 01/07/2022
