Having your Privacy Cake and Eating it Too: Platform-supported Auditing of Social Media Algorithms for Public Interest
Basileal Imana, Aleksandra Korolova and John Heidemann

Citation

Basileal Imana, Aleksandra Korolova and John Heidemann. Having your Privacy Cake and Eating it Too: Platform-supported Auditing of Social Media Algorithms for Public Interest. Computer Supported Cooperative Work (Minneapolis, Minnesota, USA, Oct. 2023), to appear. [DOI] [PDF] [alt PDF]

Abstract

Relevance estimators are algorithms used by major social media platforms to determine what content is shown to users and its presentation order. These algorithms aim to personalize the platforms’ experience for users, increasing engagement and, therefore, platform revenue. However, at the large scale of many social media platforms, there are concerns that the relevance-estimation and personalization algorithms are opaque and can produce outcomes that are harmful to individuals or society. Legislation has been proposed in both the U.S. and the E.U. that mandates auditing of social media algorithms by external researchers. But auditing at scale risks disclosure of users’ private data and platforms’ proprietary algorithms, and thus far there has been no concrete technical proposal that can provide such auditing. Our goal is to propose a new method for platform-supported auditing that can meet the goals of the proposed legislation. The first contribution of our work is to enumerate the challenges of auditing at scale and the limitations of existing auditing methods for implementing these policies. Second, we suggest that limited, privileged access to relevance estimators is the key to enabling generalizable platform-supported auditing of social media platforms by external researchers. Third, we show that platform-supported auditing need not risk user privacy nor disclosure of platforms’ business interests, by proposing an auditing framework that protects against these risks. For a particular fairness metric, we show that ensuring privacy imposes only a small constant-factor increase (6.34x as an upper bound, and 4x for typical parameters) in the number of samples required for accurate auditing. Our technical contributions, combined with ongoing legal and policy efforts, can enable public oversight into how social media platforms affect individuals and society by moving past the privacy-vs-transparency hurdle.

Bibtex Citation

@inproceedings{Imana23a,
  author = {Imana, Basileal and Korolova, Aleksandra and Heidemann, John},
  title = {Having your Privacy Cake and Eating it Too: Platform-supported Auditing of Social Media Algorithms for Public Interest},
  booktitle = {Computer Supported Cooperative Work},
  year = {2023},
  sortdate = {2023-10-13},
  project = {ant},
  jsubject = {network_observation},
  pages = {to appear},
  month = oct,
  address = {Minneapolis, Minnesota, USA},
  publisher = {ACM},
  keywords = {linkedin, facebook, ad delivery algorithm, bias,
                    skew, discrimination, platform-supported auditing,
                    differential privacy},
  doi = {https://doi.org/10.1145/3579610},
  url = {https://ant.isi.edu/%7ejohnh/PAPERS/Imana23a.html},
  pdfurl = {https://ant.isi.edu/%7ejohnh/PAPERS/Imana23a.pdf},
  blogurl = {https://ant.isi.edu/blog/?p=1889}
}
Copyright © by John Heidemann