Differentially Private Neural Tangent Kernels (DP-NTK) for Privacy-Preserving Data Generation

  • Yilin Yang
  • Kamil Adamczewski
  • Xiaoxiao Li
  • Danica J. Sutherland
  • Mijung Park

Research output: Contribution to journal › Journal article › Research › peer-review


Abstract

Maximum mean discrepancy (MMD) is a particularly useful distance metric for differentially private data generation: when used with finite-dimensional features, it allows us to summarize and privatize the data distribution once, and then reuse that privatized summary throughout generator training without further privacy loss. An important question in this framework is, then, which features are useful for distinguishing between real and synthetic data distributions, and whether those features enable us to generate high-quality synthetic data. This work considers using the features of neural tangent kernels (NTKs), more precisely empirical NTKs (e-NTKs). We find that, perhaps surprisingly, the expressiveness of the untrained e-NTK features is comparable to that of perceptual features from networks pre-trained on public data. As a result, our method improves the privacy-accuracy trade-off compared to other state-of-the-art methods, without relying on any public data, as demonstrated on several tabular and image benchmark datasets.
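The "privatize once, reuse forever" idea from the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses generic per-sample feature vectors in place of e-NTK features, and the function names (`privatize_mean_embedding`, `mmd_sq`) are hypothetical. A single mean embedding of the real data is released under the Gaussian mechanism; every subsequent MMD evaluation against synthetic batches touches only that released vector, so no additional privacy budget is spent.

```python
import numpy as np

def mean_embedding(features):
    # features: (n, d) array of per-sample feature vectors
    # (in DP-NTK these would be e-NTK features; here they are generic)
    return features.mean(axis=0)

def privatize_mean_embedding(mu, n, epsilon, delta, clip=1.0):
    """Release an (epsilon, delta)-DP mean embedding via the Gaussian mechanism.

    Assumes each per-sample feature vector has L2 norm <= clip, so
    replacing one record changes the mean by at most 2 * clip / n.
    """
    sensitivity = 2.0 * clip / n
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return mu + np.random.normal(0.0, sigma, size=mu.shape)

def mmd_sq(private_mean, synth_features):
    # Squared MMD between the privatized data embedding and a synthetic
    # batch; with finite-dimensional features this is just a squared
    # Euclidean distance between mean embeddings.
    diff = private_mean - synth_features.mean(axis=0)
    return float(np.dot(diff, diff))
```

During generator training, `mmd_sq(private_mean, phi(generator_samples))` would serve as the loss: the privatized `private_mean` is computed once, while synthetic features change every step.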
Original language: English
Journal: Journal of Artificial Intelligence Research
Volume: 81
Pages (from-to): 683-700
ISSN: 1076-9757
Publication status: Published - 2024
