Sample Complexity of Sinkhorn divergences

Genevay, Aude; Chizat, Lenaic; Bach, Francis; Cuturi, Marco; Peyré, Gabriel (2019), Sample Complexity of Sinkhorn divergences, AISTATS'19 - 22nd International Conference on Artificial Intelligence and Statistics, 2019-04, Okinawa, Japan

View/Open
1810.02733.pdf (407.4Kb)
Type: Communication / Conference
Date: 2019
Conference title: AISTATS'19 - 22nd International Conference on Artificial Intelligence and Statistics
Conference date: 2019-04
Conference city: Okinawa
Conference country: Japan
Pages: 11
Author(s)
Genevay, Aude (CEntre de REcherches en MAthématiques de la DEcision [CEREMADE])
Chizat, Lenaic (CEntre de REcherches en MAthématiques de la DEcision [CEREMADE])
Bach, Francis (Département d'informatique - ENS Paris [DI-ENS])
Cuturi, Marco (Graduate School of Informatics [Kyoto])
Peyré, Gabriel (Département de Mathématiques et Applications - ENS Paris [DMA])
Abstract (EN)
Optimal transport (OT) and maximum mean discrepancies (MMD) are now routinely used in machine learning to compare probability measures. We focus in this paper on Sinkhorn divergences (SDs), a regularized variant of OT distances which can interpolate, depending on the regularization strength ε, between OT (ε=0) and MMD (ε=∞). Although the tradeoff induced by that regularization is now well understood computationally (OT, SDs and MMD require respectively O(n^3 log n), O(n^2) and n^2 operations given a sample size n), much less is known in terms of their sample complexity, namely the gap between these quantities when evaluated using finite samples vs. their respective densities. Indeed, while the sample complexities of OT and MMD stand at two extremes, 1/n^{1/d} for OT in dimension d and 1/√n for MMD, that of SDs has only been studied empirically. In this paper, we (i) derive a bound on the approximation error made with SDs when approximating OT as a function of the regularizer ε, (ii) prove that the optimizers of regularized OT are bounded in a Sobolev (RKHS) ball independent of the two measures, and (iii) provide the first sample complexity bound for SDs, obtained by reformulating SDs as a maximization problem in an RKHS. We thus obtain a scaling in 1/√n (as in MMD), with a constant that however depends on ε, making the bridge between OT and MMD complete.
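
To make the objects in the abstract concrete, below is a minimal NumPy sketch (not the authors' code) of the quantity being analyzed: the entropy-regularized OT cost computed by Sinkhorn iterations, and the debiased Sinkhorn divergence SD_ε(α, β) = OT_ε(α, β) − (OT_ε(α, α) + OT_ε(β, β))/2. The squared-Euclidean cost, uniform sample weights, the fixed iteration count, and the function names are illustrative assumptions; the transport cost is reported as ⟨P, C⟩, one common convention, rather than the full regularized objective.

import numpy as np


def sinkhorn_cost(x, y, eps, n_iters=200):
    """Entropy-regularized OT cost between two uniform empirical measures.

    x : (n, d) sample points, y : (m, d) sample points, eps : regularization
    strength (small eps approaches OT, large eps behaves like an MMD).
    Each Sinkhorn iteration costs O(n^2), matching the count in the abstract.
    For very small eps the Gibbs kernel can underflow; log-domain iterations
    are the usual remedy and are omitted here to keep the sketch short.
    """
    n, m = x.shape[0], y.shape[0]
    a = np.full(n, 1.0 / n)                    # uniform weights on x
    b = np.full(m, 1.0 / m)                    # uniform weights on y
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared-distance cost
    K = np.exp(-C / eps)                       # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):                   # Sinkhorn fixed-point updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]            # approximate optimal coupling
    return float(np.sum(P * C))                # transport cost <P, C>


def sinkhorn_divergence(x, y, eps, n_iters=200):
    """Debiased Sinkhorn divergence SD_eps(x, y); zero when x and y coincide."""
    return (sinkhorn_cost(x, y, eps, n_iters)
            - 0.5 * sinkhorn_cost(x, x, eps, n_iters)
            - 0.5 * sinkhorn_cost(y, y, eps, n_iters))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(200, 2))              # samples from the first measure
    y = rng.normal(loc=1.0, size=(200, 2))     # samples from a shifted measure
    print(sinkhorn_divergence(x, y, eps=0.5))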
Subjects / Keywords
Sinkhorn divergences

Related items

Showing items related by title and author.

  • Learning Generative Models with Sinkhorn Divergences
    Genevay, Aude; Peyré, Gabriel; Cuturi, Marco (2018) Communication / Conference
  • Stochastic Optimization for Large-scale Optimal Transport
    Genevay, Aude; Cuturi, Marco; Peyré, Gabriel; Bach, Francis (2016) Communication / Conference
  • Gromov-Wasserstein Averaging of Kernel and Distance Matrices
    Peyré, Gabriel; Cuturi, Marco; Solomon, Justin (2016) Communication / Conference
  • Fast Optimal Transport Averaging of Neuroimaging Data
    Gramfort, A.; Peyré, Gabriel; Cuturi, Marco (2015) Communication / Conference
  • Scaling Algorithms for Unbalanced Transport Problems
    Chizat, Lénaïc; Peyré, Gabriel; Schmitzer, Bernhard; Vialard, François-Xavier (2018) Article accepted for publication or published