
Geometry-aware principal component analysis for symmetric positive definite matrices

Horev, Inbal; Yger, Florian; Sugiyama, Masashi (2016), Geometry-aware principal component analysis for symmetric positive definite matrices, Machine Learning, pp. 1-30. doi:10.1007/s10994-016-5605-5

Type
Article accepted for publication or published
Date
2016
Journal name
Machine Learning
Pages
1-30
DOI
10.1007/s10994-016-5605-5
Author(s)
Horev, Inbal (University of Tokyo)
Yger, Florian (Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE])
Sugiyama, Masashi (University of Tokyo)
Abstract
Symmetric positive definite (SPD) matrices, in the form of covariance matrices for example, are ubiquitous in machine learning applications. However, because their size grows quadratically with the number of variables, high dimensionality can pose a difficulty when working with them, so it may be advantageous to apply dimensionality reduction techniques to them. Principal component analysis (PCA) is a canonical tool for dimensionality reduction; for vector data it maximizes the preserved variance. Yet the commonly used, naive extensions of PCA to matrices result in sub-optimal variance retention. Moreover, when applied to SPD matrices they ignore the geometric structure of the space of SPD matrices, further degrading performance. In this paper we develop a new Riemannian-geometry-based formulation of PCA for SPD matrices that (1) preserves more data variance by appropriately extending PCA to matrix data, and (2) extends the standard definition from Euclidean to Riemannian geometry. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals and for texture image classification.
Subjects / Keywords
dimensionality reduction; PCA; Riemannian geometry; SPD manifold; Grassmann manifold
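
The abstract describes PCA adapted to the Riemannian geometry of the SPD manifold. As a rough illustration of the general idea (not the authors' formulation, which keeps the variance optimization on the manifold itself), the sketch below runs ordinary PCA in the log-Euclidean tangent space: each SPD matrix is mapped through the matrix logarithm, vectorized, and handed to standard PCA. All function names and parameters here are illustrative assumptions, not code from the paper.

```python
import numpy as np

def spd_log(S):
    # Matrix logarithm of an SPD matrix via eigendecomposition:
    # log(S) = V diag(log(eigenvalues)) V^T.
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def tangent_space_pca(spd_mats, n_components=2):
    # Log-Euclidean tangent-space PCA (a common baseline, not the
    # paper's manifold-optimized method).
    # spd_mats: array of shape (n_samples, d, d), each slice SPD.
    n, d, _ = spd_mats.shape
    iu = np.triu_indices(d)
    # Weight off-diagonal entries by sqrt(2) so the Euclidean inner
    # product of the vectors matches the Frobenius inner product of
    # the symmetric log-matrices.
    weights = np.where(iu[0] == iu[1], 1.0, np.sqrt(2.0))
    X = np.stack([spd_log(S)[iu] * weights for S in spd_mats])
    X_centered = X - X.mean(axis=0)
    # Ordinary PCA via SVD of the centered tangent vectors.
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components]
    scores = X_centered @ components.T
    return scores, components

# Usage on synthetic SPD matrices (A A^T + eps * I is SPD).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5, 5))
spd = A @ A.transpose(0, 2, 1) + 1e-3 * np.eye(5)
scores, components = tangent_space_pca(spd)
print(scores.shape)  # (50, 2)
```

The log map flattens the curved SPD manifold around the identity, which is what lets Euclidean PCA apply at all; the paper's contribution goes further by formulating and optimizing the PCA objective directly in the Riemannian geometry.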

Related items

Showing items related by title and author.

  • Geometry-aware stationary subspace analysis
    Horev, Inbal; Yger, Florian; Sugiyama, Masashi (2016) Conference communication
  • Multitask Principal Component Analysis
    Yamane, Ikko; Yger, Florian; Berar, Maxime; Sugiyama, Masashi (2016) Conference communication
  • Ensemble learning based on functional connectivity and Riemannian geometry for robust workload estimation
    Corsi, Marie-Constance; Chevallier, Sylvain; Barthélemy, Quentin; Hoxha, Isabelle; Yger, Florian Conference communication
  • Riemannian Geometry on Connectivity for Clinical BCI
    Corsi, Marie-Constance; Yger, Florian; Chevallier, Sylvain; Noûs, Camille (2021) Conference communication
  • Geodesically-convex optimization for averaging partially observed covariance matrices
    Yger, Florian; Chevallier, S.; Barthélemy, Q.; Suvrit, S. (2020) Conference communication