On Lipschitz Regularization of Convolutional Layers using Toeplitz Matrix Theory

Araújo, Alexandre; Negrevergne, Benjamin; Chevaleyre, Yann; Atif, Jamal (2021), On Lipschitz Regularization of Convolutional Layers using Toeplitz Matrix Theory, 35th AAAI Conference on Artificial Intelligence, 2021-02, Vancouver, Canada

File: 2006.08391.pdf (920.7 KB)
Type: Conference paper
Date: 2021
Conference title: 35th AAAI Conference on Artificial Intelligence
Conference date: 2021-02
Conference city: Vancouver
Conference country: Canada
Author(s)
Araújo, Alexandre
Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Negrevergne, Benjamin
Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Chevaleyre, Yann
Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Atif, Jamal
Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Abstract (EN)
This paper tackles the problem of Lipschitz regularization of Convolutional Neural Networks. Lipschitz regularity is now established as a key property of modern deep learning with implications in training stability, generalization, robustness against adversarial examples, etc. However, computing the exact value of the Lipschitz constant of a neural network is known to be NP-hard. Recent attempts from the literature introduce upper bounds to approximate this constant that are either efficient but loose or accurate but computationally expensive. In this work, by leveraging the theory of Toeplitz matrices, we introduce a new upper bound for convolutional layers that is both tight and easy to compute. Based on this result we devise an algorithm to train Lipschitz regularized Convolutional Neural Networks.
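The quantity at stake here is the Lipschitz constant of a convolutional layer with respect to the L2 norm, i.e. the spectral norm of the (doubly-block) Toeplitz matrix that represents the convolution, which Toeplitz theory connects to the Fourier transform of the kernel. Below is a minimal NumPy sketch of that connection, assuming a single-input, single-output 2D kernel; the function name conv2d_spectral_norm_bound and the frequency-grid size n are illustrative assumptions, and sampling a finite grid only approximates the supremum over frequencies, so this is not the paper's exact LipBound algorithm.

import numpy as np

def conv2d_spectral_norm_bound(kernel, n=64):
    """Approximate a bound on the spectral norm (L2 Lipschitz constant) of the
    convolution with `kernel`: the maximum modulus of the kernel's Fourier
    transform, sampled on an n x n frequency grid (a denser grid gives a
    closer approximation of the supremum)."""
    kh, kw = kernel.shape
    padded = np.zeros((n, n), dtype=complex)
    padded[:kh, :kw] = kernel              # zero-pad the kernel to the grid size
    return float(np.abs(np.fft.fft2(padded)).max())

# Example: a 3x3 averaging kernel sums to 1, so the bound evaluates to ~1.0.
k = np.full((3, 3), 1.0 / 9.0)
print(conv2d_spectral_norm_bound(k))

A regularized training scheme of the kind the abstract describes would, roughly speaking, add a penalty based on such a per-layer bound to the training loss, so that the network's end-to-end Lipschitz constant is kept under control.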
Subjects / Keywords
Lipschitz; algorithm

Related items

Showing items related by title and author.

  • Training Compact Deep Learning Models for Video Classification Using Circulant Matrices
    Araújo, Alexandre; Negrevergne, Benjamin; Chevaleyre, Yann; Atif, Jamal (2018) Conference paper
  • On the Expressive Power of Deep Fully Circulant Neural Networks
    Araújo, Alexandre; Negrevergne, Benjamin; Chevaleyre, Yann; Atif, Jamal (2019) Working paper
  • On the expressivity of bi-Lipschitz normalizing flows
    Vérine, Alexandre; Negrevergne, Benjamin; Rossi, Fabrice; Chevaleyre, Yann (2022) Conference paper
  • On the robustness of randomized classifiers to adversarial examples
    Pinot, Rafaël; Meunier, Laurent; Yger, Florian; Gouy-Pailler, Cedric; Chevaleyre, Yann; Atif, Jamal (2022) Article accepted for publication or published
  • Deep Learning for Metagenomic Data: using 2D Embeddings and Convolutional Neural Networks
    Thanh Hai, Nguyen; Chevaleyre, Yann; Prifti, Edi; Sokolovska, Nataliya; Zucker, Jean-Daniel (2017) Conference paper