
A discrete version of CMA-ES

Benhamou, Eric; Atif, Jamal; Laraki, Rida (2018), A discrete version of CMA-ES. https://basepub.dauphine.fr/handle/123456789/18927

View/Open
main.pdf (256.3Kb)
Type
Document de travail / Working paper
Date
2018
Publisher
Preprint Lamsade
Series title
Preprint Lamsade
Published in
Paris
Pages
13
Author(s)
Benhamou, Eric
Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Atif, Jamal
Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Laraki, Rida
Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Abstract (EN)
Modern machine learning uses increasingly advanced optimization techniques to find optimal hyperparameters. Whenever the objective function is non-convex, non-continuous, and has potentially multiple local minima, standard gradient-descent optimization methods fail. A last-resort and very different approach is to assume that the optimum (or optima, not necessarily unique) is distributed according to a probability distribution, and to iteratively adapt that distribution according to tested points. These strategies, which originated in the early 1960s under the name Evolution Strategies (ES), culminated in CMA-ES (Covariance Matrix Adaptation ES). It relies on a multivariate normal distribution and is considered state of the art for general optimization problems. However, it is far from optimal for discrete variables. In this paper, we extend the method to multivariate correlated binomial distributions. For such a distribution, we show that it shares features similar to the multivariate normal: independence and zero correlation are equivalent, and correlation is efficiently modeled by interactions between different variables. We discuss this distribution in the framework of the exponential family. We prove that the model can estimate not only pairwise interactions between variables but is also capable of modeling higher-order interactions. This allows creating a version of CMA-ES that can efficiently accommodate discrete variables. We provide the corresponding algorithm and conclude.
Subjects / Keywords
Covariance Matrix Adaptation; Evolution Strategy
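The iterate-sample-adapt loop described in the abstract can be illustrated with a minimal sketch for binary variables. This is not the paper's algorithm: for simplicity it uses a fully factorized Bernoulli distribution (no interaction terms, so none of the paper's correlation modeling) and a plain smoothed update toward the elite samples, in the spirit of cross-entropy-style evolution strategies. All names and parameters below are illustrative choices.

```python
import numpy as np

def discrete_es(objective, dim, pop_size=50, elite_frac=0.2, iters=100, seed=0):
    """Minimal evolution-strategy sketch over binary variables.

    Maintains a vector p of Bernoulli probabilities (a factorized stand-in
    for the correlated multivariate binomial used in the paper) and adapts
    it toward the best-scoring samples at each generation.
    """
    rng = np.random.default_rng(seed)
    p = np.full(dim, 0.5)                 # start from the uniform distribution
    n_elite = max(1, int(elite_frac * pop_size))
    best_x, best_f = None, np.inf
    for _ in range(iters):
        # Sample a population of candidate bit-strings from the current distribution.
        pop = (rng.random((pop_size, dim)) < p).astype(int)
        scores = np.array([objective(x) for x in pop])
        elite = pop[np.argsort(scores)[:n_elite]]   # keep the lowest-scoring samples
        if scores.min() < best_f:
            best_f = scores.min()
            best_x = pop[scores.argmin()].copy()
        # Adapt the distribution toward the elite, with smoothing and clipping
        # so that some exploration always remains.
        p = 0.7 * p + 0.3 * elite.mean(axis=0)
        p = np.clip(p, 0.05, 0.95)
    return best_x, best_f

# Usage: minimize the number of zeros (OneMax); the optimum is the all-ones string.
x, f = discrete_es(lambda x: x.size - x.sum(), dim=20)
```

The key design point, which the paper generalizes, is that the search distribution itself is the object being optimized; replacing the independent Bernoulli marginals with a correlated multivariate binomial is what lets the method capture dependencies between variables.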

Related items

Showing items related by title and author.

  • Deep Reinforcement Learning (DRL) for portfolio allocation 
    Benhamou, Éric; Saltiel, David; Ohana, Jean-Jacques; Atif, Jamal; Laraki, Rida Communication / Conférence
  • A short note on the operator norm upper bound for sub-Gaussian tailed random matrices 
    Benhamou, Eric; Atif, Jamal; Laraki, Rida (2019-01) Document de travail / Working paper
  • A new approach to learning in Dynamic Bayesian Networks (DBNs) 
    Benhamou, Eric; Atif, Jamal; Laraki, Rida (2018) Document de travail / Working paper
  • BCMA-ES: a conjugate prior Bayesian optimization view 
    Benhamou, Éric; Saltiel, David; Laraki, Rida; Atif, Jamal (2020) Document de travail / Working paper
  • NGO-GM: Natural Gradient Optimization for Graphical Models 
    Benhamou, Éric; Atif, Jamal; Laraki, Rida; Saltiel, David (2020) Document de travail / Working paper