  •   BIRD Home
  • LAMSADE (UMR CNRS 7243)
  • LAMSADE : Publications


Feature selection with optimal coordinate ascent (OCA)

Saltiel, David; Benhamou, Eric (2018-12), Feature selection with optimal coordinate ascent (OCA). https://basepub.dauphine.fr/handle/123456789/18907

Type: Document de travail / Working paper
External document link: https://hal.archives-ouvertes.fr/hal-02012473
Date: 2018-12
Publisher: Preprint Lamsade
Series title: Preprint Lamsade
Published in: Paris
Pages: 15
Author(s)
Saltiel, David
Laboratoire d'Informatique Signal et Image de la Côte d'Opale [LISIC]
Benhamou, Eric
Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision [LAMSADE]
Abstract (EN)
In machine learning, feature selection (FS) is a major component of an efficient algorithm: it fuels the algorithm and is the starting block for the prediction. In this paper, we present a new method, called Optimal Coordinate Ascent (OCA), that selects features among both block and individual features. OCA relies on coordinate ascent to find an optimal solution for the gradient boosting score (the number of correctly classified samples), and it accounts for dependencies between variables by forming blocks in the optimization. Coordinate ascent sidesteps the NP-hard original problem, in which the number of combinations explodes so rapidly that a grid search is infeasible: it considerably reduces the number of iterations, turning the NP-hard problem into a polynomial-time search. OCA brings substantial differences and improvements over the previous coordinate ascent feature selection method: we group variables into blocks and individual variables instead of using a binary selection; our initial guess is based on the k best group variables, making the starting point more robust; and we introduce new stopping criteria that make the optimization faster. We compare the two methods on our data set and find that our method outperforms the initial one. We also compare our method to Recursive Feature Elimination (RFE) and find that OCA reaches the minimal feature set with the highest score. This is a nice byproduct of our method, as it empirically provides the most compact data set with optimal performance.
Subjects / Keywords
feature selection; coordinate ascent; gradient boosting method
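The block-wise coordinate ascent idea described in the abstract can be illustrated with a minimal sketch. This is a simplification, not the authors' exact OCA implementation: the function name, the all-blocks-on initial guess, and the toy score are assumptions made for illustration (the paper instead starts from the k best group variables and scores with a gradient boosting classifier).

```python
def coordinate_ascent_fs(blocks, score, max_iter=20):
    """Greedy coordinate ascent over feature blocks (illustrative sketch).

    blocks : dict mapping block name -> list of feature indices
    score  : callable taking a frozenset of selected feature indices and
             returning the metric to maximize (e.g. validation accuracy
             of a gradient boosting model)
    """
    # Assumed initial guess for this sketch: every block selected.
    mask = {b: True for b in blocks}

    def selected(m):
        return frozenset(i for b, on in m.items() if on for i in blocks[b])

    best = score(selected(mask))
    for _ in range(max_iter):
        improved = False
        for b in blocks:              # one coordinate = one block of features
            mask[b] = not mask[b]     # try flipping this block
            s = score(selected(mask))
            if s > best:
                best, improved = s, True   # keep the flip
            else:
                mask[b] = not mask[b]      # revert
        if not improved:              # simple stopping criterion: no flip helped
            break
    return selected(mask), best
```

With a toy score that rewards features 0 and 1 and penalizes features 2 and 3, the search drops the noisy block and keeps the informative one, mirroring how each pass over the coordinates can only keep score-improving moves.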

Related items

Showing items related by title and author.

  • Trade Selection with Supervised Learning and OCA 
    Saltiel, David; Benhamou, Eric (2018) Document de travail / Working paper
  • Sélection efficace de variables par descente par coordonnée avec garanties théoriques [Efficient variable selection by coordinate descent with theoretical guarantees] 
    Saltiel, David; Benhamou, Éric (2019) Document de travail / Working paper
  • Sélection efficace de variables par montée par coordonnée avec garanties théoriques [Efficient variable selection by coordinate ascent with theoretical guarantees] 
    Saltiel, David; Benhamou, Éric (2019) Document de travail / Working paper
  • Detecting crisis event with Gradient Boosting Decision Trees 
    Benhamou, Éric; Ohana, Jean; Saltiel, David; Guez, Beatrice (2021) Document de travail / Working paper
  • Regime change detection with GBDT and Shapley values 
    Benhamou, Éric; Ohana, Jean; Saltiel, David; Guez, Beatrice (2021) Document de travail / Working paper