
hal.structure.identifier: Laboratoire Jacques-Louis Lions [LJLL]
hal.structure.identifier: CEntre de REcherches en MAthématiques de la DEcision [CEREMADE]
dc.contributor.author: Andreev, Roman
dc.date.accessioned: 2018-02-16T13:59:25Z
dc.date.available: 2018-02-16T13:59:25Z
dc.date.issued: 2016
dc.identifier.uri: https://basepub.dauphine.fr/handle/123456789/17409
dc.language.iso: en
dc.subject: finite elements
dc.subject: Smolyak
dc.subject: sparse grids
dc.subject: neural networks
dc.subject: stochastic eigenvalues
dc.subject: uncertainty quantification
dc.subject.ddc: 515
dc.title: Learning stochastic eigenvalues
dc.type: Document de travail / Working paper
dc.description.abstract: We train an artificial neural network with one hidden layer on realizations of the first few eigenvalues of a partial differential operator that is parameterized by a vector of independent random variables. The eigenvalues exhibit "crossings" in the high-dimensional parameter space. The training set is constructed by sampling the parameter either at random nodes or at the Smolyak collocation nodes. The performance of the neural network is evaluated empirically on a large random test set. We find that training on random or quasi-random nodes is preferable to the Smolyak nodes. The neural network outperforms the Smolyak interpolation in terms of error bias and variance on nonsimple eigenvalues but not on the simple ones.
dc.identifier.citationpages: 5
dc.relation.ispartofseriestitle: cahier de recherche CEREMADE - Paris-Dauphine
dc.subject.ddclabel: Analyse
dc.description.ssrncandidate: non
dc.description.halcandidate: non
dc.description.readership: recherche
dc.description.audience: International
hal.author.function: aut
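The training setup summarized in the abstract can be sketched minimally in Python. Everything below (the toy parameterized matrix `smallest_eigenvalue`, the network width, the learning rate, the use of plain random nodes rather than Smolyak collocation) is a hypothetical stand-in for illustration, not the paper's actual code or operator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the parameterized operator: a small symmetric
# matrix A(y) depending on a random vector y, whose smallest eigenvalue
# plays the role of a "stochastic eigenvalue".
def smallest_eigenvalue(y):
    A = np.diag([1.0, 2.0, 3.0]) + 0.5 * np.outer(y, y)
    return np.linalg.eigvalsh(A)[0]

# Training set: parameters sampled at random nodes (one of the two
# sampling strategies the abstract compares; Smolyak nodes omitted here).
Y = rng.uniform(-1.0, 1.0, size=(500, 3))
t = np.array([smallest_eigenvalue(y) for y in Y])

# One-hidden-layer network, trained by plain gradient descent on the MSE.
h = 20
W1 = rng.normal(scale=0.5, size=(3, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=h);      b2 = 0.0

def forward(Y):
    Z = np.tanh(Y @ W1 + b1)     # hidden-layer activations
    return Z, Z @ W2 + b2        # scalar prediction per sample

lr = 0.05
losses = []
for _ in range(2000):
    Z, p = forward(Y)
    r = p - t                    # residual
    losses.append(float(np.mean(r**2)))
    # Backpropagation through the single hidden layer.
    gW2 = Z.T @ r / len(Y)
    gb2 = r.mean()
    dZ = np.outer(r, W2) * (1.0 - Z**2)
    gW1 = Y.T @ dZ / len(Y)
    gb1 = dZ.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(losses[0], losses[-1])     # training loss should decrease
```

An empirical evaluation in the spirit of the abstract would then draw a fresh large random test set of parameters and compare the network's predictions against the exact eigenvalues.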

