Show simple item record

dc.contributor.author: Petrović, Milena
dc.contributor.author: Rakočević, Vladimir
dc.contributor.author: Kontrec, Nataša
dc.contributor.author: Panić, Stefan
dc.contributor.author: Ilić, Dejan
dc.date.accessioned: 2023-04-19T11:49:56Z
dc.date.available: 2023-04-19T11:49:56Z
dc.identifier.uri: https://platon.pr.ac.rs/handle/123456789/1226
dc.description.abstract: We present a gradient descent algorithm with a line search procedure for solving unconstrained optimization problems, defined by applying the Picard-Mann hybrid iterative process to the accelerated gradient descent SM method described in Stanimirović and Miladinović (Numer. Algor. 54, 503–520, 2010). Using the merged features of both analyzed models, we show that the new accelerated gradient descent model converges linearly and faster than the starting SM method, which is confirmed through the displayed numerical test results. Three main properties are tested: number of iterations, CPU time, and number of function evaluations. The efficiency of the proposed iteration is examined for several values of the correction parameter introduced in Khan (2013). [en_US]
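To illustrate the idea sketched in the abstract, the following is a minimal Python sketch of a Picard-Mann hybrid iteration (Khan, 2013) applied to a plain gradient-descent map with a backtracking (Armijo) line search. It is an assumption-laden illustration, not the paper's exact SM-based scheme: the function names (picard_mann_gd, T), the Armijo step-size rule, and the default parameter values are all hypothetical choices made here; only the role of the correction parameter alpha follows the abstract.

import numpy as np

def picard_mann_gd(f, grad, x0, alpha=0.5, t0=1.0, beta=0.5, sigma=1e-4,
                   tol=1e-6, max_iter=1000):
    # Hypothetical sketch: Picard-Mann hybridization of the descent map
    # T(x) = x - t * grad(x), with t chosen by backtracking line search.
    # `alpha` stands in for the correction parameter from Khan (2013);
    # this is NOT the exact accelerated SM scheme analyzed in the paper.
    def T(x):
        g = grad(x)
        t = t0
        # Backtracking line search: shrink t until the Armijo condition holds.
        while f(x - t * g) > f(x) - sigma * t * np.dot(g, g):
            t *= beta
        return x - t * g

    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        if np.linalg.norm(grad(x)) < tol:
            break
        y = (1.0 - alpha) * x + alpha * T(x)  # Mann averaging step
        x = T(y)                              # Picard step applied to y
    return x, k

# Example: minimize the convex quadratic f(x) = 0.5 * ||x||^2.
f = lambda x: 0.5 * float(np.dot(x, x))
grad = lambda x: x
x_star, iters = picard_mann_gd(f, grad, x0=np.ones(5))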
dc.language.iso: en_US [en_US]
dc.publisher: Springer [en_US]
dc.title: Hybridization of accelerated gradient descent method [en_US]
dc.title.alternative: Numerical algorithms [en_US]
dc.type: clanak-u-casopisu [en_US]
dc.description.version: publishedVersion [en_US]
dc.identifier.doi: https://doi.org/10.1007/s11075-017-0460-4
dc.citation.volume: 79
dc.citation.spage: 769
dc.citation.epage: 786
dc.subject.keywords: Line search [en_US]
dc.subject.keywords: Gradient descent methods [en_US]
dc.subject.keywords: Newton method [en_US]
dc.subject.keywords: Convergence rate [en_US]
dc.type.mCategory: M21a [en_US]
dc.type.mCategory: closedAccess [en_US]


Documents


