AdaSub: Stochastic Optimization Using Second-Order Information in Low-Dimensional Subspaces

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

We introduce AdaSub, a stochastic optimization algorithm that computes a search direction based on second-order information in a low-dimensional subspace, defined adaptively from current and past information. Compared to first-order methods, second-order methods exhibit better convergence characteristics, but the need to compute the Hessian matrix at each iteration incurs excessive computational cost, often making them impractical. Our approach addresses this issue by letting the user select the dimension of the search subspace, trading off computational cost against algorithmic efficiency. Our code is freely available on GitHub, and our preliminary numerical results demonstrate that AdaSub surpasses popular stochastic optimizers in terms of time and number of iterations required to reach a given accuracy.
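The abstract does not spell out the update rule, but the core idea of a second-order step restricted to a low-dimensional subspace can be illustrated with a short sketch. The following is a minimal, hypothetical example, not the authors' released implementation: given a basis V of a k-dimensional subspace, it forms the k×k projected Hessian via k Hessian-vector products and solves the Newton system only in that subspace. The function names, the finite-difference Hessian-vector products, the damping parameter, and the subspace construction below are all assumptions for illustration.

```python
import numpy as np

def subspace_newton_step(grad_fn, x, V, damping=1e-4, eps=1e-6):
    """One damped Newton step restricted to span(V), where V is d x k with
    orthonormal columns. Only k Hessian-vector products are needed, so the
    full d x d Hessian is never formed. Illustrative sketch, not AdaSub."""
    g = grad_fn(x)
    k = V.shape[1]
    HV = np.empty_like(V)
    for i in range(k):
        # Finite-difference Hessian-vector product:
        # H @ v ~= (grad(x + eps * v) - grad(x)) / eps
        HV[:, i] = (grad_fn(x + eps * V[:, i]) - g) / eps
    Hk = V.T @ HV                                 # k x k projected Hessian
    Hk = 0.5 * (Hk + Hk.T) + damping * np.eye(k)  # symmetrize and damp
    y = np.linalg.solve(Hk, V.T @ g)              # Newton system in the subspace
    return -V @ y                                 # search direction in full space

# Toy usage on a quadratic f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
rng = np.random.default_rng(0)
d, k = 50, 5
M = rng.standard_normal((d, d))
A = M @ M.T + np.eye(d)                           # symmetric positive definite
b = rng.standard_normal(d)
grad_fn = lambda x: A @ x - b

x = np.zeros(d)
for _ in range(20):
    # Span the subspace with the current gradient plus random directions;
    # AdaSub's actual adaptive subspace construction is not reproduced here.
    B = np.column_stack([grad_fn(x), rng.standard_normal((d, k - 1))])
    V, _ = np.linalg.qr(B)
    x = x + subspace_newton_step(grad_fn, x, V)

print("gradient norm after 20 steps:", np.linalg.norm(grad_fn(x)))
```

Each step costs k gradient evaluations and a k×k solve rather than a d×d Hessian computation; the paper's contribution lies in how the subspace is chosen adaptively from current and past information, which this sketch deliberately leaves out.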
Original language: English
Title of host publication: Proceedings of the 2023 IEEE 10th International Conference on Data Science and Advanced Analytics (DSAA)
Number of pages: 7
Publisher: IEEE
Publication date: 2023
ISBN (Print): 979-8-3503-4504-9
ISBN (Electronic): 979-8-3503-4503-2
DOIs
Publication status: Published - 2023
Event: 2023 IEEE 10th International Conference on Data Science and Advanced Analytics - Thessaloniki, Greece
Duration: 9 Oct 2023 – 13 Oct 2023

Conference

Conference: 2023 IEEE 10th International Conference on Data Science and Advanced Analytics
Country/Territory: Greece
City: Thessaloniki
Period: 09/10/2023 – 13/10/2023

Keywords

  • Stochastic optimization
  • Subspace optimization
  • Low-dimensional optimization
  • Stochastic quasi-Newton methods
