Saddle Avoidance, Asymptotic Normality, and Exponential Acceleration in Nonsmooth Optimization.
Material Type  
 Thesis (Dissertation)
Control Number  
0017161412
International Standard Book Number  
9798382842714
Dewey Decimal Classification Number  
519
Main Entry-Personal Name  
Jiang, Liwei.
Publication, Distribution, etc. (Imprint)  
[S.l.] : Cornell University., 2024
Publication, Distribution, etc. (Imprint)  
Ann Arbor : ProQuest Dissertations & Theses, 2024
Physical Description  
311 p.
General Note  
Source: Dissertations Abstracts International, Volume: 85-12, Section: B.
General Note  
Advisor: Davis, Damek.
Dissertation Note  
Thesis (Ph.D.)--Cornell University, 2024.
Summary, Etc.  
Optimization-based algorithms are the foundation of empirically successful methods in modern fields such as artificial intelligence and data science. Although classical optimization theory provides guarantees for functions that are smooth or convex, a significant portion of modern problems possess neither property. Despite worst-case examples for which efficient algorithms are unavailable, typical nonsmooth functions exhibit a "partly smooth" structure, meaning that they are well behaved relative to a smooth "active manifold." This thesis develops and analyzes first-order algorithms that exploit this nonsmooth structure. We first develop two regularity conditions describing how subgradients interact with active manifolds and then show that they hold for a broad and generic class of functions. With these cornerstones, we demonstrate that, when randomly perturbed or equipped with stochastic noise, subgradient methods converge only to minimizers of generic, Clarke regular semialgebraic problems. When convergence to a particular minimizer is known, we demonstrate that stochastic (projected) subgradient methods exhibit asymptotic normality, making them asymptotically optimal algorithms in the locally minimax sense of Hajek and Le Cam. These findings culminate in a new first-order algorithm, NTDescent, which exhibits local nearly linear convergence on typical nonsmooth functions with quadratic growth. The convergence rate of NTDescent depends only on intrinsic quantities of the function, not on the problem's underlying dimension.
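For reference only, the following is a minimal Python sketch of a generic stochastic projected subgradient iteration of the kind mentioned in the summary, applied to a toy nonsmooth problem. It is not code from the thesis and does not implement NTDescent; the objective, mini-batch size, box constraint, and 1/sqrt(k) step-size schedule are illustrative assumptions.

# Illustrative sketch (not from the thesis): stochastic projected subgradient
# method for min_x (1/m) * ||Ax - b||_1 over a box constraint.
import numpy as np

rng = np.random.default_rng(0)
m, d = 200, 10
A = rng.standard_normal((m, d))
b = A @ rng.standard_normal(d)

def stochastic_subgradient(x, batch=8):
    """Subgradient of (1/m)*||Ax - b||_1 estimated on a random mini-batch."""
    idx = rng.choice(m, size=batch, replace=False)
    residual = A[idx] @ x - b[idx]
    return A[idx].T @ np.sign(residual) / batch

def project_box(x, radius=10.0):
    """Euclidean projection onto the box [-radius, radius]^d."""
    return np.clip(x, -radius, radius)

x = np.zeros(d)
for k in range(1, 5001):
    step = 1.0 / np.sqrt(k)  # diminishing step size (an assumption for this sketch)
    x = project_box(x - step * stochastic_subgradient(x))

print("final objective:", np.abs(A @ x - b).mean())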
Subject Added Entry-Topical Term  
Applied mathematics.
Subject Added Entry-Topical Term  
Statistics.
Index Term-Uncontrolled  
Asymptotic normality
Index Term-Uncontrolled  
First-order method
Index Term-Uncontrolled  
Linear convergence
Index Term-Uncontrolled  
Nonsmooth optimization
Index Term-Uncontrolled  
Parameter-free
Index Term-Uncontrolled  
Saddle point avoidance
Added Entry-Corporate Name  
Cornell University Operations Research and Information Engineering
Host Item Entry  
Dissertations Abstracts International. 85-12B.
Electronic Location and Access  
This material is available after logging in.
Control Number  
joongbu:658036
Holdings
Reg No. | No. | Location | Status | Loan Information
TQ0034358 | T | Full-text material | Viewable / Printable | Viewable / Printable