Saddle Avoidance, Asymptotic Normality, and Exponential Acceleration in Nonsmooth Optimization.

Details

Material Type  
 Thesis (Dissertation)
Control Number  
0017161412
International Standard Book Number  
9798382842714
Dewey Decimal Classification Number  
519
Main Entry-Personal Name  
Jiang, Liwei.
Publication, Distribution, etc. (Imprint)  
[S.l.] : Cornell University., 2024
Publication, Distribution, etc. (Imprint)  
Ann Arbor : ProQuest Dissertations & Theses, 2024
Physical Description  
311 p.
General Note  
Source: Dissertations Abstracts International, Volume: 85-12, Section: B.
General Note  
Advisor: Davis, Damek.
Dissertation Note  
Thesis (Ph.D.)--Cornell University, 2024.
Summary, Etc.  
Optimization-based algorithms are the foundation of empirically successful methods in modern fields such as artificial intelligence and data science. Although classical optimization theory provides guarantees for functions that are smooth or convex, a significant portion of modern problems possess neither property. Despite worst-case examples for which efficient algorithms are unavailable, typical nonsmooth problems exhibit a "partly smooth" structure, meaning that they are well behaved relative to a smooth "active manifold." This thesis develops and analyzes first-order algorithms based on this nonsmooth structure. We first develop two regularity conditions describing how subgradients interact with active manifolds and then show that they hold for a broad and generic class of functions. With these cornerstones, we demonstrate that, when randomly perturbed or subject to stochastic noise, subgradient methods converge only to minimizers of generic, Clarke regular semialgebraic problems. When convergence to a particular minimizer is known, we demonstrate that stochastic (projected) subgradient methods are asymptotically normal, making them asymptotically optimal algorithms in the locally minimax sense of Hajek and Le Cam. These findings culminate in a new first-order algorithm, NTDescent, which exhibits local nearly linear convergence on typical nonsmooth functions with quadratic growth. The convergence rate of NTDescent depends only on the function's intrinsic quantities, not on the problem's underlying dimension.
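The abstract's central object, the stochastic subgradient method, can be sketched on the simplest nonsmooth function, f(x) = |x|. This is an illustrative toy only, not code from the thesis; the step-size schedule and Gaussian noise model are assumptions made for the sketch.

```python
import random

def subgradient_abs(x):
    # One valid subgradient of f(x) = |x|: sign(x), choosing 0 at the kink x = 0.
    return (x > 0) - (x < 0)

def stochastic_subgradient(x0, steps=5000, seed=0):
    # Stochastic subgradient iteration x_{k+1} = x_k - a_k * (g_k + noise),
    # with the (assumed) diminishing step schedule a_k = (k + 1) ** -0.75.
    rng = random.Random(seed)
    x = x0
    for k in range(steps):
        noisy_g = subgradient_abs(x) + rng.gauss(0.0, 0.1)
        x -= (k + 1) ** -0.75 * noisy_g
    return x

x_final = stochastic_subgradient(3.0)
print(abs(x_final))  # small: the iterates settle near the minimizer x* = 0
```

Despite the noisy oracle and the kink at the minimizer, the diminishing steps drive the iterates to x* = 0, the behavior whose limiting distribution the thesis characterizes.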
Subject Added Entry-Topical Term  
Applied mathematics.
Subject Added Entry-Topical Term  
Statistics.
Index Term-Uncontrolled  
Asymptotic normality
Index Term-Uncontrolled  
First-order method
Index Term-Uncontrolled  
Linear convergence
Index Term-Uncontrolled  
Nonsmooth optimization
Index Term-Uncontrolled  
Parameter-free
Index Term-Uncontrolled  
Saddle point avoidance
Added Entry-Corporate Name  
Cornell University Operations Research and Information Engineering
Host Item Entry  
Dissertations Abstracts International. 85-12B.
Electronic Location and Access  
This material can be viewed after logging in.
Control Number  
joongbu:658036

MARC

 008250224s2024    us ||||||||||||||c||eng d
■001000017161412
■00520250211151353
■006m     o  d
■007cr#unu||||||||
■020  ▼a9798382842714
■035  ▼a(MiAaPQ)AAI31243432
■040  ▼aMiAaPQ▼cMiAaPQ
■0820 ▼a519
■1001 ▼aJiang, Liwei.▼0(orcid)0009-0005-3287-9966
■24510▼aSaddle Avoidance, Asymptotic Normality, and Exponential Acceleration in Nonsmooth Optimization.
■260  ▼a[S.l.]▼bCornell University. ▼c2024
■260 1▼aAnn Arbor▼bProQuest Dissertations & Theses▼c2024
■300  ▼a311 p.
■500  ▼aSource: Dissertations Abstracts International, Volume: 85-12, Section: B.
■500  ▼aAdvisor: Davis, Damek.
■5021 ▼aThesis (Ph.D.)--Cornell University, 2024.
■520  ▼aOptimization-based algorithms are the foundation for empirically successful methods in modern fields, such as artificial intelligence and data science. Although classical optimization theory provides guarantees for functions with smoothness or convexity, a significant portion of modern problems do not possess any of these. Despite the worst-case examples where efficient algorithms are unavailable, typical nonsmoothness arises with a "partly smooth" structure, meaning that they are well-behaved relative to a smooth "active manifold." This thesis develops and analyzes first-order algorithms based on the aforementioned nonsmooth structure. We first develop two regularity conditions describing how sub-gradients interact with active manifolds and then show that they hold for a broad and generic class of functions. With these cornerstones, we demonstrate that when randomly perturbed or equipped with stochastic noise, subgradient methods only converge to minimizers of generic, Clarke regular semialgebraic problems. When convergence to a certain minimizer is known, we demonstrate that stochastic (projected) subgradient methods have asymptotic normality, making them asymptotically optimal algorithms in the locally minimax sense of Hajek and Le Cam. These findings culminate with a new first-order algorithm, NTDescent, which exhibits local nearly linear convergence on typical nonsmooth functions with quadratic growth. The convergence rate of NTDescent depends only on the function's intrinsic quantities but not the problem's underlying dimension.
■590  ▼aSchool code: 0058.
■650 4▼aApplied mathematics.
■650 4▼aStatistics.
■653  ▼aAsymptotic normality
■653  ▼aFirst-order method
■653  ▼aLinear convergence
■653  ▼aNonsmooth optimization
■653  ▼aParameter-free
■653  ▼aSaddle point avoidance
■690  ▼a0796
■690  ▼a0364
■690  ▼a0463
■71020▼aCornell University▼bOperations Research and Information Engineering.
■7730 ▼tDissertations Abstracts International▼g85-12B.
■790  ▼a0058
■791  ▼aPh.D.
■792  ▼a2024
■793  ▼aEnglish
■85640▼uhttp://www.riss.kr/pdu/ddodLink.do?id=T17161412▼nKERIS▼zThe full text of this material is provided by the Korea Education and Research Information Service (KERIS).


Holdings

    Registration No.  Call No.  Material  Status
    TQ0034358  T  Full-text material  Available for viewing / printing
