Sparsification, Online Optimization, and an Almost-Linear Time Algorithm for Maximum Flow [electronic resource]
Detail Information
- Material Type
- Thesis/Dissertation
- 0016934520
- Date and Time of Latest Transaction
- 20240214101621
- ISBN
- 9798380482769
- DDC
- 515
- Author
- Liu, Yang.
- Title/Author
- Sparsification, Online Optimization, and an Almost-Linear Time Algorithm for Maximum Flow [electronic resource]
- Publish Info
- [S.l.] : Stanford University, 2023
- Publish Info
- Ann Arbor : ProQuest Dissertations & Theses, 2023
- Material Info
- 1 online resource (302 p.)
- General Note
- Source: Dissertations Abstracts International, Volume: 85-04, Section: B.
- General Note
- Advisor: Sidford, Aaron; Vondrak, Jan; Charikar, Moses.
- Dissertation Note
- Thesis (Ph.D.)--Stanford University, 2023.
- Restrictions on Access Note
- This item must not be sold to any third party vendors.
- Abstracts/Etc
- This thesis designs efficient algorithms for combinatorial and continuous optimization problems. By combining new techniques developed in this thesis pertaining to iterative methods, dynamic algorithms, and high-dimensional geometry, we give an almost-linear-time algorithm that computes exact maximum flows and minimum cost flows in directed graphs. Our insights lead to algorithms for broader classes of combinatorial and continuous optimization problems including online discrepancy minimization, regression, and dynamic maxflow. In the first part of the thesis we study iterative methods, which are procedures that solve an optimization problem through a sequence of simpler subproblems. By understanding the geometry of the underlying optimization problem, we give iterative methods with improved convergence rates and runtimes for the problems of ℓp-regression, maximum flow, and min-cost flow. In the second part of the thesis we study dynamic algorithms and online optimization, which naturally arise from studying iterative methods. In particular, we give improved algorithms that maintain an approximate maximum flow in dynamic graphs undergoing edge insertions, and also nearly-optimal algorithms for minimizing the discrepancy of vectors arriving online. In the third part of the thesis, we study sparsification problems motivated by high-dimensional geometry. The goal of sparsification is to reduce the size of an objective while approximately maintaining the desired properties, such as the value on all inputs. We give improved and nearly-optimal size bounds for sparsification of several objectives including spectral hypergraph energies and sums of symmetric submodular functions. Finally, we give nearly-linear size bounds for sparsification of generalized linear models satisfying natural bounds on their growth, which imply nearly-optimal algorithms for solving ℓp-regression for p ∈ (1, 2]. This thesis contains material from the papers [KLS20, ALS21, LSS22, CKL+22, JLS22b, JLS22a, BLS23, JLLS23b, JLLS23a].
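The abstract describes iterative methods as procedures that solve an optimization problem through a sequence of simpler subproblems, and mentions ℓp-regression for p ∈ (1, 2]. As an illustration of that general idea only — not the algorithms developed in the thesis — the sketch below uses classical iteratively reweighted least squares (IRLS), where each iteration solves a weighted least-squares subproblem; the function name and parameters are illustrative assumptions.

```python
# Illustrative sketch only: a textbook iterative method (IRLS) for
#   min_x || A x - b ||_p,  p in (1, 2].
# This is NOT the thesis's algorithm; it only shows how an optimization
# problem can be solved via a sequence of simpler subproblems.
import numpy as np

def irls_lp_regression(A, b, p=1.5, iters=50, eps=1e-8):
    """Approximately minimize ||Ax - b||_p by repeatedly solving
    weighted least-squares subproblems."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]        # start from the l2 solution
    for _ in range(iters):
        r = A @ x - b                               # current residuals
        w = np.maximum(np.abs(r), eps) ** (p - 2)   # IRLS weights |r|^(p-2)
        W = np.diag(w)
        # simpler subproblem: weighted least squares, min_x (Ax-b)^T W (Ax-b)
        x = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 5))
    b = rng.standard_normal(100)
    x = irls_lp_regression(A, b, p=1.5)
    print("l1.5 residual norm:", np.linalg.norm(A @ x - b, ord=1.5))
```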
- Subject Added Entry-Topical Term
- Graphs.
- Subject Added Entry-Topical Term
- Iterative methods.
- Subject Added Entry-Topical Term
- Convex analysis.
- Subject Added Entry-Topical Term
- Design.
- Subject Added Entry-Topical Term
- Linear programming.
- Subject Added Entry-Topical Term
- Energy.
- Subject Added Entry-Topical Term
- Geometry.
- Subject Added Entry-Topical Term
- Mathematics.
- Added Entry-Corporate Name
- Stanford University.
- Host Item Entry
- Dissertations Abstracts International. 85-04B.
- Host Item Entry
- Dissertation Abstract International
- Electronic Location and Access
- This material can be viewed after logging in.
- Holdings
- 202402 2024
- Control Number
- joongbu:643562
MARC
■008240221s2023 ulk 00 kor
■001000016934520
■00520240214101621
■006m o d
■007cr#unu||||||||
■020 ▼a9798380482769
■035 ▼a(MiAaPQ)AAI30615100
■035 ▼a(MiAaPQ)STANFORDtn858wg4267
■040 ▼aMiAaPQ▼cMiAaPQ
■0820 ▼a515
■1001 ▼aLiu, Yang.
■24510▼aSparsification, Online Optimization, and an Almost-Linear Time Algorithm for Maximum Flow▼h[electronic resource]
■260 ▼a[S.l.]▼bStanford University. ▼c2023
■260 1▼aAnn Arbor▼bProQuest Dissertations & Theses▼c2023
■300 ▼a1 online resource(302 p.)
■500 ▼aSource: Dissertations Abstracts International, Volume: 85-04, Section: B.
■500 ▼aAdvisor: Sidford, Aaron;Vondrak, Jan;Charikar, Moses.
■5021 ▼aThesis (Ph.D.)--Stanford University, 2023.
■506 ▼aThis item must not be sold to any third party vendors.
■520 ▼aThis thesis designs efficient algorithms for combinatorial and continuous optimization problems. By combining new techniques developed in this thesis pertaining to iterative methods, dynamic algorithms, and high-dimensional geometry, we give an almost-linear-time algorithm that computes exact maximum flows and minimum cost flows in directed graphs. Our insights lead to algorithms for broader classes of combinatorial and continuous optimization problems including online discrepancy minimization, regression, and dynamic maxflow. In the first part of the thesis we study iterative methods, which are procedures that solve an optimization problem through a sequence of simpler subproblems. By understanding the geometry of the underlying optimization problem, we give iterative methods with improved convergence rates and runtimes for the problems of ℓp-regression, maximum flow, and min-cost flow. In the second part of the thesis we study dynamic algorithms and online optimization, which naturally arise from studying iterative methods. In particular, we give improved algorithms that maintain an approximate maximum flow in dynamic graphs undergoing edge insertions, and also nearly-optimal algorithms for minimizing the discrepancy of vectors arriving online. In the third part of the thesis, we study sparsification problems motivated by high-dimensional geometry. The goal of sparsification is to reduce the size of an objective while approximately maintaining the desired properties, such as the value on all inputs. We give improved and nearly-optimal size bounds for sparsification of several objectives including spectral hypergraph energies and sums of symmetric submodular functions. Finally, we give nearly-linear size bounds for sparsification of generalized linear models satisfying natural bounds on their growth, which imply nearly-optimal algorithms for solving ℓp-regression for p ∈ (1, 2]. This thesis contains material from the papers [KLS20, ALS21, LSS22, CKL+22, JLS22b, JLS22a, BLS23, JLLS23b, JLLS23a].
■590 ▼aSchool code: 0212.
■650 4▼aGraphs.
■650 4▼aIterative methods.
■650 4▼aConvex analysis.
■650 4▼aDesign.
■650 4▼aLinear programming.
■650 4▼aEnergy.
■650 4▼aGeometry.
■650 4▼aMathematics.
■690 ▼a0791
■690 ▼a0389
■690 ▼a0405
■71020▼aStanford University.
■7730 ▼tDissertations Abstracts International▼g85-04B.
■773 ▼tDissertation Abstract International
■790 ▼a0212
■791 ▼aPh.D.
■792 ▼a2023
■793 ▼aEnglish
■85640▼uhttp://www.riss.kr/pdu/ddodLink.do?id=T16934520▼nKERIS▼zThe full text of this material is provided by the Korea Education and Research Information Service (KERIS).
■980 ▼a202402▼f2024