Topics on Machine Learning Under Imperfect Supervision.
Contents Info
Material Type  
 Thesis/Dissertation
 
0017161295
Date and Time of Latest Transaction  
2025-02-11 15:13:36
ISBN  
9798383162866
DDC  
310
Author  
Yuan, Gan.
Title/Author  
Topics on Machine Learning Under Imperfect Supervision.
Publish Info  
[S.l.] : Columbia University, 2024
Publish Info  
Ann Arbor : ProQuest Dissertations & Theses, 2024
Material Info  
128 p.
General Note  
Source: Dissertations Abstracts International, Volume: 85-12, Section: A.
General Note  
Advisor: Kpotufe, Samory; Zheng, Tian.
Dissertation Note  
Thesis (Ph.D.)--Columbia University, 2024.
Abstracts/Etc  
This dissertation comprises several studies addressing supervised learning problems in which the supervision is imperfect.

First, we investigate margin conditions in active learning. Active learning is characterized by a special mechanism whereby the learner can sample freely over the feature space and spend most of a limited labelling budget querying the most informative labels. Our primary focus is to discern the critical conditions under which certain active learning algorithms can outperform the optimal passive learning minimax rate. Within a non-parametric multi-class classification framework, our results reveal that the uniqueness of the Bayes labels across the feature space is the pivotal determinant of the superiority of active learning over passive learning.

Second, we study the estimation of the central mean subspace (CMS) and its application in transfer learning. We show that a fast parametric convergence rate of the form C_d · n^{-1/2} is achievable by estimating the expected smoothed gradient outer product, for a general class of covariate distributions that admits Gaussian or heavier-tailed distributions. When the link function is a polynomial of degree at most r and the covariates follow the standard Gaussian distribution, we show that the prefactor depends on the ambient dimension d as C_d ∝ d^r. Furthermore, we show that in a transfer learning setting, when the source training data is abundant, an oracle rate of prediction error, as if the CMS were known, is achievable.

Finally, we present an innovative application that uses weak (noisy) labels to address an Individual Tree Crown (ITC) segmentation challenge. Here, the objective is to delineate individual tree crowns within a 3D LiDAR scan of tropical forests, with only 2D noisy manual delineations of crowns on RGB images available as a source of weak supervision. We propose a refinement algorithm designed to enhance the performance of existing unsupervised learning methods for the ITC segmentation problem.
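The gradient-outer-product idea behind the CMS results above admits a simple numerical illustration. The sketch below is not the dissertation's estimator; it is a minimal local-linear version on synthetic single-index data, where the bandwidth h and the small ridge term are ad-hoc choices made only for the illustration. The gradient of the regression function m(x) = E[Y | X = x] lies in the CMS, so the top eigenvectors of the averaged gradient outer product recover it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-index model: Y = (beta^T X)^2 + noise, so the CMS is span(beta).
d, n = 5, 2000
beta = np.zeros(d)
beta[0] = 1.0
X = rng.standard_normal((n, d))
y = (X @ beta) ** 2 + 0.1 * rng.standard_normal(n)

# Estimate the gradient of m at each sample by a local linear fit with
# Gaussian kernel weights (bandwidth h is a tuning assumption).
h = 0.8
grads = np.empty((n, d))
for i in range(n):
    w = np.exp(-np.sum((X - X[i]) ** 2, axis=1) / (2 * h ** 2))
    A = np.column_stack([np.ones(n), X - X[i]])    # intercept + linear terms
    WA = A * w[:, None]
    # Weighted least squares with a tiny ridge for numerical stability.
    G = WA.T @ A + 1e-6 * np.eye(d + 1)
    coef = np.linalg.solve(G, WA.T @ y)
    grads[i] = coef[1:]                            # local slope ≈ ∇m(X_i)

# Averaged gradient outer product; its top eigenvector spans the 1-dim CMS here.
M = grads.T @ grads / n
eigvals, eigvecs = np.linalg.eigh(M)
b_hat = eigvecs[:, -1]
print(abs(b_hat @ beta))  # alignment |<b_hat, beta>|, near 1 if the direction is recovered
```

In this single-index example the outer product is (up to noise) rank one along beta, so a single eigenvector suffices; for a k-dimensional CMS one would keep the top k eigenvectors instead.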
Subject Added Entry-Topical Term  
Statistics.
Subject Added Entry-Topical Term  
Computer science.
Subject Added Entry-Topical Term  
Information science.
Index Term-Uncontrolled  
Active learning
Index Term-Uncontrolled  
Imperfect supervision
Index Term-Uncontrolled  
Machine learning
Index Term-Uncontrolled  
Sufficient dimension reduction
Index Term-Uncontrolled  
Weakly supervised learning
Added Entry-Corporate Name  
Columbia University Statistics
Host Item Entry  
Dissertations Abstracts International. 85-12A.
Electronic Location and Access  
This material is available after logging in.
Control Number  
joongbu:658679

Detail Info.

Material
Reg No. | Call No. | Location | Status | Lend Info
TQ0034997 | T | Full-text (online) | Available to view/print | Available to view/print

