Topics on Machine Learning Under Imperfect Supervision.
Material Type  
 Dissertation (Thesis)
Control Number  
0017161295
International Standard Book Number  
9798383162866
Dewey Decimal Classification Number  
310
Main Entry-Personal Name  
Yuan, Gan.
Publication, Distribution, etc. (Imprint)  
[S.l.] : Columbia University, 2024
Publication, Distribution, etc. (Imprint)  
Ann Arbor : ProQuest Dissertations & Theses, 2024
Physical Description  
128 p.
General Note  
Source: Dissertations Abstracts International, Volume: 85-12, Section: A.
General Note  
Advisor: Kpotufe, Samory; Zheng, Tian.
Dissertation Note  
Thesis (Ph.D.)--Columbia University, 2024.
Summary, Etc.  
This dissertation comprises several studies addressing supervised learning problems where the supervision is imperfect.

First, we investigate margin conditions in active learning. Active learning is characterized by its special mechanism whereby the learner can sample freely over the feature space and spend most of the limited labelling budget querying the most informative labels. Our primary focus is to discern the critical conditions under which certain active learning algorithms can outperform the optimal passive learning minimax rate. Within a non-parametric multi-class classification framework, our results reveal that the uniqueness of Bayes labels across the feature space is the pivotal determinant of the superiority of active learning over passive learning.

Second, we study the estimation of the central mean subspace (CMS) and its application in transfer learning. We show that a fast parametric convergence rate of the form C_d · n^{-1/2} is achievable by estimating the expected smoothed gradient outer product, for a general class of covariate distributions that admits Gaussian or heavier-tailed distributions. When the link function is a polynomial of degree at most r and the covariates follow the standard Gaussian distribution, we show that the prefactor depends on the ambient dimension d as C_d ∝ d^r. Furthermore, we show that in a transfer learning setting, an oracle rate of prediction error, as if the CMS were known, is achievable when the source training data is abundant.

Finally, we present an innovative application of weak (noisy) labels to an Individual Tree Crown (ITC) segmentation challenge. Here, the objective is to delineate individual tree crowns within a 3D LiDAR scan of tropical forests, with only 2D noisy manual delineations of crowns on RGB images available as a source of weak supervision. We propose a refinement algorithm designed to enhance the performance of existing unsupervised learning methods on the ITC segmentation problem.
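The gradient outer product idea summarized above can be illustrated with a minimal sketch: under a single- or multi-index model, the expected outer product E[∇f(X)∇f(X)ᵀ] has its top eigenvectors spanning the CMS. The code below is an illustrative assumption-laden stand-in (k-nearest-neighbour local linear smoothing, a synthetic polynomial link), not the dissertation's actual estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 10, 5000
X = rng.standard_normal((n, d))                   # standard Gaussian covariates
B = np.zeros((d, 2)); B[0, 0] = B[1, 1] = 1.0     # true 2-dim CMS: span(e1, e2)
y = (X @ B[:, 0]) ** 2 + X @ B[:, 1] + 0.1 * rng.standard_normal(n)

# Smoothed gradient estimates via local linear regression over a
# k-nearest-neighbour window (a simple smoother, chosen for illustration).
k = 100
M = np.zeros((d, d))                              # running gradient outer product
for i in range(0, n, 10):                         # subsample query points for speed
    dists = np.linalg.norm(X - X[i], axis=1)
    idx = np.argsort(dists)[:k]
    Z = np.hstack([np.ones((k, 1)), X[idx] - X[i]])   # local design matrix
    coef, *_ = np.linalg.lstsq(Z, y[idx], rcond=None)
    g = coef[1:]                                  # local gradient estimate
    M += np.outer(g, g)

# Top eigenvectors of the accumulated outer product estimate a CMS basis.
eigvals, eigvecs = np.linalg.eigh(M)
B_hat = eigvecs[:, -2:]                           # top-2 eigenvectors

# Recovery check: projecting the true basis onto span(B_hat) loses little.
err = np.linalg.norm(B_hat @ B_hat.T @ B - B)
print(err)
```

The projection error `err` is small when the estimated subspace aligns with span(e1, e2); the dissertation's contribution concerns the rate at which such estimators converge and its dependence on the ambient dimension d.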
Subject Added Entry-Topical Term  
Statistics.
Subject Added Entry-Topical Term  
Computer science.
Subject Added Entry-Topical Term  
Information science.
Index Term-Uncontrolled  
Active learning
Index Term-Uncontrolled  
Imperfect supervision
Index Term-Uncontrolled  
Machine learning
Index Term-Uncontrolled  
Sufficient dimension reduction
Index Term-Uncontrolled  
Weakly supervised learning
Added Entry-Corporate Name  
Columbia University Statistics
Host Item Entry  
Dissertations Abstracts International. 85-12A.
Electronic Location and Access  
This material can be viewed after logging in.
Control Number  
joongbu:658679
Holdings Information

Registration No. | Call No. | Location | Availability
TQ0034997 | T | Full-text material | Viewable / Printable