Beyond Maximum Likelihood: Distribution-Aware Machine Learning.
- Material Type
- Thesis/Dissertation
- 0017164808
- Date and Time of Latest Transaction
- 20250211153049
- ISBN
- 9798346380221
- DDC
- 300
- Author
- Cundy, Christopher James.
- Title/Author
- Beyond Maximum Likelihood: Distribution-Aware Machine Learning.
- Publish Info
- [S.l.] : Stanford University, 2024
- Publish Info
- Ann Arbor : ProQuest Dissertations & Theses, 2024
- Material Info
- 164 p.
- General Note
- Source: Dissertations Abstracts International, Volume: 86-05, Section: B.
- General Note
- Advisor: Ermon, Stefano.
- Dissertation Note
- Thesis (Ph.D.)--Stanford University, 2024.
- Abstracts/Etc
- Traditional machine learning approaches typically rely on maximum likelihood estimation (MLE), due to its ease of implementation and equivalence to KL-divergence minimization. However, models which are only trained to maximize likelihood often lack properties which are desired in deployment, such as quantification of uncertainty, robustness to out-of-distribution inputs, or adherence to privacy constraints. As machine learning models are deployed ever more widely, these important properties are more necessary than ever. Unfortunately, approaches that are able to provide these properties are often difficult to implement with today's large models and datasets. In this dissertation, we present several contributions to improve the tractability of methods that go beyond maximum likelihood. First, we improve Bayesian machine learning in several domains. This allows us to recover a full posterior distribution over parameters of interest, instead of the point estimates given by maximum likelihood methods. Secondly, we implement novel training schemes in sequential tasks: reinforcement learning and sequence modelling. In the reinforcement learning case, this allows us to develop reward-maximizing policies that do not disclose private information. In the sequence modelling case, we are able to implement novel divergences that lead to improved text generation. Our contributions allow us to scale distribution-aware methods to achieve state-of-the-art results in several domains, including recovering posteriors over causal graphs, developing privacy-aware algorithms in simulated robotics tasks, and generating human-like text with billion-parameter language models.
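- Note: the abstract's claim that MLE is equivalent to KL-divergence minimization can be made precise with a short derivation. The notation below (data distribution p_data, model family p_theta) is ours, not the dissertation's; this is a standard sketch, not material from the record.

  \[
  \arg\max_\theta \; \mathbb{E}_{x \sim p_{\text{data}}}\big[\log p_\theta(x)\big]
  = \arg\max_\theta \Big( -H(p_{\text{data}}) - \mathrm{KL}\big(p_{\text{data}} \,\|\, p_\theta\big) \Big)
  = \arg\min_\theta \; \mathrm{KL}\big(p_{\text{data}} \,\|\, p_\theta\big),
  \]

  since the entropy term H(p_data) does not depend on theta; the usual training objective is the Monte Carlo estimate of the left-hand expectation over the training set.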
- Subject Added Entry-Topical Term
- Privacy.
- Added Entry-Corporate Name
- Stanford University.
- Host Item Entry
- Dissertations Abstracts International. 86-05B.
- Electronic Location and Access
- This material is available after login.
- Control Number
- joongbu:654302