Efficient Natural Language Processing for Language Models.
Detailed Information
- Material Type
- Thesis/Dissertation
- Control Number
- 0017160292
- International Standard Book Number
- 9798382223490
- Dewey Decimal Classification Number
- 004
- Main Entry-Personal Name
- Xu, Canwen.
- Publication, Distribution, etc. (Imprint)
- [S.l.] : University of California, San Diego, 2024
- Publication, Distribution, etc. (Imprint)
- Ann Arbor : ProQuest Dissertations & Theses, 2024
- Physical Description
- 150 p.
- General Note
- Source: Dissertations Abstracts International, Volume: 85-10, Section: B.
- General Note
- Advisor: McAuley, Julian.
- Dissertation Note
- Thesis (Ph.D.)--University of California, San Diego, 2024.
- Summary, Etc.
- Despite achieving state-of-the-art performance on many NLP tasks, the high energy cost and long inference delay prevent Transformer-based language models (LMs) from seeing broader adoption, including for edge and mobile computing. Our efficient NLP research aims to comprehensively consider computation, time and carbon emission for the entire life-cycle of NLP, including data preparation, model training and inference. We demonstrate ways to promote computational efficiency in natural language processing, thus reducing hardware and software bottlenecks of training and inference, which is crucial in applying such models in production. Efficient NLP further facilitates democratization of language technology and allows language models to be accessible to more people.
- Subject Added Entry-Topical Term
- Computer science.
- Subject Added Entry-Topical Term
- Statistics.
- Subject Added Entry-Topical Term
- Information technology.
- Index Term-Uncontrolled
- Natural language processing
- Index Term-Uncontrolled
- Language models
- Index Term-Uncontrolled
- Computational efficiency
- Index Term-Uncontrolled
- Data efficiency
- Index Term-Uncontrolled
- Early exit
- Added Entry-Corporate Name
- University of California, San Diego Computer Science and Engineering
- Host Item Entry
- Dissertations Abstracts International. 85-10B.
- Electronic Location and Access
- This material is available after logging in.
- Control Number
- joongbu:657509
MARC
■008250224s2024 us ||||||||||||||c||eng d
■001000017160292
■00520250211150951
■006m o d
■007cr#unu||||||||
■020 ▼a9798382223490
■035 ▼a(MiAaPQ)AAI30993169
■040 ▼aMiAaPQ▼cMiAaPQ
■0820 ▼a004
■1001 ▼aXu, Canwen.
■24510▼aEfficient Natural Language Processing for Language Models.
■260 ▼a[S.l.]▼bUniversity of California, San Diego. ▼c2024
■260 1▼aAnn Arbor▼bProQuest Dissertations & Theses▼c2024
■300 ▼a150 p.
■500 ▼aSource: Dissertations Abstracts International, Volume: 85-10, Section: B.
■500 ▼aAdvisor: McAuley, Julian.
■5021 ▼aThesis (Ph.D.)--University of California, San Diego, 2024.
■520 ▼aDespite achieving state-of-the-art performance on many NLP tasks, the high energy cost and long inference delay prevent Transformer-based language models (LMs) from seeing broader adoption, including for edge and mobile computing. Our efficient NLP research aims to comprehensively consider computation, time and carbon emission for the entire life-cycle of NLP, including data preparation, model training and inference. We demonstrate ways to promote computational efficiency in natural language processing, thus reducing hardware and software bottlenecks of training and inference, which is crucial in applying such models in production. Efficient NLP further facilitates democratization of language technology and allows language models to be accessible to more people.
■590 ▼aSchool code: 0033.
■650 4▼aComputer science.
■650 4▼aStatistics.
■650 4▼aInformation technology.
■653 ▼aNatural language processing
■653 ▼aLanguage models
■653 ▼aComputational efficiency
■653 ▼aData efficiency
■653 ▼aEarly exit
■690 ▼a0984
■690 ▼a0489
■690 ▼a0800
■690 ▼a0463
■71020▼aUniversity of California, San Diego▼bComputer Science and Engineering.
■7730 ▼tDissertations Abstracts International▼g85-10B.
■790 ▼a0033
■791 ▼aPh.D.
■792 ▼a2024
■793 ▼aEnglish
■85640▼uhttp://www.riss.kr/pdu/ddodLink.do?id=T17160292▼nKERIS▼zThe full text of this material is provided by the Korea Education and Research Information Service (KERIS).