Towards Secure and Robust 3D Perception in the Real World: An Adversarial Approach.
- Material Type
- Thesis (Dissertation)
- Control Number
- 0017163775
- International Standard Book Number
- 9798342117197
- Dewey Decimal Classification Number
- 629.13252
- Main Entry-Personal Name
- Cheng, Zhiyuan.
- Publication, Distribution, etc. (Imprint)
- [S.l.] : Purdue University, 2024
- Publication, Distribution, etc. (Imprint)
- Ann Arbor : ProQuest Dissertations & Theses, 2024
- Physical Description
- 198 p.
- General Note
- Source: Dissertations Abstracts International, Volume: 86-04, Section: B.
- General Note
- Advisor: Zhang, Xiangyu; Celik, Z. Berkay; Li, Pan; Zhang, Tianyi.
- Dissertation Note
- Thesis (Ph.D.)--Purdue University, 2024.
- Summary, Etc.
- The advent of advanced machine learning and computer vision techniques has made 3D perception feasible in the real world, in tasks including but not limited to monocular depth estimation (MDE), 3D object detection, semantic scene completion, and optical flow estimation (OFE). Because of the 3D nature of our physical world, these techniques have enabled real-world applications such as autonomous driving (AD), unmanned aerial vehicles (UAVs), virtual/augmented reality (VR/AR), and video composition, revolutionizing the fields of transportation and entertainment. However, it is well documented that Deep Neural Network (DNN) models can be susceptible to adversarial attacks, in which minimal perturbations precipitate substantial malfunctions. Since 3D perception techniques are crucial for security-sensitive real-world applications such as autonomous driving systems (ADS), adversarial attacks on these systems pose significant threats. The goal of my research is therefore to build secure and robust real-world 3D perception systems. By examining the vulnerabilities of 3D perception techniques under such attacks, this dissertation aims to expose and mitigate these weaknesses. Specifically, I propose stealthy physical-world attacks against MDE, a fundamental component in ADS and AR/VR that facilitates the projection from 2D to 3D. I have advanced the stealth of the patch attack by minimizing the patch size and disguising the adversarial pattern, striking an optimal balance between stealth and efficacy. Moreover, I develop single-modal attacks against camera-LiDAR fusion models for 3D object detection using adversarial patches, underscoring that mere sensor fusion does not assure robustness against adversarial attacks.
Additionally, I study black-box attacks against MDE and OFE models, which are more practical and impactful because no model details are required and the models can be compromised through queries alone. In parallel, I devise a self-supervised adversarial training method that hardens MDE models without requiring ground-truth depth labels. The hardened model withstands a range of adversarial attacks, including those in the physical world. Through these designs for both attack and defense, this research contributes to the development of more secure and robust 3D perception systems, particularly for real-world applications.
- Subject Added Entry-Topical Term
- Unmanned aerial vehicles.
- Subject Added Entry-Topical Term
- Autonomous vehicles.
- Subject Added Entry-Topical Term
- Neural networks.
- Subject Added Entry-Topical Term
- Aerospace engineering.
- Subject Added Entry-Topical Term
- Robotics.
- Added Entry-Corporate Name
- Purdue University.
- Host Item Entry
- Dissertations Abstracts International. 86-04B.
- Electronic Location and Access
- This material is available after logging in.
- Control Number
- joongbu:658330