Robotic perception and manipulation in unstructured environments with partial observations

Abstract

This dissertation tackles the challenges of robust robotic perception and manipulation in unstructured environments characterized by uncertainty and limited sensory information. While robots excel in structured environments with controlled settings, transitioning to complex real-world scenarios calls for algorithms capable of handling unpredictable variations. Specifically, this dissertation addresses three key challenges arising from unstructured environments and partial observations.

Geometric Uncertainties: The shapes, positions, and orientations of objects in unstructured environments are often unknown or unpredictable. This work explores methods to handle these uncertainties in tasks such as grasping and deformable object manipulation without precise geometric knowledge of the objects. Through the development of force-based manipulation techniques for Velcro peeling, this dissertation demonstrates that robots can successfully peel Velcro from unknown surfaces using only force feedback, achieving 95% success rates. Additionally, a novel adaptive sampling framework for grasping moving objects improved grasping success rates by 24% over baseline methods, showing that the presented method can handle geometric uncertainties effectively in real time.

Motion Dynamics: Predicting how objects will move in response to robot actions is difficult in unstructured settings. This dissertation investigates techniques for modeling and adapting to these dynamics. The work on differentiable physics demonstrates that hidden physical parameters can be estimated through strategic action selection. Furthermore, the state decomposition particle filter estimates future Velcro states accurately, enabling peeling with less than an 80% energy increase compared to optimal solutions under full observability.

Sensor Ambiguities: Limited or incomplete sensory information presents another common challenge. This work develops algorithms to overcome these ambiguities, ensuring reliable perception with incomplete data. The ROW-SLAM system demonstrates robust semantic mapping in outdoor agricultural environments, achieving sub-meter accuracy in corn stalk localization despite visual occlusions, lighting variations, and sensor noise.

Overall, this dissertation explores approaches ranging from purely data-driven to purely model-based methods, as well as hybrid approaches combining both paradigms. Comparative analysis reveals that while pure approaches have merits, hybrid solutions consistently demonstrate superior performance in complex scenarios, enabling reliable robotic operation in unstructured environments.

Description

University of Minnesota Ph.D. dissertation. June 2025. Major: Electrical/Computer Engineering. Advisors: Volkan Isler, Changhyun Choi. 1 computer file (PDF); x, 118 pages.

Suggested Citation

Yuan, Jiacheng. (2025). Robotic perception and manipulation in unstructured environments with partial observations. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/277413.
