Authors: Chastek, William; Poyrazoglu, Goktug; Cao, Yukang; Isler, Volkan
Dates: 2025-05-15; 2025-05-15; 2025-05-14
URI: https://hdl.handle.net/11299/271871

Abstract: Accurate mobile robot localization enables safe and efficient navigation. In this report, the impact of using multiple sensors for localization was studied. Two sensor configurations were tested: IMU only and IMU + LiDAR. Each configuration was deployed on both a simulated Turtlebot3 and a physical F1TENTH robotic platform, and the robot was tasked with navigating to a goal position using a logMPPI controller. For both the simulation and real-world cases, the effect of using multiple sensors on controller accuracy was studied. Accuracy was measured as the Euclidean distance between the pose reported by the robot and the ground truth, which was obtained by tracking the robot model in Gazebo for the simulation and with a PhaseSpace motion-capture system for the physical robot. Using these metrics, it was found that in an idealized environment with minimal sensor noise, sensor fusion is not the optimal option. In real-world scenarios, however, a particle filter fusing IMU and LiDAR readings reduced the average localization error by at least 11%. This suggests that real-world robotic platforms should use multiple sensors for localization to increase accuracy.

Language: en-US
Subject: Robotics
Title: Improving Controller Accuracy: Increasing Localization Accuracy Using Sensor Fusion
Type: Report
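
As an illustration of the accuracy metric described in the abstract, the sketch below computes the average Euclidean localization error between a sequence of pose estimates and time-aligned ground-truth positions. This is a minimal Python example under the assumption that both trajectories are available as 2D (x, y) arrays sampled at matching timestamps; the function name, array shapes, and toy data are illustrative and not taken from the report.

```python
import numpy as np


def average_localization_error(estimated_xy: np.ndarray, ground_truth_xy: np.ndarray) -> float:
    """Mean Euclidean distance between estimated and ground-truth (x, y) positions.

    Both inputs are (N, 2) arrays assumed to be time-aligned, e.g. poses from the
    robot's localization stack versus Gazebo model states or PhaseSpace tracking.
    """
    errors = np.linalg.norm(estimated_xy - ground_truth_xy, axis=1)
    return float(errors.mean())


if __name__ == "__main__":
    # Toy trajectories standing in for logged data (illustrative only).
    truth = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.1]])
    estimate = np.array([[0.05, 0.02], [1.10, -0.03], [1.95, 0.12]])
    print(f"average error: {average_localization_error(estimate, truth):.3f} m")
```

A relative improvement such as the reported 11% reduction can then be obtained by comparing this average error for the IMU-only and IMU + LiDAR configurations over the same set of runs.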