Publications
2023
- Distributed Optimal Control Framework for High-Speed Convoys: Theory and Hardware Results
  Namya Bagree, Charles Noren, Damanpreet Singh, Matthew Travers, et al.
  IFAC World Congress, 2023
Practical deployments of coordinated fleets of mobile robots in different environments have revealed the benefits of maintaining small distances between robots, especially as they move at higher speeds. This is counter-intuitive: as speed increases, reducing the space between robots also reduces the time each robot has to respond to sudden motion variations in surrounding robots. In certain cases, however, the performance benefits of traveling at closer distances can outweigh the potential instability issues, for instance autonomous trucks on highways that save energy by vehicle "drafting", or smaller robots in cluttered environments that must maintain close, line-of-sight communication. To achieve this kind of closely coordinated fleet behavior, this work introduces a model predictive optimal control framework that directly accounts for the non-linear dynamics of the vehicles in the fleet while planning motions for each robot. The robots follow each other closely at high speeds by proactively making predictions and reactively biasing their responses based on state information from adjacent robots. The control framework is naturally decentralized and, as such, applies to an arbitrary number of robots without any additional computational burden. We show that our approach achieves lower inter-robot distances at higher speeds than existing controllers, and we demonstrate its success through simulated and hardware results on mobile ground robots.
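The receding-horizon idea the abstract describes can be illustrated with a minimal sketch: each follower plans its own inputs against the predicted trajectory of the robot directly ahead, then applies only the first input. All names, the 1-D double-integrator dynamics, and the cost weights here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sketch of decentralized receding-horizon following.
# A follower tracks the predicted positions of its predecessor at a
# fixed gap; dynamics are a 1-D double integrator for illustration.

DT, H = 0.1, 10          # timestep [s], horizon length
GAP = 1.0                # desired inter-robot distance [m]

def rollout(pos, vel, accels):
    """Integrate double-integrator dynamics over the horizon."""
    traj = []
    for a in accels:
        vel = vel + a * DT
        pos = pos + vel * DT
        traj.append(pos)
    return np.array(traj)

def plan_follower(pos, vel, leader_pred):
    """Short-horizon problem: track leader_pred - GAP with small effort."""
    def cost(accels):
        traj = rollout(pos, vel, accels)
        tracking = np.sum((traj - (leader_pred - GAP)) ** 2)
        effort = 0.01 * np.sum(np.asarray(accels) ** 2)
        return tracking + effort
    res = minimize(cost, np.zeros(H))
    return res.x[0]      # receding horizon: apply only the first input

# Leader predicted to cruise at 2 m/s starting from x = 5 m.
leader_pred = 5.0 + 2.0 * DT * np.arange(1, H + 1)
a0 = plan_follower(pos=3.0, vel=2.0, leader_pred=leader_pred)
```

Because each robot only needs its neighbor's predicted trajectory, the per-robot problem size stays constant as the convoy grows, which is the decentralization property the abstract highlights.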
- SubT-MRS: A Subterranean, Multi-Robot, Multi-Spectral and Multi-Degraded Dataset for Robust SLAM
  Shibo Zhao, Damanpreet Singh, Sebastian Scherer, et al.
  In submission to ICCV, 2023
In recent years, significant progress has been made in the field of simultaneous localization and mapping (SLAM) research. However, current state-of-the-art solutions still struggle with limited accuracy and robustness in real-world applications. One major reason is the lack of datasets that fully capture the conditions faced by robots in the wild. To address this problem, we present SubT-MRS, an extremely challenging real-world dataset designed to push the limits of SLAM and perception algorithms. SubT-MRS is a multi-modal, multi-robot dataset collected mainly from subterranean environments with multiple forms of degradation, including structureless corridors, varying lighting conditions, and perceptual obscurants such as smoke and dust. Furthermore, the dataset packages information from a diverse range of time-synchronized sensors, including LiDAR, visual cameras, thermal cameras, and IMUs, captured on aerial, legged, and wheeled platforms, to support research in sensor fusion, which is essential for achieving accurate and robust robotic perception in complex environments. To evaluate the accuracy of SLAM systems, we also provide a dense 3D model with sub-centimeter-level accuracy, as well as accurate 6DoF ground truth. Our benchmarking approach includes several state-of-the-art methods to demonstrate the challenges our dataset introduces, particularly in multi-degraded environments.