@inproceedings{Yang2024BestManAM,
  title={BestMan: A Modular Mobile Manipulator Platform for Embodied AI with Unified Simulation-Hardware APIs},
  author={Kui Yang and Nieqing Cao and Yan Ding and Chao Chen},
  year={2024},
  url={https://api.semanticscholar.org/CorpusID:273403368},
  abstract={The emergence of embodied intelligence frameworks has significantly advanced the field of robotics, particularly in mobile manipulation. However, the complexity of algorithm development within the real-to-sim-to-real cycle, coupled with heterogeneous and incompatible robot platform interfaces, often impedes seamless solution migration and deployment. To address these challenges, we introduce BestMan, a versatile platform designed to facilitate the integration and deployment of mobile manipulation algorithms developed in simulated home environments. Specifically, we design an efficient real-to-sim twin architecture that reduces virtualization difficulty and accelerates the digital construction of real-world assets. We then propose a fully modular functional architecture that coordinates perception, planning, and control capabilities with state-of-the-art algorithms, enabling robots to perform diverse tasks. To support seamless migration, we introduce a unified middleware abstraction layer over the simulation and real-device interfaces. Finally, an exemplar workflow across representative tasks demonstrates the feasibility and flexibility of algorithm deployment using BestMan, and extensive functional comparisons against competitive frameworks demonstrate its comprehensiveness and adaptability. The open-source code is available at https://github.com/AutonoBot-Lab/BestMan.}
}