BrainBody-LLM algorithm helps robots mimic human-like planning and movement
Professor Farshad Khorrami led a team that developed BrainBody-LLM, an algorithm that enables robots to plan actions, adjust movements in real time, and learn from feedback using large language models. The system mimics human brain-body communication with two components: a Brain LLM for high-level planning and a Body LLM that translates each planned step into robot commands. "The primary advantage of BrainBody-LLM lies in its closed-loop architecture, which facilitates dynamic interaction between the LLM components," said Ph.D. candidate Vineet Bhat, the paper's lead author. In simulation tests, the system improved task completion rates by 17%.
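To make the closed-loop idea concrete, here is a minimal sketch of how a Brain LLM planner and a Body LLM translator might interact with execution feedback. All function names and data shapes below are illustrative assumptions, not the paper's actual implementation; the LLM calls are replaced with simple stand-in functions.

```python
# Sketch of a BrainBody-LLM-style closed loop. The Brain LLM drafts a
# high-level plan; the Body LLM translates each step into a robot command;
# execution feedback flows back so the plan can be revised. The three
# *_stub functions are placeholders for real LLM/robot calls.

def brain_llm_stub(task, feedback=None):
    """Stand-in for the Brain LLM: returns a list of high-level steps,
    optionally revised in response to a failure report."""
    steps = ["locate object", "grasp object", "place object"]
    if feedback is not None:
        # Replan from the failed step onward (toy revision strategy).
        steps = [feedback["failed_step"]] + steps[feedback["index"] + 1:]
    return steps

def body_llm_stub(step):
    """Stand-in for the Body LLM: maps one step to a robot command."""
    return {"command": step.replace(" ", "_"), "params": {}}

def execute_stub(command):
    """Stand-in for robot execution; a real system would consult sensors."""
    return True

def closed_loop(task, max_retries=3):
    """Run the plan step by step, replanning on failure until done."""
    steps = brain_llm_stub(task)
    i, retries = 0, 0
    while i < len(steps):
        cmd = body_llm_stub(steps[i])
        if execute_stub(cmd["command"]):
            i += 1  # step succeeded, move on
        elif retries < max_retries:
            retries += 1
            # Feed the failure back to the Brain LLM and restart the plan.
            steps = brain_llm_stub(task, feedback={"failed_step": steps[i],
                                                   "index": i})
            i = 0
        else:
            return False  # give up after too many replans
    return True

print(closed_loop("put the cup on the shelf"))  # True with these stubs
```

The key design point the sketch illustrates is that planning and execution are not a one-shot pipeline: failure signals loop back to the planner, which is what the article's "dynamic interaction between the LLM components" refers to.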