In this episode of Machine Learning Street Talk, Dr. Maxwell Ramstead and Jason Fox of Noumenal discuss the limitations of large language models (LLMs) in achieving true physical AI. They argue that the lack of embodiment and real-world interaction prevents LLMs from developing adequate world models: these systems remain stuck in "data space," tethered to reality only through human preferences. As an alternative, they propose grounding AI in the physical world, enabling systems to generate their own data through embodied exploration and interaction. This approach involves building composable, situationally specific models that can be combined and adapted, drawing inspiration from how the brain evolved. They envision a marketplace of models in which data and model ownership are decentralized and users can monetize their contributions, contrasting this vision with the limitations and potential pitfalls of current LLM-centric approaches.