This podcast episode focuses on Project Astra, a Google DeepMind research prototype of a universal AI assistant. Host Hannah Fry interviews Greg Wayne, who explains Project Astra's capabilities, including real-time multimodal interaction (vision, audio, and language), proactive memory features, and multilingual support. The discussion covers technical aspects such as latency reduction through hardware co-location and native audio processing, as well as ethical considerations and future development priorities, including proactive assistance for visually impaired users. A live demonstration shows Project Astra identifying objects and understanding multiple languages. The interview highlights the prototype's ongoing development and its potential to become a truly helpful AI assistant.