In this interview, Matthew Berman talks with Greg Brockman about the scaling of AI models like Sora, the transformer architecture, and the costs associated with different models. They discuss hardware, including AMD, Cerebras, and Groq, and the bottlenecks in compute and energy. Brockman shares insights into OpenAI's internal conversations about compute allocation and the shift toward AI agents browsing on behalf of users, which changes both the internet experience and how it is monetized. The conversation touches on the proactive nature of AI, the balance between autonomous thinking and task completion, and the development of Sora 2 as a social experience. They also discuss the role of language models, the impact of AI on jobs, and the potential for fully generated software and UIs. Finally, they explore the Agentic Commerce Protocol and predictions for future AI advancements, including AGI.