The podcast explores the escalating cost of semiconductor memory and its wide-ranging effects on the tech industry. It highlights how surging demand for high-bandwidth memory (HBM) in AI accelerators is squeezing DRAM supply and driving up prices for consumer devices such as gaming GPUs. This shift is incentivizing memory manufacturers, and companies like NVIDIA, to prioritize AI chips over consumer products because of their better profit margins. The conversation touches on potential alternatives such as context memory and optics, but emphasizes that HBM's performance remains unmatched for AI workloads. The hosts also analyze the capital expenditure plans of major hyperscalers, including Google, Meta, and Amazon, discussing how their investments in AI infrastructure are reshaping the semiconductor landscape.