The panel discussion on data-centric computing examined the challenges and opportunities of near-data processing, particularly the high cost of data movement in today's computing landscape. Experts from academia and industry shared strategies such as embedding compute logic directly into memory technologies like HBM, placing processing near CXL-attached memory, deploying specialized AI accelerators, and building high-bandwidth interconnects. While they acknowledged significant obstacles, including programmability, thermal management, and the need for standardized programming models, the overall sentiment was that near-data computing is becoming more practical, driven largely by the growing demands of AI and high-performance computing workloads. The panelists also debated how best to balance specialized and general-purpose solutions, emphasizing that efficient data movement and optimized data layouts (illustrated in the sketch below) are key to improving power efficiency and overall system performance.
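
To make the data-layout point concrete, here is a minimal C++ sketch (not from the panel; the `ParticleAoS`/`ParticlesSoA` names and the particle example are hypothetical) contrasting an array-of-structs layout with a struct-of-arrays layout. When only one field is scanned, the SoA layout streams far fewer bytes through the memory hierarchy, which is exactly the kind of data-movement reduction the panelists highlighted.

```cpp
// Illustration only: array-of-structs (AoS) vs. struct-of-arrays (SoA).
// Scanning a single field with SoA touches contiguous memory and moves less
// data across the memory bus; with AoS, unused fields are dragged into cache
// alongside the field we actually need.
#include <cstdio>
#include <vector>

struct ParticleAoS {          // AoS: all fields of one element are adjacent
    float x, y, z, mass;
};

struct ParticlesSoA {         // SoA: each field stored contiguously
    std::vector<float> x, y, z, mass;
};

int main() {
    const std::size_t n = 1'000'000;

    std::vector<ParticleAoS> aos(n, {1.f, 2.f, 3.f, 4.f});
    ParticlesSoA soa{std::vector<float>(n, 1.f), std::vector<float>(n, 2.f),
                     std::vector<float>(n, 3.f), std::vector<float>(n, 4.f)};

    // Sum only the x coordinate: AoS streams 16 bytes per element through the
    // cache, SoA streams 4 bytes per element, roughly 4x less data movement.
    double sum_aos = 0.0, sum_soa = 0.0;
    for (const auto& p : aos) sum_aos += p.x;
    for (float v : soa.x)     sum_soa += v;

    std::printf("AoS sum = %f, SoA sum = %f\n", sum_aos, sum_soa);
    return 0;
}
```

The same principle scales up to near-data architectures: the less data a workload has to pull across interconnects to the compute, the better the power efficiency, whether the fix is a smarter layout in software or compute placed beside the memory itself.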