In this roundtable discussion, memory experts Steven Woo, John Eble, and Nidish Kamath explore the critical role of memory in AI applications. They discuss how AI's rapid evolution, particularly the growth of large language models, is driving the need for greater memory capacity, bandwidth, and power efficiency. The panelists explain how AI training demands immense memory capacity and bandwidth for data movement, while AI inference places greater emphasis on latency and cost-effectiveness. They cover memory technologies including DDR5, HBM4, GDDR7, and LPDDR5, along with emerging memory module solutions such as MRDIMM and LPCAMM. The session underscores the continuous evolution of memory technologies to meet the unique challenges of AI.
John Eble, VP, Product Marketing, Rambus

Nidish Kamath, Director of Product Management, Rambus

Steven Woo, Fellow and Distinguished Inventor, Rambus

Tim Messegee, Sr. Director of Solutions Marketing, Rambus