Jensen Huang, the chief executive of artificial intelligence (AI) chipmaker Nvidia, asked SK Hynix to advance the delivery of next-generation high-bandwidth memory (HBM) chips by six months, the memory chip producer said, emphasising the explosive demand for the building blocks of AI infrastructure.
“The last time I met with Huang, he asked me if we could supply HBM4 six months earlier than the date we have agreed upon,” said SK Group chairman Chey Tae-won at an event in Seoul.
“I asked the SK Hynix CEO whether it’s possible, and he said he will try, so we are working to move the date up by six months.”
He joked that he is “a bit nervous” to meet Huang again because “we’re worried he might ask us to speed it up even further”.
Chey’s remarks at SK AI Summit 2024 show how producers are racing to ramp up production of key AI infrastructure, including GPU accelerator chips and HBM memory.
Nvidia is the leading producer of GPUs for the AI industry, with more than 80 percent market share, while SK Hynix has become a key producer of HBM memory chips as Samsung has lagged in their production.
SK Hynix began manufacturing HBM3E, the current cutting edge of the technology, in September, and is aiming to produce 12-layer HBM4 chips next year, with 16-layer HBM4 chips to follow in 2026.
In a pre-recorded video clip, Huang said HBM had enabled a “super Moore’s law” with AI accelerators.
Moore’s law is the observation that the number of transistors in an integrated circuit doubles about every two years, and is often cited as shorthand for the rapidly increasing power of computer chips.
“When we moved from coding to machine learning, it changed the computer architecture fairly profoundly. And the work that we did with HBM memories has really made it possible for us to achieve what appears to be super Moore’s law,” Huang said.
“We wish that we got more bandwidth with lower energy. So the roadmap that SK Hynix is on is super aggressive and is super necessary.”
Chey also identified challenges facing the AI industry, including the lack of “killer use cases” and revenue models to recoup heavy infrastructure investments, limited chip manufacturing capacity, and the need to constantly supply AI systems with high-quality human-generated data.
He also spoke about the heavy energy requirements of AI data centres, saying SK Hynix has invested in gas turbines with carbon-capturing technology and small, modular nuclear reactors.