Chey Tae-won unveils strategy built on deepening alliances with OpenAI, Nvidia, AWS to deliver more efficient AI solutions

SK Group Chair Chey Tae-won delivers a keynote speech at SK AI Summit 2025 at COEX Auditorium in southern Seoul on Monday. (SK Group)

Amid surging global demand for artificial intelligence, SK Group Chair Chey Tae-won said the tech-to-energy giant will provide the “most efficient AI solutions” through deeper partnerships with tech giants like OpenAI, Nvidia and Amazon Web Services.

Chey said pursuing efficiency is the key to addressing what he described as the “mismatch” between surging AI demand and the limited supply of chips, energy and AI infrastructure — a bottleneck that is constraining industry growth.

“It’s time for AI to shift from a competition of scale to a competition of efficiency,” Chey said at the SK AI Summit 2025 held at COEX in southern Seoul. “If we continue to fight only over scale, it will cost enormous amounts of money and create inefficiencies. We need to find ways to use resources more efficiently.”

To achieve that shift, SK is pursuing solutions across three key areas: memory semiconductors, AI infrastructure and AI applications.

Chey explained that although the performance of AI chips, such as graphics processing units, keeps improving, memory supply is struggling to keep pace.

“Nowadays, we have been receiving requests for memory chips from many companies, and I am seriously concerned about how to meet all the (demands),” he said, citing OpenAI’s recent request for SK hynix to provide 900,000 high bandwidth memory wafers per month for its mega-scale AI infrastructure project, Stargate. This, Chey noted, is roughly twice the total global monthly HBM production capacity across all companies combined.

To address the memory bottleneck, Chey said SK Group will expand its production capacity and enhance technology.

The new Yongin semiconductor cluster, due in 2027, will be capable of producing the equivalent of 24 M15X fabs, the company's flagship production center for HBM chips.

On the technology front, Chey said SK hynix will develop ultra-high-capacity memory chips and adopt NAND-based designs that offer greater data efficiency at lower cost.

“SK hynix’s technological capability has already been well proven in the industry,” Chey said confidently, adding that even Nvidia CEO Jensen Huang “no longer asks us about development speed.”

Chey stressed that SK’s AI vision cannot be achieved alone, citing partnerships with tech giants such as OpenAI and AWS as the core of its strategy.

OpenAI CEO Sam Altman, appearing via video message, said the partnership with SK Group is essential to make powerful AI models and to make advanced intelligence accessible.

“We imagine a future where each person has their own intelligent AI assistant working on their behalf continuously, but achieving this requires a massive, coordinated infrastructure investment,” said Altman. “No single company can achieve this alone, making a partnership like the one between our two companies essential.”

At the summit, SK Telecom’s new CEO Jung Jai-hun, in his first public remarks, echoed the importance of the company’s own partnerships with OpenAI and AWS and outlined his ambition to turn South Korea into a global hub for AI infrastructure.

“By attracting global capital and technology, we will help Korea rise as a central hub for AI infrastructure,” Jung said.

Jung said the ongoing AI data center project in Ulsan, built in partnership with AWS, will be expanded to a total capacity of over 1 gigawatt, while another major AI data center with OpenAI will be constructed. SKT also plans to expand its AI data centers into global markets, including Vietnam, Malaysia and Singapore.

SK hynix CEO Kwak Noh-jung announced a new vision: to evolve from a supplier of AI chips into what he described as a “full-stack AI memory creator,” solving problems and co-designing memory solutions together with customers.

Kwak also unveiled SK hynix’s HBM roadmap with a timeline for new releases. From 2026 to 2028, the chipmaker plans to launch 16-layer HBM4 and HBM4E variants, followed by HBM5 and HBM5E from 2029 to 2031.


sahn@heraldcorp.com