
At the centre lies the “Stargate” initiative, OpenAI’s flagship infrastructure programme. Under the agreement, Samsung Electronics will act as OpenAI’s strategic memory partner, tasked with meeting a projected demand of up to 900,000 DRAM wafers per month. Simultaneously, Samsung SDS will work alongside OpenAI to design, deploy and manage AI data centres, while Samsung C&T and Samsung Heavy Industries will explore floating data centre platforms and integrated maritime infrastructure.
Samsung’s scope spans memory production and packaging innovation through to the deployment of data centre architecture on land and at sea. Floating facilities are seen as a route to tackling land scarcity and cutting cooling costs and emissions, though operational viability at scale remains technically challenging.
OpenAI has said that Stargate’s memory demands will be immense, with the Korean partners positioned to shoulder a significant share of that requirement. In parallel, the Korean entities will explore joint ventures covering data centre deployment and operation, as well as the resale of OpenAI’s enterprise services within Korea.
OpenAI CEO Sam Altman, in Seoul for the signing, described the partnerships as essential to scaling compute capacity worldwide. Samsung’s top executives — including the heads of its electronics, shipbuilding and IT services divisions — were present, signalling high-level commitment to the collaboration.
This move comes as OpenAI simultaneously diversifies its hardware base. Alongside Samsung, it has signed a substantial multi-gigawatt agreement with AMD for GPU compute. The deal commits to deploying six gigawatts of AMD Instinct series GPUs, beginning with one gigawatt in the second half of 2026, and grants OpenAI the option to acquire up to 160 million AMD shares, contingent on performance and share-price milestones.
The Samsung–OpenAI pact complements this GPU agreement by addressing the memory and infrastructure tiers of the AI stack. Observers interpret this as a bid by OpenAI to gain independence from traditional cloud providers and hyperscalers, instead building a vertically integrated AI stack combining compute, memory and facilities under its control.
Challenges abound. The move into floating data centres and maritime infrastructure, while visionary, faces engineering, regulatory and economic headwinds. Land-based data centres already demand intricate cooling, power and real estate planning; translating that to a marine environment adds further complexity. The sheer scale of the memory requirement, which could consume a large share of global DRAM output, also tests supply chain resilience. Analysts caution that tying memory, compute and infrastructure too tightly together could expose OpenAI to cascading risks if any one component falters.
Yet the partnership also strengthens South Korea’s strategic position in the global AI race. The government has made clear that it wants Korea among the top three AI nations. The new alliance is likely to reinforce local investment, talent development and infrastructure build-out, particularly in regions outside Seoul where new data centre sites are being actively explored.