Underwater Data Centers Are Moving From Experiment to Commercial Reality

In 2018, Microsoft sank a shipping container full of servers into the waters off Scotland's Orkney Islands. Project Natick, as the experiment was called, was designed to test whether underwater data centers could be practical. Two years later, when the container was retrieved, the results were striking: the underwater servers had a failure rate one-eighth that of comparable land-based servers. The sealed, nitrogen-filled environment eliminated humidity, oxygen-driven corrosion, and human handling, three of the biggest causes of hardware failure in conventional data centers.
Now, nearly a decade later, underwater data centers are moving beyond Microsoft's research labs into commercial deployment.
The Case for Going Underwater
Cooling is the single largest operational expense for data centers after the electricity that powers the compute itself. Conventional data centers use massive HVAC systems, cooling towers, and in some cases millions of gallons of water per day to keep servers within operating temperature. Data centers account for an estimated 1 to 2 percent of global freshwater consumption, a growing concern in water-stressed regions.
Ocean water provides free, effectively unlimited cooling. Water can absorb roughly 3,500 times more heat than the same volume of air, which means a subsea data center can hold servers at optimal temperatures with minimal energy expenditure. The result is a power usage effectiveness (PUE) ratio approaching 1.05, compared with 1.3 to 1.6 for typical land-based facilities. That difference translates to a roughly 20 to 35 percent reduction in total energy consumption.
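The saving implied by those PUE figures follows from one line of arithmetic: at equal IT load, total facility energy scales with PUE, so the fractional saving is 1 minus the ratio of the two PUEs. A quick sketch using the figures quoted above (the function name is ours, not an industry API):

```python
def pue_energy_reduction(pue_subsea: float, pue_land: float) -> float:
    """Fraction of total facility energy saved at equal IT load.

    Total energy = IT load x PUE, so the saving is 1 - pue_subsea / pue_land.
    """
    return 1 - pue_subsea / pue_land

# PUE figures quoted in the article: ~1.05 subsea vs 1.3-1.6 on land.
low = pue_energy_reduction(1.05, 1.3)   # ~0.19, i.e. about a 19% saving
high = pue_energy_reduction(1.05, 1.6)  # ~0.34, i.e. about a 34% saving
```

Running the endpoints of the land-based range gives savings of roughly 19 to 34 percent, which is where the headline figure comes from.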
The ocean also provides physical security. A data center sitting on the seabed 30 meters below the surface is largely insulated from the physical threats that land-based facilities must guard against: unauthorized access, natural disasters like tornadoes and wildfires, and even military conflict.
Who Is Building Them
Subsea Cloud, a US-based startup, has deployed the first commercial underwater data center pod off the coast of the Pacific Northwest. The company's approach uses standardized pressure vessels that can be manufactured, loaded with servers, and deployed using existing offshore engineering infrastructure. Each pod contains roughly 800 kW of compute capacity and is connected to shore via submarine fiber optic and power cables.
Highlander, a Norwegian company backed by offshore energy industry veterans, is building subsea data centers that leverage decades of North Sea engineering expertise. Norway's cold, deep fjords provide ideal conditions: stable temperatures, proximity to abundant hydroelectric power, and an offshore industry workforce skilled in underwater construction and maintenance.
Beijing Sinnet Technology, one of China's largest data center operators, has deployed a pilot subsea facility in the South China Sea. The project aims to serve the growing demand for low-latency compute in Southeast Asian markets while avoiding the land and water consumption issues that constrain data center construction in China's coastal cities.
Microsoft has not announced a commercial subsea product, but patents filed in 2025 describe a modular underwater data center system designed for deployment near coastal population centers. Industry observers expect an announcement within the next 12 to 18 months.
Technical Challenges
Maintenance is the most obvious challenge. When a server fails in a conventional data center, a technician walks over and replaces it. When a server fails 30 meters underwater, the situation is more complex. Current approaches solve this through redundancy and design-for-life-cycle thinking. Servers are deployed with enough redundancy to tolerate multiple failures over a five-year deployment period, after which the entire pod is retrieved, refurbished, and redeployed.
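To see why redundancy makes a sealed, service-free pod viable, consider a minimal reliability sketch. This is our own illustration with hypothetical numbers, not any vendor's model: treat server failures over the deployment as independent and ask how likely the pod is to keep a required number of machines alive for the full five years.

```python
from math import comb

def survival_prob(n_servers: int, n_required: int, p_fail: float) -> float:
    """Probability that at least n_required of n_servers still work at
    end of deployment, assuming independent failures with per-server
    failure probability p_fail over the whole period (a simplification).
    Binomial tail sum over the acceptable counts of surviving servers."""
    return sum(
        comb(n_servers, k) * (1 - p_fail) ** k * p_fail ** (n_servers - k)
        for k in range(n_required, n_servers + 1)
    )

# Hypothetical pod: 864 servers installed, 800 needed to meet capacity,
# and a 5% chance that any given server dies over five years.
prob = survival_prob(864, 800, 0.05)  # well above 0.99
```

With an 8 percent overprovision, the pod rides out dozens of individual failures without intervention; the whole vessel is only opened at the five-year refurbishment.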
Power delivery requires submarine cables, which are expensive to install but proven technology already used extensively for offshore wind farms and island power connections. Latency is not a concern because subsea data centers sit just offshore, typically within a few kilometers of landing stations where they connect to terrestrial fiber networks.
Heat dissipation into the ocean has raised environmental questions. However, studies from the Natick project and subsequent deployments show that the thermal plume dissipates within meters of the enclosure, with no measurable impact on local marine ecosystems. In fact, the structures tend to act as artificial reefs, attracting marine life rather than deterring it.
Biofouling, the growth of algae, barnacles, and other organisms on submerged surfaces, is a maintenance consideration. Anti-fouling coatings, UV treatment systems, and design features that minimize flat surfaces exposed to current all help manage this issue.
The Economics
The capital cost of an underwater data center pod is currently higher than an equivalent land-based installation, primarily due to the pressure vessel, submarine cables, and deployment logistics. However, operational costs are significantly lower. Elimination of cooling infrastructure, reduced security requirements, lower land costs (ocean leases cost a fraction of prime real estate), and reduced hardware failure rates combine to produce a total cost of ownership that is competitive with land-based facilities over a five-year lifecycle.
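The shape of that trade-off can be shown with toy numbers. All figures below are hypothetical, chosen purely to illustrate how higher capex can be offset by lower opex within a five-year lifecycle; they are not sourced from any actual deployment:

```python
def five_year_tco(capex: int, annual_opex: int, years: int = 5) -> int:
    """Total cost of ownership: upfront capital plus cumulative operating cost."""
    return capex + annual_opex * years

# Hypothetical: the subsea pod pays more upfront (pressure vessel, cables,
# marine deployment) but saves each year (no cooling plant, less security,
# fewer hardware replacements).
land_tco = five_year_tco(capex=10_000_000, annual_opex=2_500_000)    # 22,500,000
subsea_tco = five_year_tco(capex=13_000_000, annual_opex=1_600_000)  # 21,000,000
```

In this illustration the subsea pod overtakes the land facility before the end of the lifecycle; the real comparison depends heavily on local power, land, and water prices.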
For specific use cases, the economics are even more compelling. Coastal cities with limited land and expensive power, such as Singapore, Hong Kong, and Mumbai, face severe constraints on data center expansion. Underwater deployment provides capacity growth without consuming scarce urban land or overtaxing municipal water supplies.
The Sustainability Angle
Data center energy consumption and water usage are under increasing scrutiny. Tech companies have made aggressive sustainability commitments, and the gap between those commitments and the reality of exponentially growing compute demand is a mounting credibility challenge.
Underwater data centers address both dimensions simultaneously. Lower energy consumption from free cooling reduces carbon emissions. Zero freshwater consumption eliminates water stress. And the sealed deployment model means less electronic waste from premature hardware failure.
What Comes Next
The next five years will determine whether underwater data centers become a mainstream deployment option or remain a niche solution. The technology is proven. The economics are competitive. The environmental benefits are real. What remains is scaling: building the manufacturing capacity for pressure vessels, training the workforce for deployment and retrieval operations, and establishing the regulatory frameworks that govern ocean-floor infrastructure.
If the current trajectory holds, the ocean floor may become the next frontier for cloud computing, quietly humming beneath the waves while the world above streams, searches, and scrolls.


