Underwater Data Centers

Posted by Vijay Gupta, 21/07/2023

Few are aware that the concept of underwater data centers originated during Microsoft's 2014 Think Week, an internal brainstorming session, where it was proposed by an employee who had served aboard Navy submarines. Microsoft conducted an initial experiment, submerging a data center for 105 days with promising results.

More recently, in 2018, Microsoft sank a sealed vessel packed with 864 servers, capable of storing 27.6 petabytes of data, to the seabed off the Orkney Islands, northeast of mainland Scotland. Two years later, in June 2020, Microsoft brought it ashore for evaluation. The results showed that the underwater data center outperformed traditional data centers, with a failure rate in the water one-eighth of that on land.

 

Why did Microsoft put its data center under the sea? And after this pilot, can submarine data centers be built on a large scale?

 

Why Microsoft Chooses Underwater Data Centers

The answer is simple: make the most of the resources available. At the same time, the approach addresses many of the shortcomings of land-based data centers.

First, undersea data centers are safer and more stable. Data centers are delicate facilities filled with highly sophisticated components that can be damaged by temperature swings, oxygen corrosion, and even collisions during the replacement of failed parts. In a sealed environment where the temperature is controlled, oxygen and water vapor are removed, and human interference is eliminated, the security and stability of a data center improve greatly.

The seabed is an ideal setting in this respect: a sealed vessel on the ocean floor is isolated from atmospheric oxygen and water vapor and puts an end to human interference.

Second, and most important, seawater-cooled servers have a unique cost advantage, because cooling is a major expense for land-based data centers. According to public data, about 41% of a data center's annual electricity cost goes to cooling, and data centers worldwide consume roughly 2% of the world's total electricity. Energy costs are estimated to account for 30% to 50% of operating expenses across the IT industry.
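To put the 41% figure in context, cooling overhead is usually expressed as Power Usage Effectiveness (PUE), the ratio of total facility power to IT equipment power. The back-of-the-envelope sketch below assumes, purely for illustration, that everything other than cooling is IT load, which the article does not state:

```python
# Back-of-the-envelope: what a 41% cooling share implies for PUE.
# Simplifying assumption (not from the article): facility power splits
# into cooling plus IT load only, with no other overhead.

cooling_share = 0.41               # fraction of facility power spent on cooling
it_share = 1.0 - cooling_share     # remainder assumed to reach IT equipment

# PUE = total facility power / IT equipment power
pue = 1.0 / it_share
print(f"Implied PUE: {pue:.2f}")   # ~1.69; an ideal facility approaches 1.0
```

On that simplified reading, shedding most of the cooling load into seawater would push the ratio much closer to the ideal of 1.0.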

 

Why is cooling so expensive? Land-based data centers generally cool their servers in one of two ways. The first is mechanical cooling, chilling the servers with heavy air-conditioning systems; this method consumes a great deal of electricity every day, and its cost remains high.

The other is free cooling, using outside air and the evaporation of water. This natural method costs far less than the former, but it has its own shortcoming: cooling capacity and quality are dictated by the outside air temperature and water conditions, leaving operators very little control.

 

Seawater, with its high heat capacity, can absorb the excess heat a data center generates: all that is needed is a heat exchanger to transfer the data center's heat into the surrounding seawater. This is, so to speak, a combination of the two traditional cooling methods: it is stable, and it makes free use of natural resources.
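To get a feel for the scale involved, the sketch below sizes the seawater flow such a heat exchanger would need, using the basic relation Q = m_dot * c_p * dT. The heat load and the allowed temperature rise are illustrative assumptions, not figures from Project Natick:

```python
# Sizing sketch: seawater flow needed to absorb a data center's heat,
# using Q = m_dot * c_p * delta_T. All numbers below are illustrative
# assumptions, not Project Natick specifications.

heat_load_kw = 240.0   # assumed heat output of one server pod, in kW
c_p = 3.99             # specific heat of seawater, kJ/(kg*K)
delta_t = 5.0          # assumed allowable seawater temperature rise, in K

# Rearranged: m_dot = Q / (c_p * delta_T), in kg of seawater per second
m_dot = heat_load_kw / (c_p * delta_t)
print(f"Required seawater flow: {m_dot:.1f} kg/s")  # ~12 kg/s, roughly 12 L/s
```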

Third, coastal population density is high, so data transmission is faster and cloud computing more efficient. To save on land and operating costs, traditional data centers are usually sited in sparsely populated remote areas, which makes data transmission slow and latency high. Subsea data centers are different:

About 50 percent of the world's population lives within 150 kilometers of a coastline. Building data centers under the sea saves costs while keeping them close to the people they serve, killing two birds with one stone, as the latency sketch below illustrates.
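Here is a minimal sketch of best-case round-trip propagation delay over optical fiber. The 150 km figure comes from the paragraph above; the 1,500 km "remote inland" distance is an assumption chosen purely for contrast:

```python
# Best-case network latency as a function of distance to the data center.
# Light propagates through optical fiber at roughly 200 km per millisecond.

FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay over fiber, ignoring routing and queuing."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(f"Coastal user, 150 km away:  {round_trip_ms(150):.1f} ms")   # ~1.5 ms
print(f"Remote inland DC, 1500 km:  {round_trip_ms(1500):.1f} ms")  # ~15.0 ms
```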

 

In addition, there are many other advantages:

For example, ocean tidal energy can be harnessed to generate carbon-neutral electricity at sea; undersea cables can carry bandwidth directly to the data center, accelerating data transmission; and traditional red tape can be circumvented when building an undersea data center: servers can be built into watertight cylinders on an assembly line and shipped out to sea by cargo ship for deployment. As Microsoft says, these server pods can be deployed within 90 days, whereas traditional data centers take one to two years to build.

 

In theory, then, submarine data centers have many advantages. How difficult are they to realize in practice? Microsoft has the first answer.

 

Microsoft's Project Natick And Actual Construction

In fact, Microsoft began studying the feasibility of building data centers underwater as early as 2015, launching Project Natick.

In the first phase of Project Natick, in 2015, the Microsoft research team ran a 105-day experiment focused on leak protection, sealing the data center inside a waterproof container. The experiment was a success: Microsoft found that the module could be kept watertight in seawater.

 

In the second phase, Microsoft pushed the experiment further and took the project to sea: send the data center to the ocean floor and see whether it could keep running in good condition for several years. Microsoft sealed a data center inside a steel container, filled it with nitrogen, and lowered the container to the seabed.

 

The experiment was supported by the European Marine Energy Centre (EMEC), which not only contributed renewable-energy expertise but also advised on siting around Orkney, and even supplied the undersea cable connecting the data centre to the coast.

 

The vessel that carried the servers to the seabed was named Northern Isles; its Phase 1 predecessor had been christened Leona Philpot, after a character from the Halo games. It was lowered into the dark waters of the North Sea off Orkney, Scotland.

 

Why Orkney? On the one hand, Orkney is a major centre for renewable energy research: EMEC has been experimenting with tidal and wave energy there for 14 years. On the other hand, Orkney has a cold climate, which helps reduce the data center's cooling costs.

 

Microsoft placed the data centre on the seabed less than a kilometre offshore and fitted environmental sensors inside the white pressure vessel to monitor its status in real time. The data center and the ocean are "seamlessly" connected: power arrives via undersea cable, and data travels over the same link to the wider world onshore. In 2018, the Microsoft North Sea data center was complete: 864 servers and 27.6 PB of storage, beginning a two-year deep dive to test its performance.

 

In fact, what the researchers worried about most was data center damage: once the computers in an underwater data center fail, they cannot be repaired. Fortunately, it worked out well. By August 2020, all of the computers had been salvaged, and only eight out of more than 800 had failed, a lower failure rate than in land-based data centers.
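The arithmetic behind that claim is easy to check. The short sketch below recomputes the underwater failure rate from the counts above and derives the implied land-based rate from the one-eighth ratio quoted earlier:

```python
# Failure-rate check using the counts reported for Project Natick Phase 2.

servers_total = 864   # servers deployed in the Orkney vessel
servers_failed = 8    # servers that had failed by retrieval

underwater_rate = servers_failed / servers_total
land_rate = underwater_rate * 8  # article: underwater failures were 1/8 the land rate

print(f"Underwater failure rate: {underwater_rate:.2%}")  # ~0.93%
print(f"Implied land-based rate: {land_rate:.2%}")        # ~7.41%
```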

 

How was such a low failure rate achieved? The project's researchers speculate that, on the one hand, the cold surrounding water acted as a buffer, and on the other, the nitrogen atmosphere protected the hardware. In short, this small-scale test further validated the feasibility and value of subsea data centers. The researchers also noted that, beyond the low failure rate, all of the data center's power came from wind and solar energy, making full use of natural resources.

 

In addition, matching the theory, the underwater data center's management costs, construction costs, and losses from natural disasters and other emergencies were all lower than those of land-based data centers.

 

However, this is only a provisional victory. More than 800 servers is nowhere near the scale of a land-based data center; after all, land-based data centers house tens of thousands of servers. In a sense, this data center was more experimental than practical, a small pilot project for Microsoft. Still, Microsoft CEO Satya Nadella has said the company will replicate Project Natick around the world.

 

Challenges And Future Outlook Of Underwater Data Centers

If Microsoft wants to roll out undersea data centers successfully, it must first crack the hard problems of the current stage:

First, Microsoft's experiment has met with considerable environmental skepticism. Ian Bitterlin, a data centre expert and professor, argues that the heat generated by data centers can raise ocean water temperatures. The Microsoft team still needs to show that undersea data centers will not further pollute the marine environment, and how any pollution risks would be avoided.

 

Second, eight failures among more than 800 servers may not sound like many, but once submarine data centers are deployed at scale, the fleet could run to hundreds of thousands of units. That would require corresponding underwater maintenance stations and complete equipment-servicing solutions.

 

Third, as Ian Bitterlin points out, the coast is not the best place to build a data center: even though coastal demand is much higher than in the wilderness, it still does not match the traffic served by data centers in big cities.

 

Of course, Project Natick is not just a boost for undersea data center construction. Even if undersea data centers never reach scale, these creative experiments offer valuable lessons for the data center industry.

 

For example, when building the underwater data center in the Orkney Islands, the team was inspired by the wind and solar power supplying it. The researchers say that in the future, underwater data centers could be deployed alongside offshore wind farms, borrowing wind energy to power the data center, killing two birds with one stone, or even bundling shore power lines with the fiber-optic cables needed to transmit data.

As a result, Microsoft is looking for ways to replicate the benefits of the subsea model, such as low server wear and high security, in land-based data centers.

 

Project Natick has the potential to revolutionize data center deployment, providing flexibility, rapid construction, and efficient scaling. While Microsoft envisions replicating the success of Project Natick globally, challenges include environmental concerns and the need for underwater maintenance stations in the event of widespread deployment. Microsoft's experiments not only push the boundaries of technology but also offer valuable insights for the whole industry. Microsoft's innovative approach, whether successful or not, signifies a significant step forward in the data center industry.
