For anyone with romantic ideas about what having our data stored ‘in the cloud’ actually entails, the distance between some omnipresent, stratospheric hard drive and the reality of gigantic, monotonous data centres crammed into identical warehouses is disappointingly vast.
In contrast to their lofty implied location, however, the world’s data centres have an ongoing heating problem – they require a huge amount of energy to remain constantly chilled. Hence, gigantic data centres can be found in locations as remote as the Arctic Circle, where natural climatic conditions can counterbalance the server-generated heat. Tech giants such as Google and Facebook have also announced plans to install vast wind and solar farms to generate the power needed to cool centres that aren’t located in such extreme conditions.
Now an innovative plan has thrown a fresh idea into the mix. Dubbed ‘Project Natick’, this Microsoft research project hopes to build fully operational underwater data centres, which can be both cooled and powered by the sea currents themselves. ‘As we started exploring the space, it started to make more and more sense,’ says Norm Whitaker, Managing Director of Microsoft Research Special Projects. ‘We had a mind-bending challenge, but also a chance to push boundaries.’
Natick was first proposed in 2013. It began development in late 2014 and resulted in a prototype vessel, the Leona Philpot (named after a character from the Microsoft videogame Halo who broke her spinal column diving into water), which made an exploratory three-month dive half a mile out into the Pacific Ocean from August to November last year. Developers were able to fully analyse its local environment, monitoring temperature, humidity, power usage and surrounding current speeds. ‘In one day this thing was deployed, hooked up and running. Then everyone was back here, controlling it remotely,’ says Whitaker. ‘A wild ocean adventure turned out to be a regular day at the office.’
Unlike data centres in the Arctic or other remote, cold environments, an underwater system has the advantage that its linking cables have a vastly reduced distance to travel to the urban centres where people actually need them. With half of the world’s population living within 120 miles of the sea, this technology could potentially speed up data transmission significantly. ‘This is speculative technology, in the sense that if it turns out to be a good idea, it will instantly change the economics of this business,’ says Whitaker.
Questions have, of course, been raised about the environmental impact of underwater data centres. Microsoft insists that the data centres themselves can be built from recycled parts and become entirely recyclable at the end of their 20-year lifespans. Furthermore, with no human habitation, there would be no problems regarding organic waste disposal, for example.
‘You think it might disrupt the ecosystem,’ says Christian Belady, General Manager for Data Center Strategy, Planning and Development at Microsoft, ‘but really, it’s just a tiny drop in an ocean of activity. What was really interesting to me was to see how animal life was starting to inhabit the system. No one really thought about that.’ Certainly, the exploratory dive appeared to show sea life crawling all over the vessel, seemingly quick to adapt to having the Leona Philpot as part of its environment. How true this would remain for a scaled-up Project Natick, with data centres from multiple companies submerged for years at a time, remains to be seen.
‘There is a minimal amount of heating in the vicinity of the container,’ explains a Microsoft spokesperson, in response to separate concerns about the long-term impact of many increasingly hot underwater data centres. ‘In the ocean, the bounteous energy contained in ocean tides and waves is converted to waste heat with the movement of the water. The goal of Natick is to capture this energy and use it to do useful computing before it is also converted to heat. The result is a minimal amount of nearby heating which is quickly carried away by the current.’
The team is currently planning the project’s next phase, which could include a vessel four times the size of the current container with as much as 20 times the computing power, remaining in the water for at least a year. It appears as though the world’s vast array of undersea communications technology, encompassing everything from the 19th-century transatlantic telegraph cable to modern-day fibre optics, is getting ready to be upgraded once again.