

Showing off DataTank 500kW Racks and 1.4MW Container Units


With SC14 almost behind us, I would like to share a few technical details of what we are currently up to, and how this will eventually lead to a truly scalable and efficient Exascale immersion cooling platform for high performance computing (HPC).

500kW in a Single 19-Inch Rack

What if I told you we took the complete 500kW Immersion-2 facility, all 20 racks and 60 tanks of it, reduced and simplified the mechanical structure further, and put all of it into a single 500kW rack? What if I continued by mentioning that we take six of them and install them in a 40' container data center that we can ship anywhere in the world where inexpensive hydroelectricity is available in abundance? That's where we are today, and we are by no means finished.
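To put that consolidation into numbers, here is a minimal back-of-the-envelope sketch in Python that uses only the figures quoted above; it is an illustration, not an engineering calculation:

```python
# Per-rack density, using only the figures quoted in the text:
# the Immersion-2 facility spread 500 kW across 20 racks (60 tanks),
# while a single new DataTank rack carries the same 500 kW.

immersion2_total_kw = 500    # total load of the original facility
immersion2_racks = 20        # racks in that facility
datatank_rack_kw = 500       # load of one DataTank rack

old_density_kw = immersion2_total_kw / immersion2_racks
print(f"Immersion-2: {old_density_kw:.0f} kW per rack on average")
print(f"DataTank:    {datatank_rack_kw} kW per rack "
      f"({datatank_rack_kw / old_density_kw:.0f}x the per-rack density)")
```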

As you can see, it's at the megawatt scale that the elegance of passive two-phase immersion really shines, and in reality our systems are much more than just cooling (fully integrated networking, high-current power distribution, and so on).

[Image: DataTank 240kW]

Don't expect us to announce more kilowatts per rack anytime soon, but please realize that we are witnessing a paradigm shift away from the old physical limitations, one that shows its true beauty at scale. There is simply no other cooling method that comes close with such a simple implementation, and I wouldn't know what could.

The take-home lesson is that it doesn't matter what hardware you use: immersion tanks are essentially 19-inch "tubs" (think of 19-inch racks laid on their backs), and they don't mind whether you put 100kW, 250kW or 500kW of hardware into them, or what kind of CPU, GPU or ASIC chip does the work.

When 1.4MW becomes 2.8MW

A very important detail of our 2-phase immersion cooling technology is often overlooked. Without any changes, our systems can cool far more than the "advertised" 1.4MW. We currently fill them with 2kW and 3kW power supplies, and we run a cooling loop that is sized for worst-case weather scenarios.

Fill our DataTanks with denser hardware or put them into a more favorable climate (say 25°C) and you double their capacity. The truly limiting factors are physical space and the size of the power feed (400-800A three-phase right now). Of course we are pushing the envelope, but that's what makes working in this field so interesting, doesn't it?
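A rough sketch of that reasoning: a dry cooler rejects heat roughly in proportion to the difference between its water loop and the ambient air. The 55°C loop temperature and 40°C worst-case ambient below are illustrative assumptions of mine, not published DataTank design figures.

```python
# Minimal sketch: dry-cooler capacity scales roughly with the water-to-air
# temperature difference. Loop and worst-case ambient temperatures are assumed.

loop_temp_c = 55.0            # assumed hot-water loop temperature
worst_case_ambient_c = 40.0   # assumed worst-case design ambient
mild_ambient_c = 25.0         # the "more favorable climate" from the text
design_capacity_mw = 1.4      # advertised container capacity

def capacity_at(ambient_c: float) -> float:
    """Scale capacity linearly with the water-to-air temperature difference."""
    design_dt = loop_temp_c - worst_case_ambient_c
    return design_capacity_mw * (loop_temp_c - ambient_c) / design_dt

print(f"At {worst_case_ambient_c:.0f}°C ambient: {capacity_at(worst_case_ambient_c):.1f} MW")
print(f"At {mild_ambient_c:.0f}°C ambient: {capacity_at(mild_ambient_c):.1f} MW")
```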

From ASICs to CPUs to GPUs to Exascale

We are mostly sticking to 19-inch rails for the moment and have been playing with OpenRacks for a while. Immersing standard 19-inch hardware (by removing heatsinks and fans) is certainly possible, but we are actually going further with our so-called "immersion slots" (design guidelines).

At slightly less than 1U high (a bit more than 40mm), they currently hold two power supplies each and up to eight immersion blades at 550-1100W per board. Most clients focus on the oversize board form factor at the moment: four boards of 282.5mm x 400mm, each around 1100W.
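For the board power alone, those figures work out as follows (simple arithmetic on the quoted numbers; PSU load and the number of slots per tank are not included):

```python
# Board power per immersion slot, taken straight from the quoted figures.

blades_per_slot = 8              # standard configuration
blade_power_w = (550, 1100)      # quoted range per board

oversize_boards = 4              # oversize 282.5 mm x 400 mm form factor
oversize_power_w = 1100          # ~1100 W per oversize board

std_min_kw = blades_per_slot * blade_power_w[0] / 1000
std_max_kw = blades_per_slot * blade_power_w[1] / 1000
oversize_kw = oversize_boards * oversize_power_w / 1000

print(f"Standard slot: {std_min_kw:.1f} - {std_max_kw:.1f} kW of boards")
print(f"Oversize slot: {oversize_kw:.1f} kW of boards")
```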

There is actually no height limitation, and some boards stick out of the fluid at the top for interconnects. In the picture below you see boards with a height of 300mm; these are the only pictures I can show without violating non-disclosure agreements.

[Image: prototype immersion slot]

The slots don't need screws or brackets; they simply slide in and out from the top - with or without the system running (hot-swapping). You can pull out either the complete slots with PSUs and boards, or the boards on their own. Connections are usually made from the top, and we have an alternative design using busbars at the bottom of the tanks. What we try to avoid at all costs is cables (just because of the mess), but not all clients are up to the task yet (hint) ;-)

With the ever-growing need for HPC and cloud computing, it would feel extremely counter-productive to me to try to scale up to higher densities or Exascale with anything less than 2-phase immersion. What you see in the picture below is a tank filled with 100 slightly customized Supermicro 2000W power supplies, from which we actually pull much more than their air-cooled rating of 2000W thanks to the changed cooling configuration. We have also removed their fans.

[Image: DataTank 240kW tank filled with power supplies, no motherboards]
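A quick sanity check on that tank; the 240kW total is only inferred from the image caption, so treat it as an assumption rather than a specification:

```python
# Power drawn per supply if the pictured tank of 100 PSUs really carries 240 kW.
psu_count = 100
air_cooled_rating_w = 2000      # nameplate rating under air cooling
tank_power_w = 240_000          # assumed from the "DataTank 240kW" caption

per_psu_w = tank_power_w / psu_count
overdrive = per_psu_w / air_cooled_rating_w - 1
print(f"Per PSU: {per_psu_w:.0f} W ({overdrive:.0%} above the air-cooled rating)")
```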

Prefabricated Containerized Data Centers

For the Bitcoin miners, we are installing six of our tanks into a prefabricated 40' ISO container mining unit called DataTank, which then sits underneath our fully custom open-frame dry cooler container unit. Both are certified to ship internationally and conform to international standards such as UL, IEC and CE. Both our integrated 2500-ampere power distribution busway (add a second one if you need to) and our more than 2.5MW cooling system are seismic zone 4 compliant and modular.
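To get a feel for what a 2500A busway can carry, here is a rough three-phase calculation; the 400V line-to-line voltage and unity power factor are my own assumptions, not figures from this post:

```python
import math

# Approximate three-phase busway capacity: P = sqrt(3) * V_LL * I * PF
line_voltage_v = 400       # assumed line-to-line voltage
busway_current_a = 2500    # quoted busway rating
power_factor = 1.0         # assumed

per_busway_mw = math.sqrt(3) * line_voltage_v * busway_current_a * power_factor / 1e6
print(f"One busway:  ~{per_busway_mw:.2f} MW")
print(f"Two busways: ~{2 * per_busway_mw:.2f} MW")
```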

While we are headquartered in Hong Kong, we certainly don't skimp on materials and engineering expertise: vital parts such as the electrical systems are made in Germany and the US, and only our custom containers are made in China, where probably most of the world's metal boxes are made. Our containers are insulated and built to work in harsh environments, hot or cold, with no difference in cooling performance for the hardware.

[Image: DataTank container being lifted]

For our mining clients, the value of turnkey DataTank systems comes from extremely low costs (<$0.5/W), dramatically reduced construction overhead, a simplified approach to deployment, ultra-high densities, unparalleled cooling and power efficiency, and the reusability of all parts.
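Taken at face value, the <$0.5/W figure gives a simple upper bound for one 1.4MW container (plain arithmetic on the quoted numbers, not a quotation):

```python
# Upper-bound turnkey cost for one container at the quoted <$0.5/W.
cost_per_watt_usd = 0.5
container_capacity_w = 1.4e6
print(f"Upper bound: ${cost_per_watt_usd * container_capacity_w:,.0f} per 1.4 MW container")
```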

Additional benefits of the container design include a modular and portable approach, with repeatable, pre-engineered, prefabricated, and quality-assured building blocks that together bring online the exact hardware and infrastructure needed for efficient mining.

Our mining clients are always in a rush to add new capacity at remote places where hydroelectricity is available at low cost - short turnaround times of less than 12 weeks are very important, as you can imagine.

1.4MW Greenhouse Heater

The heat requirement of a single greenhouse of around 6000 square meters (about the size of one football field, or around 1.5 acres) is just about the same 1.4MW that our DataTank containers "produce", so it is understandable that there are efforts to reuse that heat to grow plants.
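The arithmetic behind that comparison is straightforward (a rough figure only; real greenhouse heating loads depend heavily on climate and covering):

```python
# Heating intensity if one 1.4 MW container heats one ~6000 m2 greenhouse.
container_heat_w = 1.4e6
greenhouse_area_m2 = 6000
print(f"~{container_heat_w / greenhouse_area_m2:.0f} W per square meter of greenhouse")
```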

Heating and electricity are among the biggest costs in the operation of greenhouses across the globe, especially in colder climates. Because greenhouse coverings have low thermal resistance, a great deal of money is spent continually replacing the heat that is lost. Most greenhouses use gas, oil or electric power for heating.


Data centers try to be efficient in this day and age, and there are a few that seem to reuse their waste heat. Unfortunately, it is very difficult to recapture a data center's heat in a meaningful way, since it mostly comes in the form of hot air. Google and Facebook, to name two, pride themselves on having very efficient cooling for their data centers, but very little (if anything at all) of the 260 MW (Google) and 78 MW (Facebook) of total power they consume is reused in any way.

Our DataTank containers work with a closed-loop hot-water "cooling system": think of them as large heaters that can be connected directly to the central heating pipes of greenhouses, offices or schools.

Some of the Bitcoiners go even further and are starting to deploy their mines near places with abundant renewable electricity (hydro or wind). When electricity is produced in excess and can't be moved to the distant regions where it is needed, digital currency is mined instead - and it travels over long distances without physical borders, voltage drops or construction costs. Bitcoins move freely.

It seems that Bitcoin has some use after all - Hong Kong's domestic helpers, who recently started using it to send money back home instead of paying top dollar to the usual money transmitters, would fully agree with me ;-)

Invitation to see DataTank in Action

We'll soon be revealing our DataTank container data centers to the HPC and DC industry. If you'd like to get in touch and look at our systems in person, please feel free to send me an email. We are always looking for fellow engineers to join forces with, and for users of our technology.
