Who said the data center needs a foundation? Sun Microsystems was clearly thinking outside the box, or maybe all about the box, when it came up with Project Blackbox.
Whatever its thought processes, the systems vendor is clearly bullish about its data-center-in-a-box concept. Not only is it hauling a 20-foot shipping container around the nation to roadshow its wares, it is now also extending the tour and taking the large black container all the way to Europe. Clearly, Sun believes it has found a winner.
"Project Blackbox is a high-density, low-cost data center configured in an enhanced 20' x 8' shipping container for ease and speed of deployment," says Maurice Cloutier, senior product manager of Project Blackbox. "It is aimed at customers who are running out of space, need to minimize their investment, ease the pain of building new data centers, add a DR [disaster recovery] site quickly or lower power consumption."
Fully configured, the unit contains eight standard 19-inch racks, providing almost 300 rack units for servers, storage and networking gear. Each rack can handle up to 25 kW. Fully loaded, the container weighs about 20,000 pounds.
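As a rough sanity check, the article's per-rack figures can be tallied into container-wide totals (the per-rack-unit split below is derived, not stated by Sun):

```python
# Back-of-the-envelope totals for a fully loaded container.
# Figures from the article: 8 racks, up to 25 kW each,
# almost 300 rack units total.
RACKS = 8
KW_PER_RACK = 25
TOTAL_RACK_UNITS = 300  # "almost 300" per the article

max_power_kw = RACKS * KW_PER_RACK          # 200 kW total payload
units_per_rack = TOTAL_RACK_UNITS / RACKS   # about 37.5 U per rack
kw_per_unit = max_power_kw / TOTAL_RACK_UNITS

print(max_power_kw, units_per_rack, round(kw_per_unit, 2))  # 200 37.5 0.67
```

The 200 kW total is the same maximum payload figure Sun cites later when sizing the chiller.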
"We looked at what would be the biggest chunk of computing we could build that could be organized in repeatable modules and would remain cost effective," says Cloutier. "Project Blackbox can be used for temporary or permanent deployments."
He estimates net prices could range from $1.5 million for a container filled with 250 Sun Fire X2100 servers (each using one dual-core AMD Opteron Model 180 processor) to $5 million for one loaded with 14 Sun Fire E2900 servers (each with 12 dual-core 1.8GHz UltraSPARC IV+ CPUs).
[Image: A Project Blackbox Container]
The version Sun unveiled for the Los Angeles leg of the tour, however, was only partially loaded. It contained a Sun Blade 8000 unit with 10 blades, two Netra T2000 telecom servers, a StorageTek S310 NAS appliance, one Sun Fire E2900 server, one Netra V210 server, one Sun Fire V490 server, one StorageTek 6140 and a combination of 15 more Sun Fire servers (V490s and X2100s).
"All you need are a relatively flat surface, a source of power, chilled water and network connections," says Cloutier.
The Blackbox data center has a two-inch water feed that brings in water chilled to 55 degrees Fahrenheit at 40 to 60 gallons per minute. Two power panels provide redundancy in case the organization decides to hook the container up to two separate power grids. Alternatively, a generator can power the unit; 208-volt AC power is required.
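Those water figures line up with the container's maximum payload. Using the standard chilled-water rule of thumb, Q [BTU/hr] ≈ 500 × flow [gpm] × ΔT [°F], the loop's heat-carrying capacity can be estimated; the supply temperature and flow rate come from the article, but the assumed return temperature below is illustrative, not a figure from Sun:

```python
# Approximate heat removal of the chilled-water loop, using the
# rule of thumb Q [BTU/hr] ~= 500 * flow [gpm] * delta_T [F].
BTU_PER_HR_PER_KW = 3412

def heat_removal_kw(gpm: float, delta_t_f: float) -> float:
    """Approximate heat a chilled-water loop can carry away, in kW."""
    return 500 * gpm * delta_t_f / BTU_PER_HR_PER_KW

# 60 gpm at a 55F supply; an assumed ~78F return gives delta T = 23F.
print(round(heat_removal_kw(60, 23), 1))  # ~202.2 kW
```

Under that assumed return temperature, the loop carries roughly 200 kW, matching the container's 8-rack, 25-kW-per-rack maximum.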
Cabling is neatly arranged overhead. When a rack is moved into the aisle for maintenance, the cabling cradle folds over so wires are easy to connect and disconnect.
The first rack is known as the control rack. The display model contained networking gear, a power distribution unit and a dehumidifier. The default networking connections are Ethernet (four ports), although Fibre Channel, iSCSI and InfiniBand are also supported.
According to Cloutier, Sun is willing to sell the container itself without Sun servers. Customers can buy the customized unit, complete with the cooling system, for about half a million dollars. The deal is sweetened for Sun server customers: although final prices have not been set, the cost of the container, when filled with Sun servers, is expected to come down to $300,000 to $400,000.
Sun builds the Blackboxes in a factory in Oregon. It also plans to make them in Scotland to service the European market. An Asian manufacturing location is yet to be decided. Cloutier says that orders can be filled within 90 days. Although some have already been deployed by early access customers, it will not be in general release until July.
Probably the most striking facet of Blackbox is its cooling design. Traditional room cooling uses large coolers at the perimeter that pump air under the floor that then comes up via perforated tiles into the cold aisles. Recently, vendors such as APC-MGE of West Kingston, RI, and Liebert Corp of Columbus, Ohio, introduced supplemental cooling systems that place cooling units either beside or above the racks.
Sun's cooling concept is completely different: air flows within the container in a closed loop. There are two sets of doors. The first opens the box itself; the other encloses the aisle between the two rows of racks.
[Image: Sun's Closed-Loop Airflow Design]
"Air flows in a circular path with fans and heat exchangers between each rack," says Cloutier.
Racks don't face each other. Instead, they are turned sideways, with a heat exchanger between each pair. Cold air enters the front of one rack and hot air flows out the back, through a heat exchanger and into the next rack, and so on down the four racks on one side. The air leaving the last rack is then directed across the container and flows back down the other side, through four more racks and heat exchangers.
The required chiller size depends on the payload. For a maximum payload of 200 kW, a 60-ton chiller is required. Rector believes this cooling design is at least 20 percent more energy efficient than a standard computer room AC system.
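The chiller sizing can also be checked against the payload figure: one ton of refrigeration is defined as 12,000 BTU/hr of heat removal, or about 3.517 kW.

```python
# Sanity-check the chiller sizing: 1 ton of refrigeration
# = 12,000 BTU/hr of heat removal, about 3.517 kW.
KW_PER_TON = 12000 / 3412  # ~3.517 kW per ton

chiller_capacity_kw = 60 * KW_PER_TON   # 60-ton chiller from the article
payload_kw = 200                        # maximum payload from the article

print(round(chiller_capacity_kw))  # ~211 kW
```

A 60-ton chiller thus delivers roughly 211 kW of cooling, about 5 percent of headroom over the 200 kW maximum payload.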
Gaskets prevent the air from escaping into the aisle. In the event of a water leak, the water drains to the floor and cannot reach the servers, which also sit six inches off the floor in case of external flooding. Sun has developed environmental sensors for Project Blackbox called Sun SPOTs (Small Programmable Object Technology). These sensors detect water, temperature, airflow and other factors, adjust the system accordingly, and send alarms about potential threats.
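The monitoring behavior described amounts to threshold checking with alarms. The sketch below illustrates that kind of logic in generic terms; every name, threshold and value in it is a hypothetical illustration, not the actual Sun SPOT API:

```python
# Generic sketch of threshold-and-alarm logic of the sort such
# environmental sensors implement. All names and thresholds are
# hypothetical illustrations, not the Sun SPOT API.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str   # e.g. "temperature_f", "water_leak", "airflow_cfm"
    value: float

# Illustrative (low, high) acceptable bounds per sensor type.
THRESHOLDS = {
    "temperature_f": (50.0, 90.0),
    "water_leak": (0.0, 0.0),                # any nonzero value is a leak
    "airflow_cfm": (400.0, float("inf")),
}

def alarms(readings):
    """Return the readings that fall outside their acceptable bounds."""
    tripped = []
    for r in readings:
        lo, hi = THRESHOLDS[r.sensor]
        if not (lo <= r.value <= hi):
            tripped.append(r)
    return tripped

sample = [Reading("temperature_f", 95.0), Reading("water_leak", 0.0)]
print([r.sensor for r in alarms(sample)])  # only the hot reading trips
```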
Jonathan Eunice, an analyst at Illuminata in Nashua, N.H., believes modular computing on this scale is the future of IT. He sees a wealth of possible uses: a container could be stored in a warehouse and quickly transported to a disaster site, for example, and military organizations might airlift one in to support remote operations.
"Blackbox truly introduces a new kind of module for data center construction," says Eunice.