Google Data Center: Repurposed Facility, Data Center Sustainability
Google boasts some of the most advanced, innovative data centers in the industry. Take, for example, the company’s facility in Hamina, Finland: a former paper mill turned data center that uses the surrounding seawater as a cooling source, making it one of the most efficient facilities in the world.
However, the search engine giant did not always house its IT equipment in state-of-the-art buildings. PC Mag recently reported on the “Google cage,” the company’s first data center, located in Santa Clara, Calif. Urs Hölzle, one of Google’s earliest employees, explained that the organization’s first facility measured only seven feet by four feet. The “cage,” as it was known, held around 30 PCs and sat in one of the first data center colocation facilities in the area.
Hölzle said the cage included the main servers that built and supported the Google search index, along with another set of servers that “were the crawl machines.” At first, the company purchased 2 megabits of bandwidth, which cost around $1,200 per megabit each month. That was enough capacity for roughly one million searches per day, although Google didn’t reach that volume until months after the purchase.
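A back-of-envelope sketch puts those figures in perspective. Using only the numbers reported in the article (2 megabits at roughly $1,200 per megabit per month, about one million daily searches), and assuming, hypothetically, a fully utilized link averaged over 24 hours:

```python
# Back-of-envelope math on the early "Google cage" bandwidth purchase.
# Figures from the article: 2 megabits at ~$1,200/megabit/month,
# enough for roughly one million searches per day. The per-search
# number assumes (hypothetically) the link is fully used, averaged
# over a 24-hour day.

megabits = 2
cost_per_megabit_per_month = 1200
monthly_cost = megabits * cost_per_megabit_per_month  # total monthly bill

searches_per_day = 1_000_000
seconds_per_day = 86_400
bits_per_day = megabits * 1_000_000 * seconds_per_day  # 2 Mbit/s sustained
bytes_per_search = bits_per_day / searches_per_day / 8

print(f"Monthly bandwidth bill: ${monthly_cost:,}")               # $2,400
print(f"Average bandwidth budget per search: {bytes_per_search / 1024:.1f} KiB")
```

That works out to about $2,400 a month and a budget of roughly 21 KiB of traffic per search, which illustrates how lean early web search results had to be.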
Google has come a long way since its first data center cage, and now operates several facilities across North America, Asia and Europe. Nor is Google the only company to have reused an existing structure, adapting it to its own purposes, as the organization did with its Finnish data center.
TechTarget’s Erin Watkins recently spoke with Douglas Alger, author of the new book “The Art of the Data Center.” Alger said one of the most interesting data centers he saw when writing the book was a facility in Barcelona, Spain, which is housed in a former chapel built in the 1920s.
In addition to repurposing structures to house data centers, a few other trends are being widely adopted across the sector. Alger noted that power density levels in today’s facilities are rising, driven mainly by more powerful computing systems.
Another trend concerns data center temperature and cooling arrangements. Alger pointed out that industry standards now call for higher temperatures in the server room, and IT equipment “doesn’t seem to be getting coddled as much as it was in the past.” Current hardware systems are built to withstand a wider range of conditions. Even so, eliminating the excess heat these systems create remains a priority: a number of data centers are already using liquid cooling, or have plans to retrofit for it in the future.
“[T]he big bow that goes on top is that data centers can be green,” Alger said. “This doesn’t mean they won’t consume anything, because they will. But people are finding ways to optimize everything from the back-of-house systems – what’s happening with power and cooling – to the hardware that they’re choosing, so they’re able to get a lot more productive work done with consuming fewer resources.”