Over the past seven decades, the data center industry has seen significant innovation. Today's facilities leverage best-in-class technologies to support a growing range of computing services. The sector first emerged nearly 70 years ago, when IBM partnered with American Airlines to build a passenger reservation system that automated one of the airline's critical operational departments. Since then, the industry has seen considerable innovation and development in a number of key areas, including how data centers are constructed and where services are deployed.
The Early Years
One of the earliest recognized data centers was established in 1946 to house the Electronic Numerical Integrator and Computer (ENIAC), built at the request of the U.S. Army. The machine's original purpose was to calculate artillery firing tables, and it is widely regarded as the first general-purpose electronic digital computer.
The first system to leverage transistors and diodes arrived in 1954, when construction of the first transistorized computer (TRADIC) was completed. Another major innovation came in the 1960s, when IBM developed one of the original mainframe system series, which would prove to be a milestone in computing.
The 1970s: A decade of innovations
The 1970s were a decade of considerable progress in the data center industry, kicking off in 1971 with Intel's introduction of the 4004 processor. The 4004 has been identified as the first widely available general-purpose programmable processor, and it became a "building block" for engineers as they began customizing software for a range of devices.
Then, in 1977, the industry saw the establishment of the first LAN: ARCnet. The system was initially deployed at Chase Manhattan Bank and could support data rates of 2.5 Mbps, connecting up to 255 computer workstations to the network.
The decade wrapped with the introduction of the first commercial disaster recovery service from SunGard in 1978.
The 1990s: The first recognizable data centers appear
During the 1990s, the computing industry saw the establishment of some of the first structures considered early representations of today's data centers. In the early 1990s, former mainframe computer rooms began filling with microcomputers, or servers, and were first dubbed data centers. Soon after, a number of businesses started building their own internal server rooms.
The industry saw a significant shift in the mid '90s with the dot-com era, which boosted customer demand for network and connectivity solutions. The first virtual PCs and enterprise applications appeared at this time.
Today's data centers: Sustainability and high density
Many of the systems relied upon in data centers of past decades seem rudimentary compared to the computing capabilities of today's facilities. Many current operators have an increased focus on data center sustainability and have introduced new technology for reducing power consumption and improving facilities' power usage effectiveness (PUE) ratings. Additionally, the industry now follows several standards for electricity consumption, green technologies and sustainability, including guidelines from the Uptime Institute, LEED and ASHRAE.
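PUE is simply the ratio of a facility's total energy draw to the energy consumed by its IT equipment alone, so a perfectly efficient facility would score 1.0. The sketch below illustrates the calculation; the power figures are hypothetical, not measurements from any real data center.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return power usage effectiveness: total facility power / IT power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1,500 kW total draw, 1,000 kW of it powering IT gear.
# The remaining 500 kW goes to cooling, lighting and power distribution losses.
print(pue(1500.0, 1000.0))  # 1.5
```

Lowering the cooling and distribution overhead is what pushes a facility's PUE closer to the ideal of 1.0.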
Another key change to recently take hold in the industry is high-density power design, like that utilized in HP's Colorado data center. That facility can support more than 200 watts of critical power load per square foot of space.
Many of today’s facilities also leverage modular data center designs, such as that deployed at Purdue University to support research efforts. The institution uses an HP Performance Optimized Data Center due to space and power constraints on campus.