Regardless of how a company is provisioning its underlying IT infrastructure, it’s important to consider a number of new and disruptive technologies. Our data center research has uncovered three technology trends and associated solutions for addressing potential problems: 1) software-defined networking, 2) microservers and 3) IT operations analytics.
1) Software-Defined Networking (“SDN”)
As IT infrastructure moves to network-resident applications, and the amount and variety of data found in networked environments increases, telecommunication and data center resources are becoming more mission critical for enterprises. We call this change the “rising criticality” of IT infrastructure. In light of rising criticality, companies need new ways of provisioning workflows and ensuring business continuity, and software-defined networking has quickly become one of the most talked-about concepts in the telecommunications and data center space.
According to TechTarget, SDN differs from legacy network management in that instead of using hardware to provision each workload, IT teams utilize a software controller to direct packets of data. This way, companies can build more flexibility and scalability into their data center design, two especially key benefits considering the prevalence of server virtualization.
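The control model TechTarget describes can be sketched in a few lines: a central software controller holds one policy and pushes per-device flow rules out to every switch, instead of each box being configured by hand. The classes and rule format below are illustrative stand-ins, not a real SDN API such as OpenFlow.

```python
# Toy model of centralized SDN control. All names here are hypothetical.

class Switch:
    """A forwarding device that only applies rules pushed to it."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # match (destination network) -> action (output port)

    def install_rule(self, dst_net, out_port):
        self.flow_table[dst_net] = out_port

    def forward(self, dst_net):
        # Traffic with no matching rule would normally be punted to the controller.
        return self.flow_table.get(dst_net, "send-to-controller")


class Controller:
    """Central control plane: one policy, pushed to many switches at once."""
    def __init__(self, switches):
        self.switches = switches

    def apply_policy(self, policy):
        # policy maps destination networks to output ports.
        for sw in self.switches:
            for dst_net, port in policy.items():
                sw.install_rule(dst_net, port)


switches = [Switch("edge-1"), Switch("edge-2")]
ctrl = Controller(switches)
ctrl.apply_policy({"10.0.1.0/24": "port-3", "10.0.2.0/24": "port-7"})

print(switches[0].forward("10.0.1.0/24"))    # port-3
print(switches[1].forward("192.168.0.0/16"))  # send-to-controller
```

The point of the sketch is the separation of concerns: the switches stay simple packet-forwarders, and flexibility lives in the controller software, which is what lets a policy change propagate across the whole fabric in one call.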
“The software-defined data center is one where all infrastructure has been virtualized, and the management of that infrastructure is completely controlled by software that is driven by policies,” Doug Bourgeois, former director of the Interior Department’s National Business Center, told GCN. “What this emerging trend is essentially doing is taking what has happened over the last 15 years with server virtualization and bringing that to the network and storage levels. Once these three levels of the infrastructure have been virtualized, the data center becomes orders of magnitude more agile than ever before.”
In order to yield the highest return on investment from SDN, companies need to provision their data centers accordingly. In software-defined environments, standalone physical switches matter less, while hardware designed to support software controllers rises in importance.
2) Microservers
When many people think about data center design, massive mainframes often come to mind. While this legacy IT hardware retains a material market share, microservers are becoming more common in data centers. In comparison to larger equipment, microservers have less computing power, but they are better provisioned for overall speed and flexibility. Instead of storing and powering one large system, it’s possible to install many microservers that complete smaller tasks more rapidly.
“They are designed for very high levels of parallelism, you can have tens of thousands of them chewing away at a problem,” said Dave Chalmers, chief technologist for HP’s enterprise group in EMEA. “Even if each individual core is less powerful than a current generation Xeon, the fact you can have thousands and thousands of them, if the problem is of the right shape, means you can have really efficient use of the technology.”
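The “right shape” of problem Chalmers describes is one that divides into many small, independent units of work. A minimal sketch of that pattern, with a worker pool standing in for a rack of microservers (the function names and chunk sizes are illustrative):

```python
# Split one large job into many small tasks, each sized for a modest core,
# rather than running it as one big task on one large server.
from concurrent.futures import ThreadPoolExecutor

def count_hits(chunk):
    # A deliberately tiny unit of work, suited to a low-power node.
    return sum(1 for n in chunk if n % 7 == 0)

def parallel_count(numbers, workers=8, chunk_size=250):
    # Carve the input into independent chunks and farm them out to the pool.
    chunks = [numbers[i:i + chunk_size]
              for i in range(0, len(numbers), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_hits, chunks))

data = list(range(10_000))
print(parallel_count(data))  # 1429 — same answer as one sequential pass
```

Because each chunk is independent, adding more workers (or more microservers) scales the job with little coordination overhead; a problem that cannot be carved up this way would instead favor one large, fast machine.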
3) IT Operations Analytics
IT teams can effectively oversee increasingly complex workloads with operations analytics. Traditionally, IT departments relied on manual processes to keep tabs on hardware and software, but this error-prone approach has not kept pace with the rising criticality of IT infrastructure. With more money at stake, IT operations analytics have come into focus, enabling organizations to oversee data center environments more effectively and to ensure that change management processes and service-level agreements are met. Data center design may become more complicated, but software can make server rooms easier to manage on a systemic basis.
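One simple form this kind of automated oversight can take is baseline-and-deviation monitoring: software learns normal behavior from historical metrics and flags servers that drift outside it, replacing the manual eyeballing described above. The metric, hostnames, and threshold below are illustrative assumptions, not a specific product’s method.

```python
# Sketch of ops analytics: flag servers whose latest reading sits far
# outside a baseline learned from history. All values are illustrative.
from statistics import mean, stdev

def baseline(history):
    """Learn a simple baseline (mean, standard deviation) from past readings."""
    return mean(history), stdev(history)

def flag_anomalies(readings, history, z_threshold=3.0):
    """Return hosts whose latest reading is more than z_threshold
    standard deviations away from the historical mean."""
    mu, sigma = baseline(history)
    return [host for host, value in readings.items()
            if abs(value - mu) > z_threshold * sigma]

# Past CPU-utilization samples (percent) and the latest per-server readings.
history = [41, 44, 39, 42, 40, 43, 41, 45, 40, 42]
latest = {"db-01": 43, "web-02": 41, "cache-03": 97}

print(flag_anomalies(latest, history))  # ['cache-03']
```

A real deployment would track many metrics per host and use more robust models, but even this small sketch shows why analytics scale where manual checks do not: the software evaluates every server against the baseline on every sample.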