A data center (or datacenter) is a facility composed of networked computers and storage that businesses and other organizations use to organize, process, store, and distribute large amounts of data. Typically, a business relies heavily on the applications, services, and data contained in a data center, making it a focal point and essential asset for daily operations.
How do Data Centers work?
Data centers are not a single thing, but rather a conglomeration of elements. At a minimum, data centers serve as primary repositories for all kinds of IT equipment, including servers, storage subsystems, network switches, routers and firewalls, and the physical cabling and racks used to organize and interconnect that equipment. A data center must also contain adequate infrastructure, such as power distribution and supplemental power subsystems. This includes electrical switching, uninterruptible power supplies, backup generators, data center ventilation and cooling systems, such as in-row cooling setups and computer room air conditioners, and adequate provisioning for network carrier (telco) connectivity. All of this requires a physical facility with physical security and sufficient floor space to house the infrastructure and equipment.
What is data center consolidation?
A business is not limited to a single data center, and modern organizations may use two or more data center facilities across multiple sites for greater resiliency and better application performance, reducing latency by locating workloads closer to users.
Conversely, a company with multiple data centers may choose to consolidate them, reducing the number of locations to minimize the cost of IT operations. Consolidation typically occurs during mergers and acquisitions, when the acquiring company no longer needs the data centers owned by the acquired company.
What is data center colocation?
Organizations can also pay a fee to rent server space in a colocation facility. Colocation is an attractive option for organizations that want to avoid the significant capital expenditure of building and maintaining their own data centers. Today, colocation providers are expanding their offerings to include managed services, such as interconnectivity, allowing customers to connect to the public cloud.
Because many providers now offer managed services alongside their colocation facilities, the definition of managed services has become blurred, as each provider markets the term slightly differently. The important distinction to make is this:
- Colocation — The organization pays a vendor to house its hardware in a facility. The customer pays only for the space.
- Managed Services — The organization pays a vendor to actively maintain or monitor the hardware in some way, whether through performance reporting, interconnectivity, technical support, or disaster recovery.
Data center size and tiers
Data centers are not defined by their physical size or style. Small businesses can operate successfully with a few servers and networked storage arrays in a closet or small room, while large IT companies, such as Facebook, Amazon or Google, can fill an enormous warehouse space with IT equipment and data center infrastructure. In other cases, data centers can be assembled in mobile facilities, such as shipping containers, also known as “data centers in a box,” which can be moved and deployed as needed.
However, data centers can be defined by different levels of reliability or resilience, sometimes called “data center tiers.” In 2005, the American National Standards Institute (ANSI) and the Telecommunications Industry Association (TIA) published ANSI/TIA-942, “Telecommunications Infrastructure Standard for Data Centers,” which defines four tiers of data center design and implementation guidelines. Each subsequent tier is intended to provide more resilience, security and reliability than the previous one. For example, a Tier 1 data center is little more than a server room, while a Tier 4 data center offers redundant subsystems and high security.
The design and architecture of a Data Center
While it is conceivable that almost any suitable space could serve as a “data center,” the deliberate design and implementation of a data center requires careful consideration. Beyond the basic questions of cost and taxes, sites are selected based on a multitude of criteria, such as geographic location, seismic and weather stability, access to roads and airports, availability of power and telecommunications, and even the prevailing political climate.
Once a site is secured, the data center architecture can be designed with attention to the mechanical and electrical infrastructure, as well as the composition and layout of the IT equipment. All of these decisions are driven by the availability and efficiency goals of the desired data center tier.
Energy consumption and energy efficiency
Data center design also recognizes the importance of energy efficiency. A simple data center may need only a few kilowatts of power, but an enterprise-scale installation may demand tens of megawatts or more. Today, the green data center, designed for minimal environmental impact through the use of low-emission building materials, catalytic converters and alternative energy technologies, is increasingly popular.
Data centers can also maximize efficiency through their physical layout, using a method known as the hot aisle/cold aisle layout. Server racks are arranged in alternating rows, with cold-air intakes facing one direction and hot-air exhausts facing the other. The result is alternating hot and cold aisles, with the exhausts forming the hot aisles and the intakes forming the cold aisles. The exhausts are directed toward the air conditioning equipment, which is often placed between server cabinets in the row or aisle and distributes cold air back into the cold aisle. This configuration of air conditioning equipment is known as in-row cooling.
Organizations often measure the energy efficiency of data centers using a metric called power usage effectiveness (PUE), which represents the ratio of the total power entering the data center to the power used by the IT equipment. The rise of virtualization has since enabled much more productive use of IT equipment, resulting in much higher efficiency, lower energy consumption and reduced energy costs. Metrics such as PUE are no longer central to energy efficiency goals, but organizations can still measure PUE and use comprehensive power and cooling analytics to better understand and manage energy efficiency.
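As a quick illustration of the ratio, the sketch below computes PUE from two meter readings; the figures are hypothetical and not taken from any real facility.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT equipment power.
    A value of 1.0 would mean every watt entering the facility reaches the IT load."""
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: the facility draws 1,500 kW in total,
# of which 1,000 kW reaches servers, storage and network gear.
print(pue(1500.0, 1000.0))  # 1.5 -- i.e., 0.5 W of overhead for every watt of IT load
```

Lower values are better; power distribution losses and cooling overhead are what push the ratio above 1.0.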
Data Center Security and Safety
Data center design should also implement sound safety and security practices. For example, safety is often reflected in the layout of doorways and access corridors, which must accommodate the movement of large, unwieldy IT equipment as well as allow employees to access and repair the infrastructure.
Fire suppression is another key area of safety, and the extensive use of sensitive, high-energy electrical and electronic equipment precludes the use of ordinary sprinklers. Instead, data centers often use environmentally friendly chemical suppression systems, which effectively starve a fire of oxygen while mitigating collateral damage to the equipment. Because the data center is also a critical business asset, comprehensive security measures, such as badge access and video surveillance, help detect and prevent malfeasance by employees, contractors and intruders.
Management and monitoring of Data Center infrastructure
Modern data centers make intensive use of monitoring and management software. This software, which includes data center infrastructure management (DCIM) tools, allows remote IT administrators to monitor the facility and equipment, measure performance, detect failures and implement a wide range of corrective actions, without ever physically entering the data center room.
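As a rough illustration of the kind of check such monitoring software automates, the sketch below polls a hypothetical sensor endpoint and flags racks whose inlet temperature crosses a threshold. The URL, JSON fields and threshold are all assumptions for the example, not any particular product’s API.

```python
import json
from urllib.request import urlopen

# Hypothetical DCIM-style check: poll a temperature feed exposed over HTTP
# and flag any rack whose cold-aisle inlet temperature crosses an alert threshold.
SENSOR_URL = "http://dcim.example.internal/api/rack-temperatures"  # assumed endpoint
ALERT_THRESHOLD_C = 27.0  # assumed inlet temperature limit

def check_rack_temperatures() -> list[str]:
    with urlopen(SENSOR_URL, timeout=5) as response:
        # Assumed payload shape: [{"rack": "A1", "inlet_c": 24.5}, ...]
        readings = json.load(response)
    return [r["rack"] for r in readings if r["inlet_c"] > ALERT_THRESHOLD_C]

if __name__ == "__main__":
    for rack in check_rack_temperatures():
        print(f"ALERT: rack {rack} inlet temperature above {ALERT_THRESHOLD_C} C")
```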
The development of virtualization has added another important dimension to data center infrastructure management. Virtualization now abstracts servers, networks and storage, allowing every computing resource to be organized into pools without regard to physical location. Administrators can then provision workloads, storage instances and even network configurations from these common resource pools. When administrators no longer need the resources, they can return them to the pool for reuse. Because all of this server, network and storage virtualization can be implemented in software, the term “software-defined data center” has gained currency.
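A minimal sketch of the pooling idea is shown below, using made-up resource names and capacities; real virtualization platforms expose far richer APIs, but the provision-then-release cycle is the same.

```python
class ResourcePool:
    """Toy model of a pooled resource (e.g., vCPUs or gigabytes of storage)."""

    def __init__(self, name: str, capacity: int):
        self.name = name
        self.available = capacity

    def provision(self, amount: int) -> bool:
        # Hand out capacity if the pool can cover the request.
        if amount <= self.available:
            self.available -= amount
            return True
        return False

    def release(self, amount: int) -> None:
        # Return capacity to the pool for reuse.
        self.available += amount

# Hypothetical pools aggregated from many physical hosts and storage arrays.
cpu_pool = ResourcePool("vCPU", capacity=512)
storage_pool = ResourcePool("storage_gb", capacity=200_000)

# Provision a workload, then hand the resources back when it is retired.
if cpu_pool.provision(8) and storage_pool.provision(500):
    print("workload provisioned")
cpu_pool.release(8)
storage_pool.release(500)
```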
Data Center vs. Cloud
Data centers are also increasingly implementing private cloud software, which builds on virtualization to add a layer of automation, user self-service and billing/chargeback to data center administration. The goal is to enable individual users to provision workloads and other computing resources on demand, without intervention from IT administrators.
It is also increasingly possible for data centers to interface with public cloud providers. Platforms such as Microsoft Azure emphasize the hybrid use of on-premises data centers with Azure or other public cloud resources. The result is not the elimination of data centers, but rather the creation of a dynamic environment in which organizations can run workloads locally or in the cloud, and move those instances to or from the cloud as they see fit.
History
The origins of the first data centers date back to the 1940s and the earliest computer systems, such as ENIAC (Electronic Numerical Integrator and Computer). These early machines were complex to maintain and operate, and required a multitude of cables connecting all the necessary components. They were also used by the military, meaning that specialized computer rooms with racks, cable trays, cooling mechanisms and access restrictions were needed to accommodate all the equipment and implement the appropriate security measures.
However, it was not until the 1990s, when computer operations began to grow more complex and inexpensive networking equipment became available, that the term “data center” first appeared. It became possible to house all of a company’s necessary servers in a single room on company premises. These specialized computer rooms were dubbed “data centers” within companies, and the term grew in popularity.
During the dotcom bubble of the late 1990s, businesses’ need for fast internet connectivity and a constant internet presence required larger facilities to house the growing amount of networking equipment. It was around this time that data centers became popular and began to resemble those described above.
Throughout the history of computing, as computers became smaller and networks larger, the data center evolved and transformed to accommodate the necessary technology of the time.