Artificial intelligence (AI) is no longer the stuff of science fiction. It has become accessible enough that industries ranging from health care to marketing recognize it could help them meet their goals.
It also makes sense to use AI in specialized facilities. It is already being applied in data centers to improve how they run, and more use cases will likely emerge. Here are some examples of why AI is a good fit for data centers.
Artificial intelligence could make data centers less prone to cybersecurity threats
Due to the number of clients that may have leased equipment in a single data center, cyberattacks on those facilities could have exceptionally severe consequences. One attack could bring down the websites of hospitals, hotel booking sites, e-commerce stores and more.
It could even affect some nations’ governments. One relatively recent case happened to the data center associated with a Central Asian country. Security analysts believe the hack allowed the culprits to access sensitive data for several months, and that those involved proceeded to insert malicious scripts on the country’s websites.
AI can help detect unusual activity that may indicate an attempted attack on a data center. The fact that AI gets smarter with use is especially helpful, since an average of 40 new vulnerabilities were discovered daily in 2017. Humans still need to analyze an AI tool's findings and decide how to proceed, but the technology reduces the chances that data center vulnerabilities get overlooked.
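The core idea behind this kind of monitoring is baselining: learn what "normal" activity looks like, then surface outliers for a human analyst to review. A minimal sketch of that idea, using a robust statistical outlier test on hypothetical failed-login counts (the data, metric, and threshold are illustrative, not any vendor's actual method):

```python
from statistics import median

def flag_anomalies(samples, threshold=3.5):
    # Modified z-score based on the median absolute deviation (MAD),
    # which stays robust to the very outliers we want to detect.
    med = median(samples)
    mad = median(abs(x - med) for x in samples)
    if mad == 0:
        return []  # no spread at all; nothing stands out
    return [x for x in samples if 0.6745 * abs(x - med) / mad > threshold]

# Hypothetical failed-login counts per minute; the spike stands out.
counts = [4, 5, 3, 6, 4, 5, 4, 97, 5, 3]
print(flag_anomalies(counts))  # → [97]
```

A production system would apply the same pattern to far richer signals and learned models, but the workflow is identical: flag the outlier, then hand it to a person for judgment.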
AI helps data centers save on cooling costs
Keeping data centers cool enough is crucial to protect the equipment inside. It’s not surprising that managers investigate ways to depend on AI to cut expenses related to data center climate control.
Even at this early stage, Google's efforts of this kind are going well. Starting in 2016, Google worked on an AI cooling system that recommended adjustments to a data center's temperature, which human operators then confirmed. Keeping people in the loop let them verify that the AI's conclusions were correct.
A couple of years later, Google improved that approach by taking humans out of the loop. How does the AI work? It analyzes data from thousands of physical sensors and assigns a confidence rating to each possible climate-control adjustment. Actions with low confidence are automatically eliminated.
According to Google, the system showed an average of 30 percent improvement in energy savings after only a matter of months — and the company expects it to improve over time.
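The selection step described above can be sketched in a few lines: score every candidate adjustment, discard the ones below a confidence floor, and act only on what remains. The action names, energy figures, and thresholds here are hypothetical, not Google's actual system:

```python
def choose_action(candidates, min_confidence=0.8):
    """Pick the lowest-energy action among those meeting a confidence floor."""
    viable = [a for a in candidates if a["confidence"] >= min_confidence]
    if not viable:
        return None  # keep current settings / escalate to a human
    return min(viable, key=lambda a: a["predicted_kw"])

# Hypothetical candidate adjustments with predicted energy draw.
actions = [
    {"name": "raise_setpoint_1C", "predicted_kw": 410, "confidence": 0.93},
    {"name": "slow_fans_10pct",   "predicted_kw": 395, "confidence": 0.55},
    {"name": "open_economizer",   "predicted_kw": 402, "confidence": 0.88},
]
print(choose_action(actions)["name"])  # → open_economizer
```

Note that the cheapest option (slowing the fans) is rejected because the model is unsure about it; the confidence filter is what makes fully automated control safe enough to deploy.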
AI can make service outages less likely
Outages can be extremely costly for data center operators, but AI can help make downtime less problematic. A survey of 200 companies showed that downtime results in combined losses surpassing $26.5 billion, with the cost of a network outage reaching approximately $7,900 per minute.
Beyond the financial damage, outages can hurt customer perceptions. People expect websites to work consistently and load within seconds, and when they don't, users become disgruntled. That is especially likely when they rely on websites for time-sensitive needs, such as buying holiday presents or snagging concert tickets for an internationally known artist before they sell out.
AI can spot patterns that suggest an increased likelihood of an outage. It then alerts data center workers and recommends they intervene before the downtime happens. Some AI platforms even make automatic adjustments in real time as conditions change, offering another safeguard against issues.
San Francisco–based AdeptDC uses AI technology to cut down on outages similarly to the way Google depends on it to manage climate control. Instead of only measuring temperature-related aspects, the technology tracks performance stats by collecting data from components such as server power sources and fans. Thanks to that information, companies can make better decisions about when to take care of maintenance before equipment fails.
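The pattern-spotting described above often comes down to trend projection: fit a line to recent telemetry and estimate when a metric will cross an alarm threshold, so staff can intervene first. A toy version under assumed conditions (one reading per minute, hypothetical inlet temperatures and limit):

```python
def minutes_until_limit(readings, limit):
    """Fit a least-squares line to `readings` and project when it crosses
    `limit`. Returns None if the trend is flat or moving away from it."""
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    denom = sum((x - x_mean) ** 2 for x in xs)
    slope = sum((x - x_mean) * (y - y_mean)
                for x, y in zip(xs, readings)) / denom
    if slope <= 0:
        return None
    return (limit - readings[-1]) / slope

# Hypothetical inlet temperatures (°C), one per minute, creeping upward.
temps = [24.0, 24.2, 24.4, 24.6, 24.8]
eta = minutes_until_limit(temps, limit=27.0)
print(round(eta))  # → 11 (minutes left at the current rate of increase)
```

Real platforms combine many such signals (fan speed, power draw, humidity) in learned models, but the payoff is the same: maintenance gets scheduled before equipment fails rather than after.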
AI supplements or reduces on-site staff
AI can also assume some of the responsibilities normally handled by on-site staff. That option gave rise to the term AIOps, short for artificial intelligence for IT operations.
Applying AIOps in a data center connects all aspects of a facility's infrastructure, giving much-needed context to current conditions and showing how they compare with past performance metrics.
CBRE, which manages more than 800 data centers, is training a cloud-based AI tool to recognize the normal operating conditions of its facilities. If the technology works as intended, this project could cut down on the staff members needed at a given site or allow data centers to operate with no one present. Operations could then remain more consistent without requiring worker input.
Other efforts to deploy unstaffed facilities are underway, including Microsoft's Project Natick, an underwater data center designed to function for years without frequent maintenance.
Modern data centers are getting progressively larger, which means people have to think creatively when choosing the appropriate sites to build them. Using AI to achieve less dependence on human workers might allow more deployments in remote areas that people cannot easily reach.
Lots of progress likely on the horizon
Some of the projects mentioned here are still in their preliminary stages, so people should not limit their thinking about what is possible.
It’s clear that AI offers substantial potential for improved data center management. The results could benefit operators and clients alike, and help brands that use them gain a competitive advantage in an increasingly crowded marketplace.