Technology is advancing at an exponential rate, bringing advancements once confined to science fiction novels. The fourth industrial revolution is here, and whether we like it or not, the world we live in is changing rapidly. That leaves us wondering, “What is driving this rapid evolution?” The answer: AI and machine learning.

Step 1: Definition
Artificial Intelligence (AI) describes a computer programmed to think like a human and mimic human actions. Don’t worry, this will not land us in an “Ex Machina” or “Blade Runner” situation, as some people fear. The machine only runs the programs written into its design, so there will be no robot world takeover. Machine learning comes into play when complex algorithms enable the AI to learn from data collected over time and make calculated predictions as a result. This process is how AI becomes efficient at different tasks.

That is the basic concept; in reality, however, the machine is only as good as the data it receives. As the old programmer saying goes, “garbage in… garbage out.” Before you can make use of AI and machine learning in your data center, someone needs to understand the software and how it relates to the work environment you wish to optimize. Then, the developer must train a machine learning system to recognize the patterns it needs to know and how they apply to the task at hand. A simple example would be a data input task: the machine would “watch” a data analyst complete data inputs over a period of time, learn the patterns necessary to input the data correctly, and then be able to do the job effectively. Technologies are already available to assist in collecting this data; products such as intelligent rack-level Power Distribution Units (PDUs) have built-in controllers that can monitor and control power to individual outlets and incorporate features that allow connections to other PDUs, environmental sensors, and peripherals.
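To make the “watching and learning” idea concrete, here is a minimal, hypothetical sketch of pattern learning from observed data entry. The example data, labels, and the character-class “shape” heuristic are all illustrative assumptions, not a real product’s method; a production system would use a far richer model.

```python
from collections import Counter, defaultdict
import re

# Hypothetical training data: raw values the analyst entered,
# paired with the field each value was filed under.
observed = [
    ("192.168.0.12", "ip_address"),
    ("10.0.4.7",     "ip_address"),
    ("rack-14",      "rack_id"),
    ("rack-02",      "rack_id"),
    ("450W",         "power_draw"),
    ("600W",         "power_draw"),
]

def shape(value: str) -> str:
    # Reduce a value to a coarse pattern: digit runs -> N, letter runs -> A.
    s = re.sub(r"\d+", "N", value)
    return re.sub(r"[A-Za-z]+", "A", s)

# "Watch" the analyst: count which label each shape received.
patterns = defaultdict(Counter)
for value, label in observed:
    patterns[shape(value)][label] += 1

def predict(value: str) -> str:
    # Route a new value to the most frequent label for its shape.
    counts = patterns.get(shape(value))
    return counts.most_common(1)[0][0] if counts else "unknown"

print(predict("172.16.9.3"))  # shape N.N.N.N -> "ip_address"
print(predict("rack-31"))     # shape A-N     -> "rack_id"
```

The point is the workflow, not the model: collect labeled observations over time, learn the regularities, then apply them to new inputs, exactly as described above.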

Step 2: Implementation
These technologies could ease the transition from a traditional data center to an AI-driven, or “lights out,” approach. By leveraging digital mapping engines integrated into existing camera software and overlaying design/construction data, materials, or schedules, this technology could streamline inventory management and scheduling. A company could present product and inventory information from multiple warehouses relevant to a specific project location, streaming information from a single point of hard data directly to an operations engineer’s phone or tablet. It would also help seamlessly incorporate remote site monitoring, enabling the tracking of live equipment operational status, alarms, personnel and equipment location, deliveries, and other useful information.

The current downside of this ever-expanding technology is that AI, ironically, uses an enormous amount of processing power. The Department of Energy estimates that data centers account for about 2 percent of total US electricity usage. Worldwide, data centers consume about 200 terawatt-hours of power per year, and experts predict that number will grow significantly over the next decade. Some say that by 2030, computing and communications technology will consume between 8 and 20 percent of the world’s electricity, with data centers accounting for a third of that.
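A quick back-of-envelope check puts those projections in perspective. The only figure below not quoted above is a rough world electricity total of 23,000 TWh per year, which is my assumption, not from the article.

```python
# Today's share, using the ~200 TWh/yr data center figure quoted above
# and an assumed world total of ~23,000 TWh/yr.
world_twh = 23_000
datacenter_twh = 200
today_share = datacenter_twh / world_twh * 100
print(f"Data centers today: ~{today_share:.1f}% of world electricity")

# 2030 projection quoted above: tech consumes 8-20% of world electricity,
# with data centers accounting for a third of that.
low_dc_share = 8 / 3
high_dc_share = 20 / 3
print(f"Data centers by 2030: ~{low_dc_share:.1f}% to ~{high_dc_share:.1f}%")
```

In other words, even the low end of the projection implies data centers roughly tripling their share of world electricity within a decade.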

Approximately 10% of data center operating expenditure (opex) is power, with estimates of a 15% increase into 2021. One major reason is the amount of heat generated by the computers running these advanced programs, which requires significant resources to be committed to keeping the facility cool. In the past, businesses have relied on a variety of cooling strategies or chosen locations with lower ambient temperatures.

Now, however, AI can help with this as well. In 2018, Google, one of the largest data center operators on the planet, turned control of its data center cooling over to its DeepMind AI, resulting in a 40% decrease in energy used for cooling. Specifically, Google trained the AI on the ratio of total building energy usage to the energy used by IT equipment. Two other AIs were trained to predict the building’s future temperature and pressure over the following hour. This tracked the environment in which the computers would be running and enabled the systems to optimize cooling and energy consumption for the specific workloads running at any given time. The recommended optimizations were fed back to the data center’s control system, which implemented the changes to ensure optimum operating parameters were met.
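The ratio described above is the industry’s standard efficiency metric, Power Usage Effectiveness (PUE): total facility energy divided by IT equipment energy. The sketch below computes PUE and pairs it with a deliberately naive rule-based stand-in for the learned models; this is an illustration of the feedback loop, not Google’s actual system, and the setpoint values are invented for the example.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    # PUE = total facility energy / IT equipment energy.
    # 1.0 is the theoretical ideal (zero overhead for cooling, lighting, etc.).
    return total_facility_kwh / it_equipment_kwh

def recommend_setpoint(predicted_temp_c: float) -> float:
    # Hypothetical policy standing in for the predictive models: when the
    # forecast room temperature is comfortably low, relax the cooling
    # setpoint to save energy; otherwise tighten it.
    if predicted_temp_c < 24.0:
        return 20.0   # relax cooling
    return 18.0       # tighten cooling

# Example: 1200 kWh total for 1000 kWh of IT load -> PUE 1.2,
# i.e. 20% overhead beyond the IT equipment itself.
print(pue(1200.0, 1000.0))
print(recommend_setpoint(22.5))
```

In the real deployment, the recommendation step is a learned model rather than a threshold rule, but the loop is the same: predict the near-future environment, choose an action, and feed it back to the control system.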

Step 3: The Future
It is time to modernize the way businesses utilize and control their data. Are you ready to let AI and machine learning take the wheel? If you haven’t decided yet, it is something I urge you to consider. It seems a necessity for the future, an evolution of modern business standards. In the next article, I will discuss how data centers can draw guidance from other industries to make the best use of this growing trend and apply AI across every aspect of operations.