We are rapidly hurtling toward a reality in which more than a trillion smart, often autonomous devices are connected through a global network we call the Internet of Things (IoT). These devices are being deployed everywhere: on factory floors, in shopping malls, airports, railway stations, and hospitals, and in self-service kiosks.
This is creating massive amounts of data at the edge and starting to redefine the way we store, manage, secure, and make sense of it. The reality is that, today, only about 1 percent of IoT data actually gets analyzed and converted into business intelligence.
Meanwhile, in the data center and the public cloud, companies are struggling to manage very complex environments and to gain visibility not just into their data but also into the threats against it. No matter the environment, whether data center, public cloud, or edge, it is all just beyond human control.
As a result, machine learning (ML) and artificial intelligence (AI) are becoming critical for enterprises to remain operationally and economically viable and, most importantly, more customer-oriented than ever. Data has driven personalization for years, but with ML and AI there is an exponential opportunity to micro-personalize experiences, products, and services in ways we never thought possible. Just imagine if your bank could proactively support you from the moment you open your account to when you buy your first car or first home or start a business, caring for you across the entire life cycle of the relationship based on your digital interactions. This will be possible when AI and ML perfect the art of recognizing patterns and behaviors and accelerate the value chain from conceptualization to delivery.
The challenge is: how do you go about implementing and monetizing this value?
The best place to start is your own data center, an environment that is easier to control and protect. There is often a perception that the public cloud is the best starting point because of its flexibility and scalability. The reality is that many companies are realizing that not all data can live in a public cloud, and that the pay-per-use model carries higher costs as data and consumption continue to ramp up. The trend now is to swing back to on-premises data centers for inference and to leverage the public cloud for learning.
As companies come back on premises, it's a great opportunity to rethink how they can modernize their data centers for ML and AI. Data powerhouses, such as finance and manufacturing companies, are emerging as the first to move in this direction and, I believe, they are in a great position to experiment, and succeed, with ML and AI.
Experience and research show that a successful AI project depends mostly on how companies address four dimensions, or challenges: data sourcing, infrastructure, talent, and budget.
Data Quality & Sourcing
The very definition of machine learning is the ability to learn from data and to progressively improve knowledge and autonomy by recognizing patterns and behaviors. As obvious as it may sound, a machine learning project will only be as good as its data; the sources, quality, and relevance of that data to the business objectives are therefore fundamental.
This helps explain why so few companies are deploying ML and AI despite the benefits, and why, for example, there is only one known successful chatbot in the financial services industry (FSI) in Asia, despite the buzz: businesses struggle to understand which data, after training and inference, will generate the expected outcome.
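To make the data-quality point concrete, here is a minimal sketch of the kind of audit worth running before any training begins. The record fields and the checks themselves are illustrative assumptions, not a reference to any particular platform:

```python
def audit_records(records, required_fields):
    """Toy data-sourcing audit: before training, count how many records
    are complete, are exact duplicates, or are missing required fields."""
    seen = set()
    report = {"total": len(records), "complete": 0,
              "missing_fields": 0, "duplicates": 0}
    for rec in records:
        key = tuple(sorted(rec.items()))  # canonical form for duplicate detection
        if key in seen:
            report["duplicates"] += 1
            continue
        seen.add(key)
        if all(rec.get(f) not in (None, "") for f in required_fields):
            report["complete"] += 1
        else:
            report["missing_fields"] += 1
    return report

# Illustrative customer records with one duplicate and one gap:
records = [
    {"id": 1, "segment": "retail", "spend": 120},
    {"id": 1, "segment": "retail", "spend": 120},   # exact duplicate
    {"id": 2, "segment": "", "spend": 75},          # missing segment
    {"id": 3, "segment": "corporate", "spend": 900},
]
print(audit_records(records, required_fields=["id", "segment", "spend"]))
# {'total': 4, 'complete': 2, 'missing_fields': 1, 'duplicates': 1}
```

A report like this does not fix the data, but it quantifies the gap between the data you have and the data the business objective needs, which is exactly where most projects stall.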
Modernization of Infrastructure and Re-architecture of the Data Center
Another fundamental aspect as companies embark on an AI and ML project is digital readiness. These technologies imply a fundamental change in the way we architect the data center, distributing sensors and machines so that the data can be accessed, prepared, trained on, and validated, and remain secure all the while.
At the same time, an AI and ML system implies changes to the network and security framework. Can the network keep up with the number of connections machine learning needs? Is the security framework ready to address a distributed model? These questions matter if an ML and AI project in the data center is to work.
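The network question can at least be bounded with back-of-envelope arithmetic. A minimal sketch, where the device counts and message sizes are illustrative assumptions rather than benchmarks:

```python
def required_gbps(devices, msgs_per_sec, bytes_per_msg):
    """Aggregate ingest rate the data-center network must sustain,
    assuming uniform traffic and ignoring protocol overhead."""
    return devices * msgs_per_sec * bytes_per_msg * 8 / 1e9

# Example: 100,000 edge sensors, each sending one 1 KB reading per second:
print(round(required_gbps(100_000, 1, 1024), 2))  # 0.82 (Gbps)
```

Even this crude estimate is useful: it tells you whether the existing fabric has headroom before you commit to a sensor rollout, and how the requirement scales if message frequency or payload size grows.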
Talent and Skills
Technology is evolving much faster than talent with matching skills arrives in the marketplace. Universities and curricula are adapting, but not yet enough to move the needle. This has created a massive talent problem for most companies as they look to adopt new technologies and digitize. ML and AI are certainly no exception, requiring specialized know-how that, quite frankly, is almost nowhere to be found, and expensive when you do find it.
The solution is, first, to work with your IT vendor, which understands the technologies and can provide support. Second, look at your existing talent and retrain it. In doing so, keep in mind that you need two different groups of skills: one that can actually manage and operate ML and AI, such as data engineers and data scientists, and another that caters to them by keeping the infrastructure ready.
Budget
Budget constraints are often cited as the reason to move data and workloads to the cloud. But what companies save on infrastructure and IT resources, they may end up spending on outsourcing, new talent, or even bandwidth.
For the same reason, companies shy away from modernizing their data centers and investing in AI and ML. The reality is that it is not that expensive. Companies are willing to invest five to ten million dollars in a digital marketing initiative, yet just a few hundred thousand dollars is enough to demonstrate the value of an on-premises AI system that delivers product recommendations, personalizes experiences, and even predicts attrition by analyzing digital customer conversations.
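As a sketch of what the attrition use case looks like at its very simplest, the example below scores customer conversations against hand-picked warning phrases. The phrases and weights are invented for illustration; a production system would learn these signals from labeled conversation history rather than hard-coding them:

```python
# Hypothetical warning phrases and weights, chosen for illustration only.
ATTRITION_SIGNALS = {
    "close my account": 4,
    "cancel": 3,
    "competitor": 3,
    "too many fees": 2,
    "disappointed": 2,
}

def attrition_score(conversation):
    """Sum the weights of every warning phrase found in the text."""
    text = conversation.lower()
    return sum(w for phrase, w in ATTRITION_SIGNALS.items() if phrase in text)

def at_risk(conversations, threshold=4):
    """Flag conversations whose combined signal meets the threshold."""
    return [c for c in conversations if attrition_score(c) >= threshold]

chats = [
    "Hi, I'd like to update my mailing address.",
    "I'm disappointed; there are too many fees on this account.",
    "How do I close my account? Your competitor offers more.",
]
print(at_risk(chats))  # flags the second and third conversations
```

Replacing the hand-written dictionary with a model trained on real outcomes is precisely the few-hundred-thousand-dollar proof of value described above, and it can run entirely on-premises against data that never leaves the company.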
It is often said that this is the data economy. But is it, really, until ML and AI help us make sense of the data? Perhaps not. And there is no better place to start than your own data center.