There were many other considerations too, not least of which is the role of data in AI and how failure to provide the right data, in the right amounts, will ultimately lead to failure. The problem, though, is about more than just the quantity of data and feeding it into the right systems; it is also about using data in an enterprise setting, which even in ideal circumstances is problematic.
Mario Blandini is CMO and chief evangelist at Santa Clara, Calif.-based Tintri. He pointed out that traditional enterprise storage environments do not provide, or make easily accessible, the full scope of data, leaving out pivotal data that could drive the business. Most rely on standard infrastructure, where quality of service (QoS), the set of technologies used to manage bandwidth usage as data crosses computer networks, must be actively monitored, groomed and formatted as data comes in. With ever-growing infrastructure to manage and not enough technicians to keep up, the result is an inability to access a complete picture of computing and storage data in real time.
Enterprise storage has also evolved to include virtualized workloads and commodity flash storage, ultimately revolutionizing virtual desktop infrastructure along with local and shared storage infrastructure. Many existing storage solutions cannot run virtual workloads efficiently, placing idle data on costly space and making valuable data harder to access. As a result, enterprises that do not utilize AIOps are inadvertently hindering their own success.
AIOps uses machine learning to complete tasks that better the business — such as making automatic recommendations to customers with a similar order history or automatically processing and analyzing data for predictive maintenance. AIOps bridges the gap by providing enterprises with the tools they need to easily access all their data and drive their business towards success. “Intelligent infrastructure helps organizations employ AI to automate infrastructure operations and advance real-time and predictive application analytics, which allows organizations to get the most value from their data, maintain the most efficient environment and adhere to best practices,” he said.
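One of the AIOps tasks mentioned above, recommending to customers based on similar order histories, can be illustrated with a minimal sketch. The customer names, product-category vectors and the cosine-similarity approach here are all hypothetical, chosen only to show the general idea; production systems would use a real recommendation engine over far larger data.

```python
from math import sqrt

# Hypothetical order histories: customer -> purchase counts per product category.
orders = {
    "alice": [3, 0, 1, 2],
    "bob":   [2, 0, 1, 3],
    "carol": [0, 4, 0, 0],
}

def cosine(a, b):
    """Cosine similarity between two order-count vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def most_similar(customer):
    """Find the customer with the closest order history, whose purchases
    could then seed automatic recommendations."""
    return max(
        (c for c in orders if c != customer),
        key=lambda c: cosine(orders[customer], orders[c]),
    )
```

Here `most_similar("alice")` would return `"bob"`, since their category counts point in nearly the same direction, while `"carol"` buys from an entirely different category.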
This is clearly a concern for many enterprises as they look for the best ways to monetize their data through AI or machine learning (ML). Gartner too has noted this and pointed to the growing trend of I&O leaders searching for more agile integrated systems. Last year, Gartner predicted that worldwide integrated systems revenue would hit $12.3 billion in 2018. Integrated systems are combinations of server, storage and network infrastructure, sold with management software that facilitates the provisioning and management of the combined unit. They can be used to derive the benefit of an architected design and deployment of integrated compute, storage and memory infrastructure to support digital business.
This is not the only problem. Keeping systems up to date once they have been deployed is also a major issue, which AI can help resolve. AI and data science are all the rage, said Chris Bergh, CEO of Cambridge, Mass.-based DataKitchen, but there is a problem that no one talks about. “Machine learning tools are evolving to make it faster and less costly to develop AI systems. But deploying and maintaining these systems over time is getting exponentially more complex and expensive,” he said. “Data science teams are incurring enormous technical debt by deploying systems without the processes and tools to maintain, monitor and update them. Further, poor quality data sources create unplanned work and cause errors that invalidate results.”
While AI and data science tools improve the productivity of model development, the actual ML code is a small part of the overall systems solution. Data science teams that don't apply modern software development principles to the data lifecycle can end up with poor quality and technical debt that causes unplanned work, rendering all the effort behind AI deployments counterproductive.
There is a solution to this. DataOps offers a new approach to creating and operationalizing AI that minimizes technical debt, reduces cycle time and improves code and data quality. It is a methodology that enables data science teams to thrive despite increasing levels of complexity required to deploy and maintain AI in the field.
The orchestration of the development, deployment, operations and monitoring tool chains dramatically simplifies the daily workflows of the data science team. Without the burden of technical debt and unplanned work, they can focus on their area of expertise: creating new models that help the enterprise realize its mission.
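The DataOps idea described above, pairing each pipeline stage with automated checks so bad data fails fast instead of silently invalidating results, can be sketched in a few lines. The stage names, sample rows and validation rules below are illustrative assumptions, not any vendor's actual API.

```python
# Minimal sketch of a DataOps-style pipeline: validation sits between
# stages so poor-quality source data is caught before it creates
# unplanned work downstream.

def ingest():
    # Stand-in for reading from a real data source.
    return [{"id": 1, "value": 10.0}, {"id": 2, "value": 12.5}]

def validate(rows):
    # Automated data tests: fail loudly rather than propagate bad data.
    assert rows, "ingest produced no rows"
    assert all(r["value"] >= 0 for r in rows), "negative values found"
    return rows

def transform(rows):
    # Example transformation step; scaling factor is arbitrary here.
    return [{**r, "value_scaled": r["value"] / 100} for r in rows]

def run_pipeline():
    return transform(validate(ingest()))
```

In practice each stage would also be versioned and monitored, but even this skeletal form shows the orchestration principle: tests travel with the data, not just with the code.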
There is one further problem that the use of large data pools creates and which AI is in a position to resolve. As enterprises continue to connect internal systems to realize better internal operational efficiencies and deliver better customer experiences, they are further exposing their sensitive data to criminal hackers who can gain access to all those systems from a single weak access point, Eyal Elyashiv, CEO of Washington DC-based Cynamics, told CMSWire. “AI is widely regarded to be the next weapon in the battle between hackers and the cybersecurity solutions that are built to protect against them, and if enterprises aren't leveraging AI to enable full visibility into their network as they become much larger in scale it will leave them vulnerable to the ransomware attacks that are plaguing both the public and private sector,” he said.
AI is essential to offer cost-effective cybersecurity protection for enterprises at the scale they will soon require. Aside from cybersecurity needs, AI will also be helpful in locating areas of the network to optimize performance in real time, which will help reduce the overhead and operational costs of the business.
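The network visibility described above often starts with baselining: learning what normal traffic looks like and flagging sharp deviations. The sketch below uses a simple three-sigma rule over hypothetical per-minute byte counts; real AI-driven tools use far richer models, so treat this only as an illustration of the principle.

```python
import statistics

def flag_anomalies(baseline, new_samples, sigma=3.0):
    """Flag traffic samples that deviate from the learned baseline by
    more than `sigma` standard deviations. Baseline and samples are
    hypothetical per-minute byte counts."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [s for s in new_samples if abs(s - mean) > sigma * stdev]

# Normal traffic hovers around 100; a sudden spike stands out.
baseline = [100, 102, 98, 101, 99, 100, 103, 97]
```

With this baseline (mean 100, standard deviation 2), a sample of 104 passes unremarked while a spike to 250 is flagged, the kind of signal that might precede a ransomware or exfiltration investigation.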
As a final thought, most enterprises do not need to pursue AI strategies, Jesse Rio Russell, president of Madison, Wis.-based Big Picture Research and Consulting, told us. He said that what enterprises need to do is to avoid focusing on any one approach, or any one platform, or any one new technology.
Instead, he said, they should focus on building a culture of data: one where everyone in the enterprise can speak a common language around how data gets used in the company, a language for how data intersects with their own work, and a language for how data helps the enterprise serve its customers better. “A culture of data takes data out of the back room down the hall where mysterious data wizards work, and spreads the power of data to everyone,” he said. “Once this kind of culture of data exists, then a company can start thinking about what technology to invest in, what platform to use, what questions to answer. Does every enterprise need AI — the answer is no. And only enterprises with an existing culture of data should even consider it.”