Analytics are prone to frequent data errors, and deploying new analytics is slow and laborious. The strategic value of analytics is widely recognized, but the turnaround time of analytics teams typically can't support the decision-making needs of executives coping with fast-paced market conditions. Perhaps it is no surprise that the average tenure of a CDO or CAO is only about 2.5 years.
When internal resources fall short, companies outsource data engineering and analytics. This is where the loss of control begins. There's no shortage of consultants who will promise to manage the end-to-end lifecycle of data, from integration to transformation to visualization.
The challenge is that data engineering and analytics are incredibly complex. Large enterprises integrate hundreds or thousands of asynchronous data sources into a web of pipelines that flow into visualizations and purpose-built databases that support self-service analysis.
Outsourcing doesn't eliminate complexity; it just relocates the responsibility for it. Ensuring that data is available, secure, correct, and fit for purpose is neither simple nor cheap. Companies end up paying outside consultants enormous fees while still suffering the effects of poor data quality and lengthy cycle times.
In one outsourcing case, a company hired a partner consulting firm that recruited young engineers and integrated them into customer account teams. Data skills are in high demand, so the consulting firm turned into a revolving door for talent: junior engineers grew into senior engineers and were out the door within 18 to 24 months. When an especially gifted data engineer decided to leave, the team was left scrambling to support routine functions.
The dynamic nature of the consulting team meant that architectural decisions made at the data engineering level were often short-sighted and incoherent. The company incurred technical debt as consultants grafted one manually driven exception process onto another to adapt to evolving business requirements. Over time, the complexity grew to such an extent that no one person understood the whole system. The sales team at the consulting firm then proposed that a bigger budget was needed to keep the data factory churning out enterprise-critical analytics.
The data requirements of a thriving business are never complete. There is an endless stream of new data sources to integrate, exceptions to manage and requests for new charts, graphs and dashboards. To cope with this ever-growing complexity, the company had to hire more and more consultants each year to engineer and analyze the data. The bill reached tens of millions of dollars annually.
Eventually, the company in our example found a way out of its bind. It adopted a method called DataOps, which reduced its consulting budget while simultaneously improving the responsiveness of its team and the quality of its analytics. DataOps improves the robustness, transparency and efficiency of data workflows through automation. Take data integration as an example. Previously, the consulting team had relied on a patchwork of hand-run ETL jobs to consolidate data from disparate sources into a data lake. DataOps converted these manual processes into automated orchestrations that required human intervention only when an automated alert detected that a data source had missed its delivery deadline or failed its quality tests. In data analytics, automated orchestrations can handle data operations, testing, observability, data integration and all manner of data pipelines. DataOps can also automate analytics development processes such as the creation of sandbox environments, provisioning of test data, quality assurance and deployment. When a job is automated, there is little advantage to outsourcing it.
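To make this concrete, here is a minimal sketch of that kind of orchestration check in Python. The file path, delivery deadline, key column and row-count threshold are hypothetical stand-ins for illustration; a real pipeline would run under an orchestration tool and route alerts to the team's on-call channel rather than a log.

```python
import csv
import logging
from datetime import datetime, timezone
from pathlib import Path

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("orchestration")

# Hypothetical source feed and thresholds (assumptions for illustration only).
SOURCE_FILE = Path("/data/landing/daily_sales.csv")
DELIVERY_DEADLINE_HOUR_UTC = 6   # feed must land by 06:00 UTC each day
MIN_EXPECTED_ROWS = 1000         # fail if the feed looks truncated
KEY_COLUMN = "order_id"          # assumed key column that must never be blank

def source_arrived_on_time(path: Path) -> bool:
    """Alert if the source file is missing or landed after its deadline."""
    if not path.exists():
        log.error("ALERT: %s missed its delivery deadline (file not found)", path)
        return False
    modified = datetime.fromtimestamp(path.stat().st_mtime, tz=timezone.utc)
    # Simplified check: assumes a fresh file is delivered every day.
    if modified.hour >= DELIVERY_DEADLINE_HOUR_UTC:
        log.error("ALERT: %s landed late, at %s UTC", path, modified.time())
        return False
    return True

def passes_quality_tests(path: Path) -> bool:
    """Run simple quality tests: minimum volume and no blank key values."""
    with path.open(newline="") as f:
        rows = list(csv.DictReader(f))
    if len(rows) < MIN_EXPECTED_ROWS:
        log.error("ALERT: %s has only %d rows, expected at least %d",
                  path, len(rows), MIN_EXPECTED_ROWS)
        return False
    if any(not row.get(KEY_COLUMN) for row in rows):
        log.error("ALERT: %s contains rows with a blank %s", path, KEY_COLUMN)
        return False
    return True

def run_ingestion_step() -> None:
    """Load the source into the data lake only if all checks pass."""
    if source_arrived_on_time(SOURCE_FILE) and passes_quality_tests(SOURCE_FILE):
        log.info("Checks passed; loading %s into the lake", SOURCE_FILE)
        # load_to_lake(SOURCE_FILE)  # the actual load step, omitted here

if __name__ == "__main__":
    run_ingestion_step()
```

The point is not these specific checks but the shape of the workflow: the human touchpoint shrinks to responding to alerts, and everything else runs unattended.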
When consultants market their services, they like to position themselves as the smartest people in the room. They may subtly imply that the internal data team is less adept and behind the times. They may be right or wrong, but it doesn't matter. The focus on star engineers is a red herring. In DataOps, the process is more important than the person.
At DataKitchen we have a staff of talented data engineers who provide DataOps services to customers. As skilled as they are, without DataOps methods they would not have nearly as much positive impact on our customers. The productivity gains that organizations see flow from the DataOps process itself: workflow agility, automation, testing and observability enable data teams to work quickly and confidently. DataOps enables our clients to rapidly deliver impactful insights to their enterprise, helping them stay focused on adding value to business customers.
DataOps enables data analytics workflows to be performed inexpensively via automated orchestrations that are developed and managed in-house. Automation frees up both direct and indirect resources. This may mean reducing the consulting budget. It definitely means redeploying internal and outsourcing budgets to higher value-add activities.
DataOps also restores an enterprise's control over its intellectual property. It transfers knowledge of the company's precious data processes out of the heads of individual data experts and into orchestration scripts and code, which are stored and maintained in source control. When a star data engineer moves on, they leave behind a robust and repeatable process, in the form of code, that can be maintained and improved by others.
DataOps automation enables companies to redeploy their own staff and reduce their dependency on external resources. If your company spends millions on consulting fees and outside contractors, DataOps automation could make a significant contribution to the bottom line.