Every DataOps initiative starts with a pilot project. How do you choose a project that matters to people?
DataOps addresses a broad set of use cases because it applies workflow process automation to the end-to-end data-analytics lifecycle. DataOps reduces errors, shortens cycle time, eliminates unplanned work, increases innovation, improves teamwork, and more. Each of these improvements can be measured and iterated upon.
These benefits are hugely important for data professionals, but if you made a pitch like this to a typical executive, you probably wouldn’t generate much enthusiasm. Your data consumers are focused on business objectives. They need to grow sales, pursue new business opportunities, or reduce costs. They have very little understanding of what it means to create development environments in a day versus several weeks. How does that help them “evaluate a new M&A opportunity by Friday”?
If you pitch DataOps in terms of its technical benefits, an executive or co-worker might not understand its full potential value. Instead, explain how agile and error-free analytics serves the organization’s mission. What would it mean to monetize data more effectively than competitors? Data is the modern business decision apparatus (just ask Google, Target, Amazon, or Facebook). DataOps enables companies to rapidly assess and pursue opportunities, avoid strategic mistakes, and shrink time-to-market. What would it mean for a company to lead its industry in savvy and business agility? When discussing a DataOps initiative with an executive or colleague, focus on their top business objective and find a project related to it. Impactful DataOps projects are those that help colleagues and executives pursue their objectives. Below, we suggest some additional, unconventional approaches to finding high-visibility DataOps projects.
Find Unhappy Analytics Users
A strained relationship between the data team and users can point to a potential DataOps pilot project. A data team with unhappy users is ripe for transformational change. You may instinctively wish to turn away from grumbling users, but you should be thankful for them. The more vocal and unhappy the customers are, the bigger the opportunity to turn the situation around and bring high-impact improvements to the broadest possible group. A large community of dissatisfied customers is also likely to be a higher priority for managers and executives. Ask your unhappy customers or colleagues what concerns them most about the data-analytics team. User discontent is often expressed as feelings and observations; user surveys can organize and quantify those anecdotes into actionable priorities. The list of possible issues is long, but you might hear feedback that includes:
- Data science/engineering/analytic teams do not deliver the insight that the business customers need.
- The data team takes too long to deliver analytics.
- Users mistrust the data itself or the team working on the data.
- Stakeholders have hired consultants or shadow teams to do data work.
Be Grateful for Negative Feedback
Negative feedback often stems from deep, underlying issues. The data team may not deliver relevant analytics because business users and data analysts are isolated from each other. Users may mistrust data and analytics because of errors. When business units hire their own data analysts, it’s a sign that they are underserved; they may feel that the data organization is not addressing their priorities.
User feedback may feel concrete to the people giving it, but as a data professional, you will have to translate it into metrics. For example, “we don’t trust the data” may seem abstract and not directly actionable. Try measuring your errors per week. If you can show users that the number is falling, you can rebuild trust. A test-coverage dashboard can illustrate progress in quality controls. Demonstrating your success with data can help gradually win over detractors. What other problems have eroded trust? You may need to look for more than one contributing factor.
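To make the errors-per-week idea concrete, here is a minimal sketch of one way to compute it, assuming a hypothetical error_log.csv export with one row per production data error and a detected_at timestamp column; your own incident or ticketing data will look different.

```python
# Minimal sketch: turn raw error events into an errors-per-week trend.
# Assumes a hypothetical error_log.csv with a "detected_at" timestamp
# per production data error; adapt to your own incident log.
import pandas as pd

errors = pd.read_csv("error_log.csv", parse_dates=["detected_at"])

# Count errors per week so the team (and its users) can watch the trend.
errors_per_week = (
    errors
    .set_index("detected_at")
    .resample("W")
    .size()
    .rename("errors")
)

print(errors_per_week.tail(8))                      # recent trend to share with users
print("4-week average:", errors_per_week.tail(4).mean())
```

A simple chart of this series, refreshed automatically, is often enough to start the trust conversation with skeptical users.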
In many organizations, data follows a complex path from raw sources to the processed analytics that create value. It crosses data centers, teams, and organizational boundaries, and errors can creep in anywhere along that path. What are the historical drivers of issues and errors? Which teams own each part of the process? A lack of responsiveness also squanders trust, so measure how quickly teams respond to errors and requests.
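Responsiveness can be quantified in the same spirit. The sketch below assumes a hypothetical tickets.csv export with team, opened_at, and resolved_at columns; substitute whatever your ticketing system actually provides.

```python
# Minimal sketch: gauge responsiveness from a hypothetical tickets.csv
# export with "team", "opened_at", and "resolved_at" columns.
import pandas as pd

tickets = pd.read_csv("tickets.csv", parse_dates=["opened_at", "resolved_at"])
tickets["hours_to_resolve"] = (
    (tickets["resolved_at"] - tickets["opened_at"]).dt.total_seconds() / 3600
)

# Median resolution time per owning team highlights where trust is leaking.
print(tickets.groupby("team")["hours_to_resolve"].median().sort_values())
```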
Another common user complaint is that data-analytics teams take too long to deliver requested features. The time required to deliver analytics can be expressed in a metric called cycle time. Benchmark how quickly you can deploy new ideas or requests into production. To reduce cycle time, examine the data science/engineering/analytics development process. For example, how long does it take to create a development environment? How up to date and well governed are those environments?
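Cycle time is straightforward to compute once you record when a request arrives and when it reaches production. Here is a minimal sketch, assuming a hypothetical requests.csv with requested_at and deployed_at dates for each analytics request that shipped.

```python
# Minimal sketch: benchmark cycle time from a hypothetical requests.csv
# with "requested_at" and "deployed_at" dates per delivered request.
import pandas as pd

requests = pd.read_csv("requests.csv", parse_dates=["requested_at", "deployed_at"])
requests["cycle_days"] = (requests["deployed_at"] - requests["requested_at"]).dt.days

# Median and 90th percentile tell a fuller story than the average alone.
print("median cycle time (days):", requests["cycle_days"].median())
print("90th percentile (days):  ", requests["cycle_days"].quantile(0.9))
```

Tracking this number release after release gives business users a concrete answer to "why does everything take so long?" and gives the data team a target to drive down.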
Creating a Feedback Loop of Trust
As DataOps improves trust in data and the data team’s responsiveness, business users will naturally begin to work more closely with the data team. As the team becomes more agile, interaction with users grows in importance. DataOps focuses on delivering value to customers in short, frequent iterations, and the value that business users receive from each iteration reinforces the value of working together. Enterprises that adopt DataOps frequently observe richer and more frequent communication and collaboration between users and the data team. This positive feedback loop of collaboration and value creation encourages users and data professionals to invest in working closely together. In the end, the quality of collaboration that DataOps fosters becomes the engine that takes an organization to new heights.