A Summary of Gartner’s Recent DataOps Report and Recommendations

Nov 1, 2019 | Blog, DataOps Principles

A summary of Gartner’s recent DataOps report, with recommendations for further reading:

Gartner Research “Introducing DataOps Into Your Data Management Discipline”

Published: 31 October 2019 ID: G00376495

Analyst(s): Ted Friedman, Nick Heudecker

Summary

“To relieve bottlenecks and barriers in delivery of data and analytics solutions, organizations need to … introduce DataOps techniques in a focused manner[; by doing so,] data and analytics leaders can affect a shift toward more rapid, flexible and reliable delivery of data pipelines.”

Key Challenges

  1. Organizations struggle to achieve the “speed and reliability of project delivery they desire” because of “too many roles, too much complexity and constantly shifting requirements” – see our post on analytics at Amazon speed: https://www.datakitchen.io/high-velocity-data-analytics-with-dataops.html

  2. “In most organizations, this complexity is exacerbated by limited or inconsistent coordination across the roles involved in building, deploying and maintaining data pipelines.” At DataKitchen we’ve written a lot about what collaboration means in DataOps – intra-team coordination: DataOps Teamwork (Aug 2019); inter-team coordination: Warring Tribes (April 2019) and Centralization vs. Freedom (Oct 2018)

  3. “Data and analytics leaders often have difficulty determining the optimal pace of change when introducing new techniques.” This is a challenge we capture in our recent high-level views: What is DataOps? and What is DataOps – Top Ten Questions

Recommendations:

  1. “Enable greater reliability, adaptability and speed by leveraging techniques from agile application development and deployment (DevOps) in your data and analytics work”: https://www.datakitchen.io/dataops-vs-devops.html

  2. Increased deployment frequency – rapid and continuous delivery of new functionality: https://www.datakitchen.io/analytics-at-amazon-speed.html

  3. Automated testing – don’t test manually; manual testing creates an error fiesta (see the sketch after this list): https://medium.com/data-ops/dataquality/home

  4. Version control – tracking changes across all participants in data pipeline delivery: https://medium.com/data-ops/the-best-way-to-manage-your-data-analytics-source-files-7559d48db693 and https://medium.com/data-ops/how-to-enable-your-data-analytics-team-to-work-in-parallel-e0cb2a5c2289

  5. Monitoring – constantly tracking the behavior and testing of pipeline(s) in production (the same kind of automated checks, run continuously; see the sketch after this list): https://medium.com/data-ops/how-data-analytics-professionals-can-sleep-better-6dedfa6daa08

  6. Collaboration across all stakeholders – “… essential to speed of delivery.” Enable collaboration across key roles (data engineer/scientist/visualization/governance, etc.) by including them in a common process: Warring Tribes (April 2019) and Centralization vs. Freedom (Oct 2018)

  7. Complex data pipelines can involve all of these roles – and a lack of consistent communication and coordination across them adds time and introduces errors: DataOps Teamwork (Aug 2019)
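Recommendations 3 and 5 are two sides of the same habit: run the checks automatically on every pipeline execution instead of inspecting results by hand, both before deployment and continuously in production. The sketch below is a minimal, hypothetical illustration in plain Python and pandas; the file name, columns, and thresholds are invented for the example and are not DataKitchen’s product API:

```python
import pandas as pd

# Hypothetical data quality checks that run on every pipeline execution,
# in CI before deployment and again against production data.

def check_row_count(df: pd.DataFrame, min_rows: int) -> None:
    # Guard against an upstream feed that silently delivered too little data.
    assert len(df) >= min_rows, f"expected at least {min_rows} rows, got {len(df)}"

def check_no_null_keys(df: pd.DataFrame, key_column: str) -> None:
    # Every row must carry a usable join key.
    nulls = int(df[key_column].isna().sum())
    assert nulls == 0, f"{nulls} rows have a null {key_column}"

def check_value_range(df: pd.DataFrame, column: str, low: float, high: float) -> None:
    # Catch obviously broken values (e.g., negative totals) before they reach a dashboard.
    out_of_range = df[(df[column] < low) | (df[column] > high)]
    assert out_of_range.empty, f"{len(out_of_range)} rows outside [{low}, {high}] in {column}"

if __name__ == "__main__":
    # "orders.csv" and its columns are placeholders for whatever your pipeline produces.
    orders = pd.read_csv("orders.csv")
    check_row_count(orders, min_rows=1000)
    check_no_null_keys(orders, key_column="order_id")
    check_value_range(orders, column="order_total", low=0, high=1_000_000)
    print("All data quality checks passed.")
```

The design point is that the same checks gate a code change in development and raise an alarm in production, so errors surface in minutes rather than in a customer’s report.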

Introducing these DataOps capabilities

“To avoid introducing too much change too quickly, data and analytics leaders can focus on a subset of the steps in the value chain, rather than immediately introducing new approaches in every step executed by every role.”

Analytics teams need to move faster, but cutting corners invites problems in quality and governance. How can you reduce the cycle time to create and deploy new data analytics (data, models, transformations, visualizations, etc.) without introducing errors? The answer lies in finding and eliminating the bottlenecks that slow down analytics development: https://www.datakitchen.io/whitepaper-dataops-bottlenecks.html
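One way to picture “finding and eliminating the bottlenecks” is to replace manual handoffs between pipeline steps with an orchestrated sequence in which each step’s output is verified automatically before the next step starts. The sketch below is a hypothetical, tool-agnostic illustration in plain Python; the step names, data, and checks are invented for the example and are not DataKitchen’s API:

```python
from typing import Callable

# Hypothetical three-step pipeline: a verification function gates each handoff,
# so no step waits on a manual review before work flows downstream.

def extract() -> list:
    # Placeholder for pulling data from a source system.
    return [{"order_id": 1, "order_total": 42.0}]

def transform(rows: list) -> list:
    # Placeholder transformation: derive a new column from an existing one.
    return [{**row, "order_total_eur": row["order_total"] * 0.9} for row in rows]

def publish(rows: list) -> None:
    # Placeholder for loading the result into a mart or dashboard.
    print(f"published {len(rows)} rows")

def verified(step: Callable, check: Callable, *args):
    # Run a step, then block the handoff if its output fails the check.
    result = step(*args)
    assert check(result), f"verification failed after {step.__name__}"
    return result

if __name__ == "__main__":
    raw = verified(extract, lambda rows: len(rows) > 0)
    shaped = verified(transform, lambda rows: all("order_total_eur" in r for r in rows), raw)
    publish(shaped)
```

In a real deployment the gating would live in whatever orchestration tool the team already uses; the point is that automated checks, not a person, decide when work moves to the next step.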

How can you do this with design thinking in mind? https://medium.com/data-ops/enabling-design-thinking-in-data-analytics-with-dataops-4765bcbf8211

DataOps Uptake

Gartner end-user client inquiry data shows a 200% year-over-year increase in DataOps-related questions in 2019 to date.
