Gartner: “Select a DataOps Platform for 2020”

Mar 3, 2020 | Blog, DataOps Principles

In Gartner’s recent write-up for technical professionals, “2020 Planning Guide for Data Management,” they highlight 10 critical trends for 2020. The most exciting one for us at DataKitchen is #10, DataOps.


We are most excited that Gartner is advocating the need for a DataOps Platform: “Select a DataOps platform to integrate functions that support rapid deployment and governance.” As a provider of a DataOps platform, we are biased, but could not agree more!

For the first time, Gartner describes the architectural components of a DataOps Platform. You can learn more by following this LinkedIn post and reading the 2020 Planning Guide for Data Management.

The document expresses ideas similar to what we have written about in our blog and white papers. I wanted to highlight a few key points and use them as jumping-off points to our writings and to features we have built into our product.

  • “…, test, monitor, … and alert for failure … Detect data drift automatically”
  • “Capture metrics and reports at the right granularity”
  • “Provide Version Control”
    • Please read about our view of this here.
    • Our product supports version control natively.
  • “Manage … metadata … and activity logs”
    • Keeping data about data is central to moving fast.
    • We built that into our product as Orders.
  • “Automate deployment of the code, frameworks, libraries and tools”
  • “Create and maintain the environment …”
    • We think the environments that teams work in are central to collaboration and coordination.
    • A considerable part of our product is Kitchens. We are DataKitchen, remember?
  • “DataOps can span the entire gamut from data ingestion all the way to data delivery. This is a complex chain of components.”
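Gartner’s call to “detect data drift automatically” can be made concrete with a simple statistical check that runs as part of an orchestrated pipeline: compare a new batch of data against a trusted baseline and alert when a summary statistic shifts too far. Here is a minimal sketch (the function name and threshold are illustrative, not taken from Gartner’s guide or any particular product):

```python
import statistics

def detect_drift(baseline, current, threshold=0.5):
    """Flag drift when the mean of the current batch deviates from the
    baseline mean by more than `threshold` baseline standard deviations."""
    base_mean = statistics.mean(baseline)
    base_std = statistics.stdev(baseline)
    shift = abs(statistics.mean(current) - base_mean)
    return shift > threshold * base_std

# Baseline values are stable; the second batch has shifted upward.
baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1]
drifted = [12.5, 12.9, 13.1, 12.7, 12.8, 13.0]
print(detect_drift(baseline, baseline))  # False: same distribution
print(detect_drift(baseline, drifted))   # True: the mean has drifted
```

A production version would track many statistics (null rates, cardinality, distribution shape) and feed failures into the pipeline’s alerting, but the principle is the same: tests run automatically on every batch, not just once at development time.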

We feel very proud that one of our articles on DataOps Architecture has had some influence on Gartner. Check out the similarities below (one small nit: Gartner mixed up the order of the Dev and QA environments).


Finally, we do have a bone to pick with the authors at Gartner. They said:  “And because this is a nascent process, there are no end-to-end tools or products that provide an all-encompassing solution for DataOps.” 

WE DISAGREE. We have built an end-to-end product for DataOps here at DataKitchen!
