Webinar: A Guide to the Six Types of Data Quality Dashboards

In this webinar, we discuss the six major types of data quality dashboards.

In this webinar, Christopher Bergh discussed the major types of data quality dashboards, emphasizing that effective dashboards make data health visible and drive targeted improvements by relying on concrete, actionable tests. He highlighted the importance of selecting dashboard types based on the data landscape and stakeholder needs, advocating an iterative approach and showcasing DataKitchen's open-source software.

Core Philosophy of Data Quality Improvement

Bergh shared his view that data quality improvement is driven by an empowered individual who focuses on a specific customer, starts small, iterates using DataOps principles, and provides concrete remediation actions. He stressed the importance of measuring quality to demonstrate value and extend influence.

The Significance of Data Quality Dashboards

Bergh explained that dashboards are essential for making the often invisible health of data visible and enabling targeted improvements. He noted that these artifacts go by various names, such as reports or scorecards, and that effective dashboards should align with stakeholder needs and the data landscape.

Six Types of Data Quality Dashboards

Bergh introduced six types of data quality dashboards: dimension-focused, critical data element (CDE)-focused, business goal-focused, data source-focused, data consumer-focused, and ticket-driven workflow. Each type serves a unique role in driving changes in data quality.

    1. Dimension-Focused Dashboards

Bergh described these dashboards as providing a broad overview of data quality across standard categories, facilitating consistent evaluations. However, he noted challenges such as being too abstract, failing to align with stakeholder needs, and not resonating with business priorities.
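To make the idea concrete, here is a minimal sketch (not from the webinar, and not DataKitchen's implementation) of how individual test results tagged with standard dimensions could roll up into the per-dimension scores such a dashboard shows; the test names and dimensions are hypothetical:

```python
from collections import defaultdict

# Hypothetical test results, each tagged with a standard quality dimension.
test_results = [
    {"test": "customer_id_not_null", "dimension": "Completeness", "passed": True},
    {"test": "email_format_valid",   "dimension": "Validity",     "passed": False},
    {"test": "order_total_in_range", "dimension": "Validity",     "passed": True},
    {"test": "no_duplicate_orders",  "dimension": "Uniqueness",   "passed": True},
]

def dimension_scores(results):
    """Roll individual test results up into a pass rate per dimension."""
    totals, passes = defaultdict(int), defaultdict(int)
    for r in results:
        totals[r["dimension"]] += 1
        passes[r["dimension"]] += r["passed"]
    return {dim: round(100 * passes[dim] / totals[dim], 1) for dim in totals}

print(dimension_scores(test_results))
# {'Completeness': 100.0, 'Validity': 50.0, 'Uniqueness': 100.0}
```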

    2. Critical Data Element (CDE)-Focused Dashboards

Bergh explained that CDE dashboards are essential in regulated industries, focusing on critical identifiers and financial values to ensure compliance and mitigate risk. They are effective when CDEs are clearly defined and directly connect to business needs and regulatory mandates.

    3. Business Goal-Focused Dashboards

Bergh highlighted that aligning data quality with specific organizational objectives, such as customer acquisition or cost reduction, can drive change by demonstrating the impact of data quality on business outcomes. This approach makes data quality more strategic and relevant to business stakeholders.

    4. Data Source-Focused Dashboards

Bergh discussed how tracking data quality metrics by source can foster accountability among suppliers and help data teams identify and address problematic origins. He emphasized the importance of providing specific feedback to suppliers to encourage improvement.
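As a rough sketch of this pattern (the feeds and tests below are made-up examples, not from the webinar), failing tests can be grouped by their originating source so each supplier receives specific, actionable feedback:

```python
from collections import defaultdict

# Hypothetical failing tests, each tagged with the source feed it came from.
failed_tests = [
    {"source": "crm_extract", "table": "customers", "test": "email_format_valid"},
    {"source": "crm_extract", "table": "customers", "test": "phone_not_null"},
    {"source": "billing_api", "table": "invoices",  "test": "amount_non_negative"},
]

by_source = defaultdict(list)
for failure in failed_tests:
    by_source[failure["source"]].append(f"{failure['table']}: {failure['test']}")

# Each supplier sees only the specific problems traced back to its feed.
for source, issues in by_source.items():
    print(f"{source}: {len(issues)} open issue(s)")
    for issue in issues:
        print(f"  - {issue}")
```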

    5. Data Consumer-Focused Dashboards

Bergh explained that dashboards tailored to data consumers, such as data scientists relying on specific data elements for modeling, can help them advocate for better data quality. By focusing on their needs, data consumers can become allies in driving data quality improvements.

    6. Ticket-Driven Workflow as a “Dashboard”

Bergh presented the idea of treating a ticketing workflow itself as a dashboard: progress on data quality is measured by the number of tickets created and resolved. This shifts the focus to actionable tasks and operational progress, although it requires well-defined issues and a ticketing system.
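A minimal sketch of this view (ticket fields are hypothetical): the "dashboard" is simply the count of data quality tickets opened, resolved, and still open in a reporting period:

```python
from datetime import date

# Hypothetical data quality tickets from a ticketing system.
tickets = [
    {"id": 101, "opened": date(2024, 5, 2),  "resolved": date(2024, 5, 6)},
    {"id": 102, "opened": date(2024, 5, 10), "resolved": None},
    {"id": 103, "opened": date(2024, 5, 14), "resolved": date(2024, 5, 15)},
]

period_start, period_end = date(2024, 5, 1), date(2024, 5, 31)

opened   = sum(period_start <= t["opened"] <= period_end for t in tickets)
resolved = sum(t["resolved"] is not None
               and period_start <= t["resolved"] <= period_end for t in tickets)
backlog  = sum(t["resolved"] is None for t in tickets)

print(f"Opened: {opened}, Resolved: {resolved}, Still open: {backlog}")
# Opened: 3, Resolved: 2, Still open: 1
```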

Framework for Choosing Dashboard Types

Bergh provided guidance on selecting the appropriate dashboard type based on factors such as the data landscape, the clarity of CDE definitions, the presence of clear business goals, the nature of data sources, the existence of vocal data consumers, and the suitability of a ticket-driven approach. He advocated for multi-dashboard strategies to address the diverse needs of various stakeholders.
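As an illustrative paraphrase of that guidance (a simple checklist, not a formal rule set from the webinar; the baseline choice and factor names are assumptions), the selection factors can be thought of as questions whose answers suggest which dashboards to build:

```python
def recommend_dashboards(has_defined_cdes: bool,
                         has_clear_business_goals: bool,
                         many_external_sources: bool,
                         vocal_data_consumers: bool,
                         has_ticketing_system: bool) -> list[str]:
    """Map the selection factors to suggested dashboard types (illustrative only)."""
    recommendations = ["dimension-focused"]  # assumed baseline; adjust to your landscape
    if has_defined_cdes:
        recommendations.append("CDE-focused")
    if has_clear_business_goals:
        recommendations.append("business goal-focused")
    if many_external_sources:
        recommendations.append("data source-focused")
    if vocal_data_consumers:
        recommendations.append("data consumer-focused")
    if has_ticketing_system:
        recommendations.append("ticket-driven workflow")
    return recommendations

print(recommend_dashboards(True, False, True, False, True))
# ['dimension-focused', 'CDE-focused', 'data source-focused', 'ticket-driven workflow']
```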

Actionability of Data Quality Dashboards

Bergh stressed that effective data quality dashboards must be based on concrete actions derived from data quality tests or hygiene test results. Scores should be traceable to these underlying actions, enabling users to understand what needs to be fixed.
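A minimal sketch of that traceability (test names and counts are hypothetical): the score is computed directly from test results, and the failing tests behind it are the remediation actions a user can drill into:

```python
# Hypothetical test results behind a single dashboard score.
test_results = [
    {"test": "customer_id_not_null",   "passed": True,  "rows_failed": 0},
    {"test": "email_format_valid",     "passed": False, "rows_failed": 412},
    {"test": "signup_date_not_future", "passed": False, "rows_failed": 7},
    {"test": "country_code_valid",     "passed": True,  "rows_failed": 0},
]

score = 100 * sum(t["passed"] for t in test_results) / len(test_results)
actions = [t for t in test_results if not t["passed"]]

print(f"Score: {score:.0f}%")  # Score: 50%
for t in actions:              # the concrete fixes the score traces back to
    print(f"Fix: {t['test']} ({t['rows_failed']} rows affected)")
```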

The Equivalence Principle

Bergh introduced the concept that data quality test results, workflow tickets, and a dashboard are interconnected. A dashboard can be viewed as a collection of potential tickets, and each ticket should be supported by tests, ensuring actionability and traceability to its root cause.
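The following sketch illustrates the principle (names are hypothetical, not DataKitchen's API): each failing test maps one-to-one to a potential ticket, and the dashboard is just an aggregate over those same records:

```python
# Hypothetical failing test results.
failing_tests = [
    {"test": "email_format_valid",  "table": "customers", "rows_failed": 412},
    {"test": "amount_non_negative", "table": "invoices",  "rows_failed": 3},
]

def to_ticket(result):
    """One failing test result maps one-to-one to an actionable ticket."""
    return {
        "title": f"{result['table']}: {result['test']} failing",
        "detail": f"{result['rows_failed']} rows need remediation",
        "root_cause_test": result["test"],  # traceability back to the test
    }

tickets = [to_ticket(r) for r in failing_tests]

# The "dashboard" is simply a summary over the same underlying records.
dashboard_row = {
    "open_issues": len(tickets),
    "rows_affected": sum(r["rows_failed"] for r in failing_tests),
}
print(dashboard_row)  # {'open_issues': 2, 'rows_affected': 415}
```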

Iterative Approach to Building Dashboards

Bergh recommended an agile, iterative approach to building data quality dashboards, emphasizing the need for constant refinement and responsiveness to stakeholder feedback.

The full webinar is available here.


For those who enjoy reading, we also have a blog post that delves into the ideas in more detail.

 
