Banking On Data (Ops) White Paper

Jun 9, 2019 | Blog

New DataKitchen Case Study: Banking on Data to Win in Financial Services

DataKitchen recently published a new banking case study. It describes how an innovative financial services company uses technology to outcompete traditional big banks and brokerage firms. With automation and data as a key competitive advantage, the company continually seeks ways to make its data operations more efficient and effective. Like many other enterprises, their data pipeline is business critical. Data flows in from numerous sources into a data lake, is processed, and then feeds applications in analytics, trading operations, security, and risk management. In an enterprise-critical environment such as this one, the company's data team started by building a multi-tool data pipeline around the open source tool Apache Airflow (to schedule and monitor workflows), Apache Kafka, cloud services, Python, SQL, Domo, and numerous different databases. Then they ran into trouble.
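The case study does not include pipeline code, but the sketch below suggests how a multi-tool pipeline like the one described might be laid out as an Airflow DAG: ingest from Kafka into the data lake, transform with SQL and Python, and publish results to analytics tools such as Domo. All DAG, task, and function names here are hypothetical placeholders, not the company's actual implementation.

```python
# Minimal sketch of a multi-tool pipeline scheduled with Apache Airflow.
# The stage functions are placeholders; each would call out to Kafka consumers,
# SQL/Python transformations, and BI publishing in a real pipeline.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_from_kafka(**context):
    """Placeholder: consume source topics and land raw files in the data lake."""
    pass


def transform_in_lake(**context):
    """Placeholder: run SQL/Python transformations on the landed data."""
    pass


def publish_to_analytics(**context):
    """Placeholder: push curated tables to analytics/BI tools such as Domo."""
    pass


default_args = {
    "owner": "data-team",          # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="banking_data_pipeline",  # hypothetical DAG name
    start_date=datetime(2019, 6, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_from_kafka", python_callable=ingest_from_kafka)
    transform = PythonOperator(task_id="transform_in_lake", python_callable=transform_in_lake)
    publish = PythonOperator(task_id="publish_to_analytics", python_callable=publish_to_analytics)

    # Run the stages in order: lake ingestion, transformation, then publishing.
    ingest >> transform >> publish
```

Each task above is only a stand-in for a stage that, in the real environment, spans multiple tools and databases, which is exactly where the case study reports the team running into trouble.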
