New DataKitchen Case Study: Banking on Data to Win in Financial Services
DataKitchen recently published a new banking case study. It describes how an innovative financial services company uses technology to outcompete traditional big banks and brokerage firms. With automation and data as key competitive advantages, the company continually seeks ways to make its data operations more efficient and effective. Like many other enterprises, its data pipeline is business critical: data flows in from numerous sources into a data lake, where it is processed and then feeds applications in analytics, trading operations, security, and risk management. In this enterprise-critical environment, the company's data team built a multi-tool data pipeline relying on the open-source tools Apache Airflow (to schedule and monitor workflows) and Apache Kafka, along with cloud services, Python, SQL, Domo, and numerous different databases. Then they ran into trouble.
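The pipeline described above can be sketched, in greatly simplified form, as a linear flow of stages. The code below is a hypothetical illustration only: the stage names and sample data are assumptions, and in the company's actual stack these steps would be orchestrated by Apache Airflow and fed by Kafka and cloud services rather than plain function calls.

```python
# Hypothetical sketch of the pipeline shape described in the case study:
# multiple sources -> data lake -> processing -> downstream applications.
# Names and data are illustrative assumptions, not from the case study.

def ingest(source):
    """Pull raw records from one source (Kafka topics or cloud APIs in practice)."""
    return [{"source": source, "value": i} for i in range(3)]

def land_in_lake(batches):
    """Combine raw batches into a single data-lake dataset."""
    return [record for batch in batches for record in batch]

def process(lake):
    """Transform raw records (Python/SQL jobs in the real pipeline)."""
    return [{**r, "value": r["value"] * 2} for r in lake]

def feed_applications(processed):
    """Fan processed data out to the downstream consumers named in the study."""
    apps = ["analytics", "trading", "security", "risk"]
    return {app: processed for app in apps}

sources = ["market_data", "transactions", "reference_data"]
lake = land_in_lake([ingest(s) for s in sources])
outputs = feed_applications(process(lake))
print(len(lake))         # 9 raw records across the three sources
print(sorted(outputs))   # ['analytics', 'risk', 'security', 'trading']
```

In a production deployment each of these functions would be a separately scheduled, monitored task, which is exactly the coordination problem Airflow is meant to solve and where, per the case study, the trouble began.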