White Paper: A New, More Effective Approach To Data Quality Assessments. Data quality leaders must rethink their role. They are neither compliance officers nor gatekeepers of platonic data ideals. They are advocates. Using their stakeholders' own language and metrics, they must campaign for change, build coalitions, and show why quality matters. This is not a theoretical shift; it is a practical one.
DataKitchen Is One Of The Coolest DataOps & Data Observability Companies of 2025. We're thrilled to share that DataKitchen has once again been named one of the Coolest DataOps & Data Observability Companies for 2025 by CRN! It's an honor to be recognized alongside such innovative leaders in the space. As the first company to define and deliver DataOps, we're especially excited to see how this list continues to grow: proof that the movement we helped start is gaining momentum.
Welcome to the Data Quality Coffee Series with Uncle Chip. Pull up a chair, pour yourself a fresh cup, and get ready to talk shop, because it's time for Data Quality Coffee with Uncle Chip. This video series is where decades of data experience meet real-world challenges, a dash of humor, and zero fluff. Uncle Chip, aka Charles Bloche of DataKitchen, has spent his career deep in the trenches of data engineering, wrangling pipelines, building platforms, and navigating the all-too-familiar chaos of data quality.
Data Quality When You Don't Understand the Data: Data Quality Coffee With Uncle Chip #3. Let's be honest: data quality feels impossible when you don't understand the data. And in large organizations, that's not a rare problem. It's the norm. I've seen it firsthand: massive data estates maintained by teams who don't know what the numbers, strings, or categories in their tables really mean.
A New, More Effective Approach To Data Quality Assessments. In DataKitchen's recent webinar, CEO Christopher Bergh offered a compelling new approach to a timeless problem: how do we improve data quality in organizations where no one truly owns it, everyone needs it, and no one has the authority to enforce change? With clarity and a bit of dry humor, Bergh laid out the uncomfortable truth that improving data quality is less about perfect metrics and more about influence, agility, and a bit of personal persuasion.
Why Data Quality Dimensions Fall Flat: Data Quality Coffee With Uncle Chip #2. In this playful yet pointed talk, "Data Quality Coffee With Uncle Chip" kicks things off by poking fun at the overcomplicated world of data quality dimensions. With so many dimensions and no consensus on definitions, vague terms like "accuracy" and "validity" just blur together.
Why Data Quality Isn't Worth The Effort: Data Quality Coffee With Uncle Chip. Data quality has become one of the most discussed challenges in modern data teams, yet it remains one of the most thankless and frustrating responsibilities. In the first episode of the Data Quality Coffee With Uncle Chip series, Uncle Chip highlights the persistent tension between the need for clean, reliable data and the overwhelming complexity of achieving it.
Improving data quality can feel overwhelming. There are so many things to fix and so many processes to improve. Where do you even start? Surprisingly, the answer may come from an unusual place: flossing your teeth. Start Small to Build Better Habits. In the book Tiny Habits, B.J. Fogg suggests a simple way to build habits: if you want to start flossing your teeth, don't aim for a perfect routine right away.
Groundbreaking Study: Trusting Your Gut 78.45% More Effective Than Data-Driven Decisions, Say Top Execs. CAMBRIDGE, MA, April 1, 2025: In a shocking reversal of modern business orthodoxy, a joint study from Harvard Business School and Gartner has concluded that trusting your gut and doing what sounds good are 78.45% more effective than traditional data analytics, integration, or any attempt to be data-driven.
Announcing Actionable, Automated, & Agile Data Quality Scorecards. Are you ready to unlock the power of influence to transform your organization's data quality, and become the hero your data deserves? Watch the previously recorded webinar unveiling our latest innovation: Data Quality Scorecards, powered by our AI-driven DataOps Data Quality TestGen software.
Unlocking Data Team Success: Are You Process-Centric or Data-Centric? Over years of working with data analytics teams at companies large and small, we have been fortunate to observe hundreds of them. We want to share our observations about data teams: how they work and think, and the challenges they face. We've identified two distinct types of data teams: process-centric and data-centric.
How Data Quality Leaders Can Gain Influence And Avoid The Tragedy of the Commons. Data quality has long been essential for organizations striving for data-driven decision-making. Despite the best efforts of data teams, poor data quality remains a persistent challenge, leading to distrust in analytics, inefficiencies in operations, and costly errors. Many organizations struggle with incomplete, inconsistent, or outdated data, making it difficult to derive reliable insights.
Announcing DataOps Data Quality TestGen 3.0: Open-Source, Generative Data Quality Software, Now With Actionable, Automatic Data Quality Dashboards. Imagine a tool you can point at any dataset: it learns from your data, screens for typical data quality issues, and then automatically generates and runs powerful tests, analyzing and scoring your data to pinpoint issues before they snowball.
No Python, No SQL Templates, No YAML: Why Your Open Source Data Quality Tool Should Generate 80% Of Your Data Quality Tests Automatically. For a data engineer, ensuring data quality is both essential and overwhelming. The sheer number of tables, the complexity of data usage, and the volume of work make manual test writing impossible to keep up with.
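As a rough illustration of the idea (a minimal sketch, not TestGen's actual implementation), a profile-driven generator can derive baseline tests directly from observed data. The column handling and thresholds below are hypothetical assumptions:

```python
# Sketch: derive simple data quality tests from a profiled pandas DataFrame.
# Thresholds (e.g., cardinality <= 10) are illustrative assumptions.
import pandas as pd

def generate_tests(df: pd.DataFrame) -> list[dict]:
    """Profile each column and emit simple test definitions."""
    tests = []
    for col in df.columns:
        series = df[col]
        # Columns observed with no nulls get a not-null test.
        if series.notna().all():
            tests.append({"column": col, "test": "not_null"})
        # Numeric columns get a range test from observed min/max.
        if pd.api.types.is_numeric_dtype(series):
            tests.append({"column": col, "test": "in_range",
                          "min": series.min(), "max": series.max()})
        # Low-cardinality text columns get a value-set test.
        elif series.nunique() <= 10:
            tests.append({"column": col, "test": "in_set",
                          "values": sorted(series.dropna().unique())})
    return tests

def run_tests(df: pd.DataFrame, tests: list[dict]) -> list[str]:
    """Apply previously generated tests to fresh data; return failures."""
    failures = []
    for t in tests:
        s = df[t["column"]]
        if t["test"] == "not_null" and s.isna().any():
            failures.append(f"{t['column']}: unexpected nulls")
        elif t["test"] == "in_range" and ((s < t["min"]) | (s > t["max"])).any():
            failures.append(f"{t['column']}: value outside {t['min']}..{t['max']}")
        elif t["test"] == "in_set" and (~s.dropna().isin(t["values"])).any():
            failures.append(f"{t['column']}: unexpected category value")
    return failures
```

The point is that the baseline suite falls out of profiling automatically; engineers then spend their effort on the remaining, business-specific tests.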
The Gartner presentation, How Can You Leverage Technologies to Solve Data Quality Challenges? by Melody Chien, underscores the critical role of data quality in modern business operations. High-quality data is the lifeblood that sustains the organizational value chain, impacting everything from logistics to services, sales, and marketing. Poor data quality costs organizations an average of $12.9 million annually, or 7% of their total revenue.
A Drug Launch Case Study in the Amazing Efficiency of a Data Team Using DataOps: How a Small Team Powered the Multi-Billion Dollar Acquisition of a Pharma Startup. When launching a groundbreaking pharmaceutical product, the stakes and the rewards couldn't be higher. This blog dives into the remarkable journey of a data team that achieved unparalleled efficiency using DataOps principles and software, transforming their analytics and data teams into a hyper-efficient powerhouse.
Would you like help maintaining high-quality data across every layer of your Medallion Architecture? Like an Olympic athlete training for the gold, your data needs a continuous, iterative process to maintain peak performance. We covered how Data Quality Testing, Observability, and Scorecards turn data quality into a dynamic process, helping you build accuracy, consistency, and trust at each layer: Bronze, Silver, and Gold.
A Guide to the Six Types of Data Quality Dashboards. Poor-quality data can derail operations, misguide strategies, and erode the trust of both customers and stakeholders. Data quality dashboards have emerged as indispensable tools, offering a clear window into the health of an organization's data and enabling targeted, actionable improvements. However, not all data quality dashboards are created equal.
The Race For Data Quality In A Medallion Architecture. The Medallion architecture pattern is gaining traction among data teams. It is a layered design pattern that helps data teams organize data processing and storage into three distinct layers, often called Bronze, Silver, and Gold.
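To make the layering concrete, here is a minimal sketch of the Bronze/Silver/Gold hand-off using pandas; the column names and cleaning rules are illustrative assumptions, not part of the pattern itself:

```python
# Sketch: each Medallion layer reads only from the layer before it.
import pandas as pd

def to_bronze(raw_path: str) -> pd.DataFrame:
    """Bronze: land the raw data as-is, adding only load metadata."""
    df = pd.read_csv(raw_path)
    df["_loaded_at"] = pd.Timestamp.now(tz="UTC")
    return df

def to_silver(bronze: pd.DataFrame) -> pd.DataFrame:
    """Silver: cleanse and conform (dedupe, fix types, drop bad rows)."""
    silver = bronze.drop_duplicates()
    silver = silver.dropna(subset=["order_id"])  # assumed key column
    silver["amount"] = pd.to_numeric(silver["amount"], errors="coerce")
    return silver.dropna(subset=["amount"])

def to_gold(silver: pd.DataFrame) -> pd.DataFrame:
    """Gold: business-level aggregate ready for analytics."""
    return (silver.groupby("customer_id", as_index=False)
                  .agg(total_spend=("amount", "sum"),
                       orders=("order_id", "nunique")))
```

Because every hand-off is an explicit function boundary, quality tests and observability checks can be attached at each layer transition.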
“That should take two hours, not two months. Can’t your Data & Analytics Team go any faster?” “The executives’ dashboard broke! The data’s wrong! Can I ever trust our data?” If you’ve ever heard (or had) these complaints about speed-to-insight or data reliability, you should watch our webinar, DataOps for Beginners, on demand. DataKitchen’s VP Gil Benghiat breaks down what DataOps is (spoiler: it’s not just DevOps for data) and how DataOps can take your Data & Analytics factory from…
Read Our New White Paper: Data Quality The DataOps Way. Data quality isn’t just a technical hurdle—it’s a strategic necessity in the data-driven world. Traditional methods fall short, but the DataOps approach to data quality offers a transformative path forward. It empowers individuals to act swiftly, enables continuous improvement, and fosters collaboration across organizational silos.
Christopher Bergh is the CEO and Head Chef at DataKitchen. Chris has more than 30 years of research, software engineering, data analytics, and executive management experience. At various points in his career, he has been a COO, CTO, VP, and Director of engineering. Enjoy the chat.
A DataOps Approach to Data Quality: The Growing Complexity of Data Quality. Data quality issues are widespread, affecting organizations across industries, from manufacturing to healthcare and financial services. According to DataKitchen’s 2024 market research, conducted with over three dozen data quality leaders, the complexity of data quality problems stems from the diverse nature of data sources, the increasing scale of data, and the fragmented nature of data systems.
From Cattle to Clarity: Visualizing Thousands of Data Pipelines with Violin Charts. Most data teams work with a dozen or a hundred pipelines in production. What do you do when you have thousands of data pipelines in production? How do you understand what is happening to those pipelines? Is there a way that you can visualize what is happening in production quickly and easily?
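A hedged sketch of the technique with matplotlib, using simulated runtimes since the post's real pipeline data isn't reproduced here:

```python
# Sketch: summarize thousands of pipeline runs as violin plots.
# The runtime distributions below are simulated, not real pipeline data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Simulated daily runtimes (minutes) for three assumed pipeline groups.
groups = {
    "ingest": rng.lognormal(mean=2.0, sigma=0.4, size=1000),
    "transform": rng.lognormal(mean=2.8, sigma=0.6, size=1000),
    "publish": rng.lognormal(mean=1.5, sigma=0.3, size=1000),
}

fig, ax = plt.subplots()
ax.violinplot(list(groups.values()), showmedians=True)
ax.set_xticks(range(1, len(groups) + 1))
ax.set_xticklabels(list(groups.keys()))
ax.set_ylabel("runtime (minutes)")
ax.set_title("Runtime distribution per pipeline group")
plt.show()
```

One violin per group collapses thousands of individual runs into a handful of shapes, so a long tail or a shifted median stands out at a glance.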
Data Quality Circles: The Key to Elevating Data and Analytics Team Performance. Introduction: The Pursuit of Quality in Data and Analytics Teams. According to a study by HFS Research, 75 percent of business executives do not have a high level of trust in their data. High-quality data underpins the reliability of insights, the accuracy of models, and the efficacy of decision-making processes.
DataOps: the promising future that nobody seems to be able to make a reality. But not for lack of trying: meet Chris Bergh, "Head Chef" at DataKitchen, who joins us again to tell us how the field has evolved over the last few years.
2024 Gartner Market Guide To DataOps. We at DataKitchen are thrilled to see the publication of the Gartner Market Guide to DataOps, a milestone in the evolution of this critical software category. As the pioneer in the DataOps category, we are proud to have laid the groundwork for what has become an essential approach to managing data operations in today’s fast-paced business environment.
Summary
In this episode of the Data Engineering Podcast, host Tobias Macey welcomes back Chris Bergh, CEO of DataKitchen, to discuss his ongoing mission to…
DataKitchen’s Data Quality TestGen found 18 potential data quality issues in a few minutes (including install time) on data.boston.gov building permit data! Imagine a free tool that you can point at any dataset and find actionable data quality issues immediately! It sure beats having your data consumers tell you about problems they find when you are trying to enjoy your weekend.
Christopher Bergh, CEO of DataKitchen, is transforming data analytics with his DataOps approach. By applying principles from agile and lean manufacturing, Bergh aims to eliminate the 70-80% waste in data processes. DataKitchen's suite of open-source tools offers solutions for observability, testing, and automation, addressing challenges in rapid change management, error detection, and team productivity.
Navigating the Storm: How Data Engineering Teams Can Overcome a Data Quality Crisis. Ah, the data quality crisis. It’s that moment when your carefully crafted data pipelines start spewing out numbers that make as much sense as a cat trying to bark. You know you’re in trouble when the finance team uses your reports as modern art installations rather than decision-making tools.
Data Observability and Data Quality Testing Certification Series. We are excited to invite you to a free four-part webinar series that will elevate your understanding and skills in Data Observability and Data Quality Testing. This series is crafted for professionals eager to deepen their knowledge and enhance their data management practices, whether you are a seasoned data engineer, a data quality manager, or simply passionate about data.
Harnessing Data Observability Across Five Key Use Cases. The ability to monitor, validate, and ensure data accuracy across its lifecycle is not just a luxury—it’s a necessity. Data observability extends beyond simple anomaly checking, offering deep insights into data health, dependencies, and the performance of data-intensive applications. This blog post introduces five critical use cases for data observability, each pivotal in maintaining the integrity and usability of data throughout its journey.
The Five Use Cases in Data Observability: Ensuring Data Quality in New Data Sources (#1). Introduction to Data Evaluation in Data Observability. Ensuring the quality and integrity of new data sources before incorporating them into production is paramount. Data evaluation serves as a safeguard, ensuring that only cleansed and reliable data makes its way into your systems, thus maintaining the overall health of your data ecosystem.
The Five Use Cases in Data Observability: Effective Data Anomaly Monitoring (#2). Introduction. Ensuring the accuracy and timeliness of data ingestion is a cornerstone for maintaining the integrity of data systems. Data ingestion monitoring, a critical aspect of Data Observability, plays a pivotal role by providing continuous updates and ensuring that high-quality data feeds into your systems.
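As one hedged illustration of ingestion monitoring (a minimal sketch, not DataKitchen's implementation), a monitor can flag a load whose row count drifts far from recent history; the thresholds and counts below are made up:

```python
# Sketch: flag an ingestion batch whose row count is a statistical outlier
# relative to recent loads. The z-score threshold is an assumption.
from statistics import mean, stdev

def check_row_count(history: list[int], todays_count: int,
                    z_threshold: float = 3.0) -> str | None:
    """Return an alert message if today's load looks anomalous, else None."""
    if len(history) < 5:
        return None  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return None if todays_count == mu else \
            f"count {todays_count} differs from constant baseline {mu:.0f}"
    z = abs(todays_count - mu) / sigma
    if z > z_threshold:
        return (f"anomalous load: {todays_count} rows is "
                f"{z:.1f} standard deviations from the mean of {mu:.0f}")
    return None

# Example: a feed that normally lands ~10,000 rows suddenly lands 1,200.
print(check_row_count([9800, 10100, 9950, 10200, 9900], 1200))
```

The same pattern extends beyond row counts to freshness, schema drift, and null rates: track a baseline, then alert on deviation.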
The Five Use Cases in Data Observability: Mastering Data Production (#3). Introduction. Managing the production phase of data analytics is a daunting challenge: overseeing multi-tool, multi-dataset, and multi-hop data processes while ensuring high-quality outputs. This blog explores the third of five critical use cases for Data Observability and Quality Validation, Data Production, highlighting how DataKitchen’s Open-Source Data Observability solutions empower organizations to manage this critical stage.
The Five Use Cases in Data Observability: Fast, Safe Development and Deployment (#4). Introduction. The integrity and functionality of new code, tools, and configurations during the development and deployment stages are crucial. This blog post delves into the fourth critical use case for Data Observability and Data Quality Validation: Development and Deployment.
DataKitchen Training And Certification Offerings
For individual contributors with a background in Data Analytics/Science/Engineering.
Overall Ideas and Principles of DataOps:
- DataOps Cookbook (200-page book, over 30,000 readers, free)
- DataOps Certification (3 hours, online, free, signup online)
- DataOps Manifesto (over 30,000 signatures)
- One-Day DataOps training (paid)
Data Observability (the first step in DataOps):
- Ideas and Principles of Data Observability
- Four-part Data Observability and Data Quality Testing webinar series
Introducing DataKitchen’s Open Source Data Observability Software. Today, we announce that we have open-sourced two complete, feature-rich products that solve the data observability problem: DataOps Observability and DataOps TestGen. With these two products, you will know whether your pipelines are running on time and without error, and you can finally trust your data.
The webinar, hosted by Christopher Bergh with Gil Benghiat from DataKitchen, covered a comprehensive range of topics centered around improving the performance and efficiency of data teams through Agile and DataOps methodologies. Gil Benghiat, co-founder of DataKitchen, began by explaining the overarching goal of achieving data team excellence, which involves delivering business value quickly and with high quality.