Getting to great data quality need not be a blood sport! This article aims to provide some practical insights gained from enterprise master data quality projects undertaken within the past […].
A growing number of companies have leveraged big data to cut costs, improve customer engagement, strengthen compliance, and build solid brand reputations. The benefits of big data cannot be overstated. One study by Think With Google shows that marketing leaders are 1.3 times as likely as their mainstream peers to have a documented data strategy.
We know the phrase, “Beauty is in the eye of the beholder.”[1] In this article, I will apply it to the topic of data quality. I will do so by comparing two butterflies, each representing a common use of data quality: firstly, and most commonly, in situ for existing systems, and secondly for use […].
As someone deeply involved in shaping data strategy, governance and analytics for organizations, I’m constantly working on everything from defining data vision to building high-performing data teams. My work centers on enabling businesses to leverage data for better decision-making and driving impactful change.
Ensuring data quality is an important aspect of data management, and DBAs are increasingly being called upon to deal with the quality of the data in their database systems. The importance of quality data cannot be overstated.
Regardless of how accurate a data system is, it yields poor results if the quality of the data is bad. As part of their data strategy, a number of companies have begun to deploy machine learning solutions. In a recent study, AI and machine learning were named the top data priorities for 2021 by 61% […].
Data is everywhere! But can you find the data you need? Can you trust it when you get it? What can be done to ensure the quality of the data? How can you show the value of investing in data? These are not new questions, but many people still do not know how to practically […].
These real-time data sources generate data streams that require new data and ML models to deliver accurate decisions. Data quality is crucial for real-time actions because decisions often can’t be taken back. About George Trujillo: George is principal data strategist at DataStax.
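One common pattern behind that point: because a real-time decision often can’t be undone, suspect events are quarantined rather than acted on. The sketch below is a minimal illustration of such a quality gate; the event fields and rules are assumptions for illustration, not DataStax’s implementation.

```python
dead_letter = []  # quarantined events held for offline review

def act(event: dict) -> None:
    # Stand-in for the irreversible real-time action (e.g., approving a payment).
    print(f"acting on {event}")

def validate_event(event: dict) -> bool:
    # Basic quality gate; the fields and rules here are illustrative.
    return (
        event.get("account_id") is not None
        and isinstance(event.get("amount"), (int, float))
        and event["amount"] > 0
    )

def handle(event: dict) -> None:
    if validate_event(event):
        act(event)                 # safe to act in real time
    else:
        dead_letter.append(event)  # never act on suspect data

for e in [{"account_id": "a1", "amount": 25.0}, {"amount": -5}]:
    handle(e)
```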
This article is the third in a series taking a deep dive on how to do a current state analysis on your data. This article focuses on data culture, what it is, why it is important, and what questions to ask to determine its current state. The first two articles focused on data quality and data […].
And do you have the transparency and data observability built into your data strategy to adequately support the AI teams building them? Will the new creative, diverse and scalable data pipelines you are building also incorporate the AI governance guardrails needed to manage and limit your organizational risk?
Reading Time: 11 minutes The post Data Strategies for Getting Greater Business Value from Distributed Data appeared first on Data Management Blog - Data Integration and Modern Data Management Articles, Analysis and Information.
ETL (Extract, Transform, Load) is a crucial process in the world of data analytics and business intelligence. In this article, we will explore the significance of ETL and how it plays a vital role in enabling effective decision making within businesses. Both approaches aim to improve data quality and enable accurate analysis.
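To make the ETL flow concrete, here is a minimal sketch in Python, assuming a hypothetical orders.csv source with order_id and amount columns and a SQLite target; the file, table, and quality rules are illustrative, not taken from the article.

```python
import csv
import sqlite3

def extract(path):
    # Extract: read raw order records from a CSV source.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: enforce simple data quality rules before loading.
    clean = []
    for row in rows:
        if not row.get("order_id"):
            continue  # skip records missing their business key
        row["amount"] = round(float(row["amount"]), 2)  # normalize to numeric
        clean.append(row)
    return clean

def load(rows, db_path="warehouse.db"):
    # Load: write the cleaned rows into the target table.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)")
    con.executemany(
        "INSERT INTO orders (order_id, amount) VALUES (?, ?)",
        [(r["order_id"], r["amount"]) for r in rows],
    )
    con.commit()
    con.close()

load(transform(extract("orders.csv")))
```

The transform step is where most data quality rules live, which is why ETL and data quality are so often discussed together.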
A 2015 paper by the World Economic Forum suggested that big data might just be a fad. The article certainly raised a lot of controversy, considering the massive emphasis on the value of data technology. However, it raised some very valid points. The article was not arguing that big data is going to become obsolete.
The first step to fixing any problem is to understand that problem; this is a significant point of failure when it comes to data. Most organizations agree that they have data issues, categorized as data quality. However, this definition is […].
Data Accuracy is one of the so-called “dimensions” of Data Quality. The goal for these dimensions, and it is a noble one, is that we can measure each of them and, should deficiencies be found, implement a uniform set of best practices. Of course, these best practices will differ from […].
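Accuracy proper can only be measured against a trusted reference, but neighboring dimensions such as completeness and validity can be scored directly from the data. A minimal sketch, assuming hypothetical customer records and an illustrative email rule:

```python
import re

# Hypothetical customer records; field names are illustrative only.
records = [
    {"id": 1, "email": "ana@example.com", "age": 34},
    {"id": 2, "email": "not-an-email", "age": 29},
    {"id": 3, "email": None, "age": 41},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def completeness(rows, field):
    # Share of rows where the field is populated at all.
    return sum(1 for r in rows if r.get(field) is not None) / len(rows)

def validity(rows, field, predicate):
    # Share of populated values that pass a business rule.
    values = [r[field] for r in rows if r.get(field) is not None]
    return sum(1 for v in values if predicate(v)) / len(values) if values else 1.0

print(f"email completeness: {completeness(records, 'email'):.0%}")            # 67%
print(f"email validity:     {validity(records, 'email', EMAIL_RE.match):.0%}")  # 50%
```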
The content on A-Team Insight covers financial markets and the way in which technology and data management play a part. This site offers expert knowledge and articles geared towards decision-makers in investment management firms and investment banks. Techopedia follows the latest trends in data and provides comprehensive tutorials.
The purpose of this article is to provide a model to conduct a self-assessment of your organization’s data environment when preparing to build your Data Governance program. Take the […].
This article is the fourth installment in a series taking a deep dive on how to do a Current State Analysis on your data. This article focuses on Data Outcomes: what they are, why they are important, and what questions to ask to determine the current state. The questions are organized by stakeholder group to […]
Donna Burbank is a Data Management Consultant and acts as the Managing Director at Global Data Strategy, Ltd. Her Twitter page is filled with interesting articles, webinars, reports, and current news surrounding data management. TDAN stands for The Data Administration Newsletter. It is published by Robert S.
I believe that my strongest articles and columns come from opportunities to work with great companies and organizations. A long-time client recently told me that, for their data and […]. Of course, I cannot mention their names. But there is a strong possibility that you may have some of the same opportunities in front of you.
I raised the Cambridge Analytica scandal and pointed out how it is often only when these stories hit the news that people question the ethics behind how companies are using data. Clearly, using private Facebook data collected in a nefarious manner to sway political elections is not ethical. What’s your data strategy?
Whether you’re stepping into a new organization as a data lead or trying to overhaul your data infrastructure, the first step in the process is to understand how your organization currently uses data. This is the beginning of a series of articles […].
Background: A successful data-driven organization recognizes data as a key enabler of increased and sustained innovation. The goal of a data product is to solve the long-standing issue of data silos and data quality. Mike is the author of two books and numerous articles. His Amazon author page
Layering technology on the overall data architecture introduces more complexity. Today, data architecture challenges and integration complexity impact the speed of innovation, dataquality, data security, data governance, and just about anything important around generating value from data.
Control of Data to ensure it is Fit-for-Purpose. This refers to a wide range of activities, from Data Governance to Data Management to Data Quality improvement, and indeed related concepts such as Master Data Management. Data Strategy.
The recently launched Data Strategy Review Service is just one example. White Papers can be based on themes arising from articles published here, they can feature findings from de novo research commissioned in the data arena, or they can be on a topic specifically requested by the client. Follow @peterjthomas.
One of the greatest contributions to the understanding of data quality and data quality management happened in the 1980s, when Stuart Madnick and Rich Wang at MIT adapted the concept of Total Quality Management (TQM) from manufacturing to Information Systems, reframing it as Total Data Quality Management (TDQM).
David Crosby, of supergroup fame with Stills, Nash and Young, once said that you have to write about something that goes on in your life if you want to write something that means something to you. Data wellness obviously means a lot to me. This article starts with something that is “going on” with […].
In this blog, we will discuss a common problem for data warehouses that are designed to maintain data quality and provide evidence of accuracy. Without verification, the data can’t be trusted. Enter the mundane, but necessary, task of data reconciliation. This is often a time-consuming and wasteful process.
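As a sketch of what that reconciliation can look like, the snippet below compares row counts and a column total between a source and a target SQLite database; the table, column, and file names are assumptions for illustration.

```python
import sqlite3

def reconcile(source_db, target_db, table, amount_col):
    # Compare row count and a column checksum between source and target.
    # A mismatch flags the load for investigation; names are illustrative.
    def summarize(db_path):
        con = sqlite3.connect(db_path)
        count, total = con.execute(
            f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
        ).fetchone()
        con.close()
        return count, round(total, 2)

    src, tgt = summarize(source_db), summarize(target_db)
    if src != tgt:
        raise ValueError(f"reconciliation failed: source={src} target={tgt}")
    return src

# e.g. reconcile("staging.db", "warehouse.db", "orders", "amount")
```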
Business has a fundamental problem with data quality. In some places it’s merely painful, in others it’s nearly catastrophic. Why is the problem so pervasive? Why does it never seem to get fixed? I believe we’ve been thinking about the problem wrong. It’s time for a fresh look.
Reading Time: 3 minutes Last month, IDC announced that LeasePlan, a car-as-a-service company, was the winner of IDC’s European Data Strategy and Innovation awards, in the category of Data Management Excellence, for LeasePlan’s logical data fabric. This is a testament to the maturity of […].
This article is not about Marketing professionals; it is about poorly researched journalism. Prelude… I recently came across an article in Marketing Week with the clickbait-worthy headline of Why the rise of the chief data officer will be short-lived (their choice of capitalisation).
Imagine what it would be like if your data were perfect. By perfect I mean fit for use and high quality. By perfect I mean that the people in your organization have confidence in the data to use it for effective decision-making and to focus on building efficiency and effectiveness through data into your […].
You may already have a formal Data Governance program in place. Or … you are presently going through the process of trying to convince your Senior Leadership or stakeholders that a formal Data Governance program is necessary. Maybe you are going through the process of convincing the stakeholders that Data […].
No, this is not a mistyping of data literacy. Yes, like everyone, I am aware of and fully on-board with the growing movement to improve data literacy in the enterprise. What I want to talk about is Data Littering, which is something else entirely.
In this article, we will discuss the current state of AI in analytics, the future of this burgeoning industry, and how AI can be applied to simplify and clarify results, making analytics easier for businesses and business users to leverage.
With the vast array of data available and a competitive landscape driving ongoing tactical demands, focusing on developing a proper data strategy is crucial, but it can be a big task. On the one hand, the more data you have, the better; on the other, sometimes it’s about deliberately doing less.
Back in 2017, I wrote an article titled There are No Facts … Without Data. The overwhelmingly positive response to that article validated for me that most people believed my premise to be true. I was very thankful to see that. It is time to revisit that topic. In this anti-fact world (watch cable news […].
In my journey as a data management professional, I’ve come to believe that the road to becoming a truly data-centric organization is paved with more than just tools and policies; it’s about creating a culture where data literacy and business literacy thrive.
Harnessing the power of data has become essential in today’s digital age, when information is abundant and decision-making is critical in many aspects of business. Understanding your data may unearth hidden insights and move your business ahead, whether you’re a small startup or an established enterprise.
This article is the second in a series taking a deep dive on how to do a Current State Analysis on your data. (See the first article here.) This article focuses on Data Freshness: what it is, why it’s important, and what questions to ask to determine its current state.
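As a small illustration of measuring freshness, the sketch below computes how far the newest record lags behind an assumed 24-hour SLA; both the threshold and the function names are hypothetical.

```python
from datetime import datetime, timedelta, timezone

def freshness_lag(last_loaded_at: datetime) -> timedelta:
    # How far the newest record trails the current time (timestamps are tz-aware).
    return datetime.now(timezone.utc) - last_loaded_at

def is_fresh(last_loaded_at: datetime, sla: timedelta = timedelta(hours=24)) -> bool:
    # True if the dataset was updated within its (assumed) 24-hour SLA.
    return freshness_lag(last_loaded_at) <= sla

# e.g. is_fresh(datetime(2024, 1, 1, tzinfo=timezone.utc)) -> False for stale data
```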
In the September issue of TDAN.com, Anthony Algmin denounced Data Catalogs as a “1980’s solution to a 2020’s problem.” What is the state of data science today? As I state in my book, The Data Catalog: Sherlock Holmes Sleuthing for Data Analytics, and in many articles, 80% (or more) of a data analyst’s […].