Blog

The Cost of Poor Data Quality

9-minute read

Did you know that the quality of crude oil depends more on how it is refined than on where it was found? Whether it comes from the oil-rich fields of Tapis, Malaysia, or the painstakingly extracted Athabasca oil sands of Canada, oil refiners build their infrastructure around the characteristics of the crude oil closest to them, producing final products of similar quality. The raw product, in other words, matters far less than the processing method.

As data solidifies its status as the most profitable commodity in the world, the quality of your data (and how you process it) becomes all the more relevant. Like oil, raw data is certainly valuable. Still, the systems you use to refine it, maintain its quality, and distribute it to the relevant parties are what extract the real business value.

There are many ways to measure that value. Analysts estimate that poor data quality costs businesses around $3.1 trillion a year in the U.S. alone, and that companies with low data quality maturity can lose up to 20% of their revenue. But why is the cost of poor data quality so high? We’d like to explore the financial side of data quality and answer that very question.

Poor data quality wastes everyone’s time

Unmanaged data is a massive time-waster in every department, from core data people like data scientists and engineers to end data consumers like salespeople. According to Gartner, poor data quality reduces overall labor productivity by as much as 20%.

Data scientists spend about 60% of their time verifying, cleaning up, correcting, or even wholly scrapping and reworking data, or, as they like to call it, “data wrangling,” “data munging,” or “data janitor work.” They spend another 19% of their time hunting down the information they need. The result is that fewer machine learning models get tested and deployed, because machine learning and data quality go hand in hand.

Therefore, one of the biggest pain points for any data-driven company is getting the right data to the right people in a workable format.

This is a common problem because, as companies expand, their data gets recopied and distributed to different applications, systems, and departments. As all this change and distribution occurs, critical business data can become inconsistent, and no one knows which application or system is the most up-to-date. For example, a salesperson trying to call potential customers can waste up to 27% of their time on contact details that are incorrect or incomplete.
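Validating data on input doesn’t have to be complicated. As a hypothetical illustration (not Ataccama’s implementation; the field names and rules here are assumptions), this is the kind of rule-based check that flags incorrect or incomplete contact details before a salesperson ever dials:

```python
import re

# Hypothetical contact-record fields; real CRM schemas will differ.
REQUIRED_FIELDS = ("name", "email", "phone")
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_contact(record: dict) -> list[str]:
    """Return a list of data quality issues found in a contact record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append(f"missing {field}")
    email = record.get("email", "")
    if email and not EMAIL_RE.match(email):
        issues.append("invalid email format")
    digits = re.sub(r"\D", "", record.get("phone", ""))
    if digits and not 7 <= len(digits) <= 15:
        issues.append("implausible phone length")
    return issues

records = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "phone": "+1 555 0100"},
    {"name": "", "email": "not-an-email", "phone": "12"},
]
reports = {r["email"]: validate_contact(r) for r in records}
```

In practice, checks like these run automatically at the point of entry or as part of continuous monitoring, so bad records are quarantined rather than passed downstream to sales.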

With higher data quality, you can deliver data that is ready to use on arrival. Decision-makers can respond to issues and market changes in real time instead of waiting for their data to be verified and made actionable. High data quality also makes it easier to automate data integration, so quality isn’t lost when data is exchanged between applications and systems as your company grows.

Bad data leads to bad analytics

Since analyzing data is one of the primary ways to extract value from it, those analytics need to be as accurate as possible. After all, poor data quality leads to poor business decisions and failed projects and initiatives. Businesses have prioritized using data to drive their decision-making: after the 2008 financial crisis, the global market saw a 30% increase in the use of business metrics for decision-making, and firms now base around 45% of their business decisions on the data they collect.

If you base these decisions on dirty data, it can lead to unexpected costs and failures for your company, like wasting your marketing budget. Marketers waste 21 cents of every media dollar because of poor data quality; in other words, one-fifth of the media budget yields zero ROI. More specifically, missing or invalid customer contact information can cost an average large company up to $238,000 per campaign, and not having a single customer view or master record can cost from $85,000 to $425,000 per campaign.

IT modernization projects fail because of poor data quality

As business processes become more automated, data quality is the main roadblock between a successful initiative and a failure. More automation means building machine learning models and self-sufficient applications on the data your company has available. Implementing these systems on poor-quality data will inevitably lead to setbacks and failures.

The same is true for IT and data modernization projects, like migration to the cloud.

  • 88% of all data integration projects fail entirely or significantly overrun their budgets because of poor data quality.
  • 33% of organizations have delayed or canceled new IT systems for the same reason.

This problem with IT systems is alarming when you consider how important data infrastructure modernization is to the contemporary business: 70% of productivity growth comes from IT projects. Poor data quality erodes about 10% of the savings you get from IT initiatives, meaning that if you save $21.7 million in labor productivity, roughly $2.2 million of it will be lost.
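The arithmetic behind that last figure is straightforward; a quick sketch using the numbers from the paragraph above:

```python
# Poor data quality erodes roughly 10% of the savings an IT initiative delivers.
savings = 21_700_000           # labor productivity savings from an IT project
dq_loss_rate = 0.10            # share of savings lost to poor data quality
lost = savings * dq_loss_rate  # about $2.2 million
net_savings = savings - lost
```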

Risk and noncompliance

A significant financial risk tied to data quality comes from compliance and regulatory reporting. If you store personal data that falls under the jurisdiction of the GDPR or other data protection laws, you need to maintain a certain level of data quality in terms of accuracy and integrity. If anyone requests their personal data from your company, you need to provide it promptly and at a quality they can understand. This also means keeping master records of all the personal data you hold.

You’ll also have to ensure all data under GDPR jurisdiction is validated upon entry and during use, as up to 70% of contact data can become outdated annually as people change jobs, addresses, and so on. Companies whose data doesn’t meet these standards risk severe fines: up to €20 million or 4% of annual turnover (whichever is higher) under the GDPR.

In regulated industries like pharmaceuticals or banking, you also risk bad data skewing your reports, leading to fines for delivering false information. Basel II, part of the Basel Accords on banking supervision, requires “a framework to measure and manage data integrity” and “quantified and documented targets and robust processes to test the accuracy of data.” In pharmaceuticals, you work with large amounts of personal data, especially in drug trials, which forces you to comply with the GDPR and similar regulations on proper storage and protection.

By contrast, you can simply pay the average cost of staying compliant: $5.5 million. It’s much cheaper to pay for compliance upfront than to await the consequences later.

Poor data quality ruins customer experience

There are several ways that poor data quality can ruin your customer experience and send potential buyers running for the hills. Remember that customer data quality typically degrades by about 2% per month, or roughly 25% per year, and Gartner identifies poor data quality as the number one cause of CRM system failure.

The data you collect about your customers, their buying habits, and how they interact with your products and platform can all help you improve your user experience. Tailoring your business to customers makes them feel more confident in their preferences and gets them to the items they want to buy faster. However, if you try to build your customer experience on bad data, it will have the opposite effect: individuals will receive offers they aren’t interested in, sometimes two or three times over if duplicate customer records exist in your database.

For example, say your organization relies primarily on donations, as a university does. You don’t want to resend offers to donors, or send them the wrong offer, and make them feel undervalued.

Better data quality also prevents common retail problems that turn off customers, like customers overpaying for a product or very slow delivery times. For example, low-quality geolocation data from your delivery drivers could lead them to pick longer routes, deliver items to the wrong warehouse, and so on.

Other losses, running into the hundreds of thousands, can come from your collections department due to missing or invalid customer data and collateral data that isn’t linked to a contract and therefore can’t be reevaluated. The same holds for online retail, where companies can lose millions from scanning the wrong prices, inventory-data inaccuracies, and phantom “stock-outs.”

View your data quality as an asset

Forrester says that data performance management is essential to proving the ROI of data. Data is now a business driver, resource, and economic stimulator. Business intelligence capabilities like dashboards and financial statements provide insight into the business’s health. As a primary resource, data needs the same capabilities so you can see whether it is productive, where it can be improved, and whether it meets expectations. In short, high-quality data should be treated as an asset.

Organizations that continue to make false assumptions about their data quality will only continue to experience inefficiencies, excessive costs, compliance risks, and customer satisfaction issues:

  • Customers can share negative experiences and write reviews about them.
  • Stakeholders can lose faith in data-based business decisions.
  • Employees can begin to question the validity of the data they’re working with, forcing them to double- and triple-check it before use.

Here is the takeaway:

Data quality impacts your reputation.

While the value of data quality management is apparent, many companies are still failing to take action.

Just look at the numbers:

Less than 50% of companies are confident in their internal data quality, and only 15% are confident in the data provided by third parties.

Despite this, companies still overestimate their overall DQ performance and underestimate the costs of errors in data, spending massive amounts of time fire-fighting immediate problems instead of investing in long-term solutions.

That’s why we recommend getting out in front of the problem. Business processes, customer expectations, source systems, and compliance rules change constantly. Your DQ initiative should reflect this by staying consistently updated with the data climate. Maintaining high data quality will give you some of the most valuable data on the market.

Right now, Gartner recommends maintaining a 3.5-sigma quality level (a measure of how far data points deviate from the mean) for your data, or about 22,800 defects per million data points. At this level, quality initiatives will cost about 20% of the total cost of a business process. For example, if you have a marketing process that costs $100 million to run yearly, maintaining quality for that process would cost about $20 million. Keep in mind that this cost could be significantly reduced with new innovations in data quality automation. It’s not cheap, but it’s worth it.
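As a sanity check on those numbers, here is a sketch assuming the conventional six-sigma calculation with its customary 1.5-sigma long-term shift (an industry convention, not something the article states):

```python
import math

def defects_per_million(sigma: float, shift: float = 1.5) -> float:
    """Defect rate per million at a given sigma level, using the standard
    normal CDF with the customary 1.5-sigma long-term shift."""
    cdf = 0.5 * (1 + math.erf((sigma - shift) / math.sqrt(2)))
    return (1 - cdf) * 1_000_000

dpm = defects_per_million(3.5)      # ~22,750, matching the quoted ~22,800

# Quality cost at ~20% of a $100M-per-year business process.
process_cost = 100_000_000
quality_cost = 0.20 * process_cost  # about $20 million
```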

If you want to get started with data quality, you can begin here.

Don’t lose money! Get ahead of poor data quality

How does the average business lose $9.7 million annually because of poor data quality? It’s a combination of lost productivity, lousy analytics, project failures, not valuing data quality as an asset, the cost of non-compliance, and ruined customer relationships. If you want to save yourself millions, you should invest in a data quality management initiative as soon as possible.

You can start with Ataccama. Ataccama ONE Data Quality Suite is a self-driving data quality solution. It lets you understand the state of your data, validate & improve it, prevent bad data from entering your systems, and continuously monitor data quality.

By using metadata and AI, Ataccama ONE simplifies solution configuration, deployment, and scaling. Both small teams and enterprise-wide deployments with complex data landscapes see quick ROI and fast time to production. We make data quality accessible to anyone.

Get started with data quality with Ataccama

Ataccama ONE powers enterprise data quality management solutions at T-Mobile, Varo Money, Daiichi Sankyo, Raiffeisen Bank, Fiserv, and many others.

Discuss your needs with us

Related articles

  • What Is Data Quality and Why Is It Important?
  • How to Get Started with Data Quality: The 3 Steps You Should Take First
  • The Evolution and Future of Data Quality