Managing data quality to optimize value extraction

The Delta Perspective

By:
Damilola Ojo, Partner, Head of Datafarm, Analytics Enablement unit
Patrick Taleng, Senior Manager, Analytics practice


Even as it becomes more mainstream to make business decisions and drive business model innovation based on data insights, an inherent lack of trust remains among many business leaders who have experienced the “where did you get the data from” moment.

A major telco operator, for example, was unable to reconcile its customer base numbers (as basic as that sounds). Research also shows that the business cost of poor data quality may be as high as 10-20% of an organisation’s revenues. Data quality stands as the greatest challenge impacting value extraction from data, ahead of a lack of skillsets or technology tools.

Exhibit 1: Data Quality is the greatest challenge impacting value extraction from data

Source: Kaggle, Delta Partners analysis

In the era of connected and smart devices, the impact of data quality issues can be even more damaging for telco operators, with potential legal implications beyond revenue leakage or operational inefficiencies. Quality data must be a prerequisite for any data-driven initiative; otherwise the resulting insights will be neither trusted nor relevant.

How are operators reacting?

From Delta Partners’ experience of working with more than 30 operators globally, we observe varying levels of maturity. Most telco operators have executed at least one data quality initiative in the past: they have set up teams, appointed experts, procured various market-leading tools, and so on. Other operators are simply at a loss as to where to start and how to tackle the daunting issue of data quality.

Data quality management is not a one-time task, but an iterative and collaborative process that involves data custodians (from IT) and data stewards (from business units) continuously identifying, fixing and monitoring the systems and operational processes that generate and capture data to ensure the reliability of data for making business decisions. 

Only a limited number of operators have truly implemented a comprehensive data quality program and have managed to address the numerous challenges that arise when embarking on such a program:

  • Lack of sponsorship: Not enough education, vested interest or financial investment from senior executives
  • Lack of ownership: No single point of accountability for data leading to multiple grey areas in roles and responsibilities
  • Complex technology architecture: Multiple and fragmented data architecture and processes making it tedious to identify and fix sources of data quality issues 
  • Limited cross-functional collaboration: Lack of balance in the involvement of business and technical stakeholders. Initiatives are too technology/vendor driven with little to no business involvement

A sustained, deliberate, cross-functional and executive-sponsored initiative is typically required to address all these issues. Note that this article does not cover all the enablers for data quality, such as data governance or technology requirements; instead, we focus on how to execute a data quality improvement project.

Executing a data quality initiative

Leveraging its experience in data management and analytics across the TMT industry, Delta Partners employs a logical framework, applied with a business-centric and pragmatic approach, to ensure that tangible value is realized from the data quality initiative and that ongoing mechanisms are put in place for the longer term.

Discovering data quality issues 

One efficient way to start is to focus on Critical Data Elements (CDEs) linked to Key Performance Indicators (KPIs), master data and key business processes. CDEs are the underlying data fields and variables, stored in source systems, data warehouse or data mart tables, that are used to derive or calculate the KPIs.
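As a simple illustration, the mapping below shows how a KPI might be linked to its underlying CDEs; the KPI, table and field names are hypothetical and not taken from any specific operator’s data model.

```python
# Hypothetical mapping of KPIs to the Critical Data Elements (CDEs) used to derive them.
# Table and field names are illustrative only.
KPI_TO_CDES = {
    "active_subscriber_count": [
        "dwh.subscriber.msisdn",           # unique subscriber identifier
        "dwh.subscriber.status_code",      # active / suspended / churned
        "dwh.subscriber.last_usage_date",  # recency of activity
    ],
    "blended_arpu": [
        "dwh.revenue.msisdn",
        "dwh.revenue.billed_amount",
        "dwh.revenue.billing_period",
    ],
}
```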

A first set of relevant CDEs can be proposed and/or identified by business stakeholders and then profiled to generate statistics and information summaries that can enable the discovery and diagnosis of potential data quality issues. Data profiling is done using data quality tools that can generate descriptive and quantitative information about the structure, content and relationships of data fields in a table.
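As a minimal sketch of what such profiling produces, the code below computes basic descriptive statistics per column with pandas; the file name, column names and checks are assumptions for illustration, not the output of any particular data quality tool.

```python
import pandas as pd

def profile_column(series: pd.Series) -> dict:
    """Basic descriptive profile of one data field (CDE)."""
    non_null = series.dropna()
    profile = {
        "dtype": str(series.dtype),
        "row_count": len(series),
        "null_rate": series.isna().mean(),              # completeness signal
        "distinct_count": series.nunique(dropna=True),
        "duplicate_rate": 1 - series.nunique(dropna=True) / max(len(non_null), 1),
    }
    if pd.api.types.is_numeric_dtype(series):
        profile.update({"min": series.min(), "max": series.max(), "mean": series.mean()})
    return profile

# Illustrative CDEs from a hypothetical subscriber extract
subscribers = pd.read_parquet("subscriber_snapshot.parquet")  # assumed warehouse extract
for cde in ["msisdn", "activation_date", "status_code"]:
    print(cde, profile_column(subscribers[cde]))
```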

Based on the outputs of the data profiling, potential issues can be further examined and then prioritized by severity, complexity and business impact, and an actionable remediation plan can be developed.
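One simple way to turn those criteria into a ranked backlog is a weighted score; the weights and the 1-5 scales below are illustrative assumptions, not a prescribed scoring model.

```python
# Hypothetical weighted prioritization of discovered issues (scores on 1-5 scales).
# A negative weight on complexity pushes harder fixes lower in the queue.
WEIGHTS = {"business_impact": 0.5, "severity": 0.3, "complexity": -0.2}

issues = [
    {"name": "null activation dates", "business_impact": 5, "severity": 4, "complexity": 2},
    {"name": "duplicate MSISDNs",     "business_impact": 4, "severity": 5, "complexity": 4},
]

def priority(issue: dict) -> float:
    return sum(WEIGHTS[k] * issue[k] for k in WEIGHTS)

for issue in sorted(issues, key=priority, reverse=True):
    print(f"{issue['name']}: priority score {priority(issue):.1f}")
```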

Remediating data quality issues 

With the collaboration of data custodians and data stewards, a thorough root cause analysis of the data quality issues can be conducted. The required business and technical “know-how” can be leveraged to drill down into ‘hot spots’ in systems and in business and operational processes, and to define structural solutions.

These solutions can vary in nature and complexity, from quick fixes and tweaks to more structural, time-consuming interventions requiring significant investment. For example, they could include:

  • Education of front-end employees on data entry constraints and policies
  • Reloading of data warehouse tables or data marts
  • Update of Extract, Transform and Load (ETL) logic within the Enterprise Data Warehouse (EDW) data pipelines or development of new pipelines (see the illustrative sketch after this list)
  • Configuration changes to business application and systems
  • Implementation of data management solution like metadata management, Master Data Management (MDM), etc.
  • Total overhaul or replacement of systems of records
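
As an example of the ETL-type fix referenced above, the sketch below standardizes subscriber phone numbers and removes duplicates inside a transformation step; the normalization rule, the assumed country code and the field names are purely illustrative, and a real fix would follow the operator’s own numbering plan and pipeline tooling.

```python
import pandas as pd

def clean_subscriber_extract(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative transformation step fixing two common data quality issues at load time."""
    df = df.copy()
    # Normalize MSISDN to a single format: strip non-digits, replace a leading 0
    # with an assumed country code (234 here, for illustration only)
    digits = df["msisdn"].astype(str).str.replace(r"\D", "", regex=True)
    df["msisdn"] = digits.str.replace(r"^0", "234", regex=True)
    # Remove duplicate subscribers, keeping the most recently updated record
    df = (df.sort_values("last_updated")
            .drop_duplicates(subset="msisdn", keep="last"))
    return df
```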

Whatever their nature, it is important to drive these interventions through and to work from a prioritized roadmap, so that tangible benefits are realized early and the most critical issues are tackled first.

Data quality monitoring

Once data quality issues have been fixed, it is important to measure the improvements and socialize them regularly with business stakeholders in order to restore trust in the data. Efficient metrics should therefore be put in place to continuously monitor the state of data quality.

An overall Data Quality Index (DQI) enables a comprehensive evaluation of an organization’s data quality based on best practices. It is a normalized metric expressed as a percentage score ranging from 0 to 100, where 0 is the lowest quality and 100 the highest. The index provides operators with a proactive and reliable measure of their data quality (beyond perception) and can be leveraged to distil the remediation initiatives that maximize positive business impact for an optimized investment.

Delta Partners has experience building such a data quality metric by computing and running a series of validation tests on CDEs within an operator’s data systems. Our methodology follows established best practices and assesses data quality across six main dimensions.
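As a hedged sketch of how such an index could be computed, the code below runs simple pass/fail validation checks on a CDE, averages pass rates per dimension, and scales the result to a 0-100 score. The checks, column names and dimension labels (completeness, validity, uniqueness, timeliness) are common-practice assumptions and are not necessarily the exact dimensions or tests used in Exhibit 2.

```python
import pandas as pd

def run_checks(df: pd.DataFrame) -> list:
    """Illustrative validation checks on the msisdn CDE, each tagged with a dimension."""
    return [
        {"dimension": "completeness", "pass_rate": 1 - df["msisdn"].isna().mean()},
        {"dimension": "validity",
         "pass_rate": df["msisdn"].astype(str).str.fullmatch(r"\d{11,15}").fillna(False).mean()},
        {"dimension": "uniqueness", "pass_rate": 1 - df["msisdn"].duplicated().mean()},
        {"dimension": "timeliness",
         "pass_rate": (pd.Timestamp.now() - pd.to_datetime(df["last_updated"])).dt.days.le(1).mean()},
    ]

def data_quality_index(df: pd.DataFrame) -> float:
    """Average pass rate per dimension, then across dimensions, scaled to 0-100."""
    results = pd.DataFrame(run_checks(df))
    per_dimension = results.groupby("dimension")["pass_rate"].mean()
    return round(per_dimension.mean() * 100, 1)
```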

Exhibit 2: How it works: Dimensions for evaluating data quality are assessed and selected


The resulting data quality score for each CDE can be aggregated at different levels, e.g. system, data entity, business domain and corporate, to produce an executive dashboard with drill-down capabilities.
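A minimal sketch of that roll-up, assuming per-CDE scores are stored alongside their system and business-domain attributes; the column names and figures are illustrative.

```python
import pandas as pd

# Hypothetical per-CDE scores with their lineage attributes
cde_scores = pd.DataFrame({
    "cde":    ["msisdn", "status_code", "billed_amount"],
    "system": ["CRM", "CRM", "Billing"],
    "domain": ["Customer", "Customer", "Revenue"],
    "dqi":    [92.5, 78.0, 85.0],
})

by_system = cde_scores.groupby("system")["dqi"].mean()  # drill-down level
by_domain = cde_scores.groupby("domain")["dqi"].mean()  # business-domain level
corporate = cde_scores["dqi"].mean()                    # executive summary figure
```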

Exhibit 3: Data Quality Index Dashboard – Summary Executive View


 

Conclusion

In a market where data insights increasingly drive competitive advantage, it becomes ever more important for operators to manage their data as an asset and properly address their data quality issues. Significant value is at stake, and investing in a data quality program is imperative. For companies that have successfully implemented a data quality program, research shows that the potential business value generated includes an increase of up to 35% in operating revenues, a decrease of up to 20% in operating expenses and an increase of up to 20% in labor productivity.


Damilola is a Partner within the Analytics practice and Head of Analytics Enablement for the Datafarm service line. He has more than 16 years of consulting and industry experience combined with hard technical skills in the area of data management and analytics. Damilola specializes in AI and Big data analytics, developing data management capabilities and delivering scalable solutions. He has a Bachelor of Science degree from the premier university in Nigeria and professional certifications from MIT and Columbia University.

Patrick is a Dubai-based Senior Manager. He has more than 14 years of experience in designing and leading the delivery of large Analytics & Data Management solutions. He has worked with several TMT organizations in Europe, the Middle East and Africa. His areas of expertise include Data Strategy, Data Governance, Data Quality, BI and Analytics. Patrick is a member of Delta Partners’ Analytics practice; he holds a dual MBA from HEC Paris and New York University Stern and an MSc in IT from ESIEA Paris.

If you would like to contact the authors to further discuss this topic, you can email:
do@deltapartnersgroup.com
