Ongoing Data Governance and Data Excellence in Experian’s Global Finance Services (GFS)

Paul Rodwell
09/23/2024


My previous articles outlined three key steps in the journey to enable Global Finance Services to successfully provide reliable, consistent, and meaningful data, reports, and business insights to the wider business:

Step 1: Establish a single, or at least, standardized ERP system and Global Chart of Accounts. 

Step 2: Centralize the management and maintenance of the enterprise data lake and critical master data.

Step 3: Create a central global BI team of experts to develop, maintain, and support standard reports.  

However, the journey doesn’t end there – as Lily Tomlin, American actress and comedian, put it:

“The road to success is always under construction.”

Success in the implementation of the above will undoubtedly generate continued and growing demand for new data and reports from finance and, increasingly, from executive management and other areas of the business. Continued success will require maintaining strong governance and control over the ERP, data, and reports as interdependent components of an end-to-end integrated eco-system. It will also depend on sustaining the ability to integrate, maintain, and manage new finance, non-finance, and external data sources to satisfy stakeholder requirements.   

For Experian Global Finance Services, having successfully completed those first three steps, the journey continues with a focus on some key areas and initiatives…

Data Governance and Data Lineage

Defining and enforcing standard processes for data ingestion and maintenance, detailed documentation of those processes, a comprehensive data dictionary (definitions: what each data item is and how it is used), and a data catalog (content: what is in the data lake, where it is held, and what it is called) are critical to maintaining visibility, consistency, and control over the data. As part of the data catalog, “data lineage” (mapping and understanding the flow of each data element and how it relates to similar elements from other sources) is important not only for effective maintenance and report building but also for support and troubleshooting.
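The relationship between dictionary, catalog, and lineage can be sketched as a single record per data element. This is a minimal illustration only — the field names, table names, and the `trace_lineage` helper are hypothetical, not a description of Experian's actual catalog:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One catalog record: what an item is (dictionary), where it lives
    (catalog), and where it came from (lineage)."""
    name: str              # canonical name in the data lake
    definition: str        # data dictionary: what it is and how it is used
    location: str          # data catalog: where it is held in the lake
    source_system: str     # originating system (e.g. ERP, CRM)
    upstream: list = field(default_factory=list)  # lineage: fields it is derived from

# Illustrative entry: tracing a revenue measure back to its sources
revenue = CatalogEntry(
    name="net_revenue",
    definition="Invoiced revenue net of credit notes, reported in USD",
    location="lake.finance.revenue_monthly",
    source_system="ERP-GL",
    upstream=["erp.gl_journal.amount", "fx.rates.usd_rate"],
)

def trace_lineage(entry: CatalogEntry) -> list:
    """Return the upstream fields feeding this data element."""
    return [f"{entry.source_system}: {u}" for u in entry.upstream]
```

With records like this in place, a troubleshooter can answer “where did this number come from?” without reverse-engineering the ingestion code.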

Data Ingestion Starts with Business Demand 

The content of the data lake, the ingestion of new data sources, and the development of new reports should be entirely governed and driven by the prioritized demands and requirements of the business.  
This involves close collaboration between three key teams in a three-step process: 

  • Data Management team (Global Finance Services): business/data analysts working closely with the business users to understand, scope, and document the requirements, identify the data needed and the appropriate data source(s), and create the blueprint for new data ingestion, as needed. 
  • Data Integration team (IT): technical specialists working closely with the business/data analysts to develop the data integration and curation processes necessary to create the required data sets within the data lake.
  • BI Team (Global Finance Services): reporting and data visualization specialists developing the reports and analytic views of the data, integrating them into the Management Information Portal, for the business users to readily access for reports and insights.    

End-to-End Data Automation  

Automation of the data set-up and maintenance across multiple systems, including data ingestion into the data lake, and associated controls and reconciliation, helps to improve data accuracy, consistency, and completeness. It avoids duplication of effort, improves productivity, eliminates timing differences between data sets, and reduces the risk of human error or omission in data set-up and maintenance. 

For instance, end-to-end automation of:

New Customer Set Up

From initial creation in the sales (CRM) system through the operational delivery system(s), billing and invoicing system(s), and accounts receivable (AR) system in the ERP, together with any associated mapping between systems and insertion into the relevant reporting hierarchies. Typically, this requires manual set-up in numerous systems by different teams in a series of disconnected processes, often with no end-to-end oversight. The result can be validation errors in the interfaces between systems (e.g. the customer is not set up in AR to process billing/invoice transactions), causing delays and manual corrections and adjustments.
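The disconnected manual steps above can be replaced by a single orchestrated pipeline with an end-to-end check at the end. This is a hypothetical sketch — the system names and the `provision`/`validate` helpers are illustrative, not Experian's implementation:

```python
# Downstream order matters: a customer must exist upstream before
# transactions can flow to billing and AR.
SYSTEMS = ["CRM", "delivery", "billing", "AR"]

def provision(customer_id: str, registry: dict) -> list:
    """Set up the customer in each system in sequence, recording the
    cross-system mapping. A real implementation would call each system's
    API; here we just record the step taken."""
    steps = []
    for system in SYSTEMS:
        registry.setdefault(system, set()).add(customer_id)
        steps.append(f"{system}:{customer_id}")
    return steps

def validate(customer_id: str, registry: dict) -> list:
    """End-to-end oversight: return any system where the customer is
    missing (e.g. not set up in AR, so invoices would fail)."""
    return [s for s in SYSTEMS if customer_id not in registry.get(s, set())]
```

The key design point is that provisioning and validation share one definition of the system chain, so a gap in any interface is caught before it surfaces as a failed invoice.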

Or 

New Global Chart of Accounts (GCoA) Segment Values

From end-user request via approval workflow to automated set-up in the GCoA and the relevant reporting hierarchies. Historically, this depended on manual processes and email-based approvals, which extended the timeline between request and availability for use, often across a month-end close.   
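The request-to-set-up flow can be made explicit as a small state machine instead of an email thread. A minimal sketch, with hypothetical state names:

```python
# Each request for a new GCoA segment value moves through explicit
# states; "provisioned" means automated set-up in the GCoA and the
# relevant reporting hierarchies has completed.
VALID_TRANSITIONS = {
    "requested": {"approved", "rejected"},
    "approved": {"provisioned"},
}

def advance(state: str, new_state: str) -> str:
    """Move a request to a new state, rejecting invalid jumps
    (e.g. provisioning a value that was never approved)."""
    if new_state not in VALID_TRANSITIONS.get(state, set()):
        raise ValueError(f"cannot move from {state} to {new_state}")
    return new_state
```

Because every transition is recorded and validated, the elapsed time from request to availability becomes measurable — and a request can never skip approval.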

Data Health 

It is important to develop and implement appropriate mechanisms – such as “source-to-target reconciliations” and automated alerts built into the data ingestion processes – to monitor and maintain the integrity, accuracy, and completeness of the data in the data lake and reports, and to mitigate the impact of inevitable changes in the systems, data, or processes.  
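A source-to-target reconciliation can be as simple as comparing control totals per period and alerting on any mismatch. A minimal sketch — the period keys, totals, and tolerance are illustrative assumptions:

```python
def reconcile(source_totals: dict, target_totals: dict,
              tolerance: float = 0.01) -> list:
    """Compare per-period control totals from the source system against
    the data lake (target); return an alert message for each period that
    is missing on either side or differs beyond the tolerance."""
    alerts = []
    for period in sorted(set(source_totals) | set(target_totals)):
        src = source_totals.get(period)
        tgt = target_totals.get(period)
        if src is None or tgt is None:
            missing = "target" if tgt is None else "source"
            alerts.append(f"{period}: missing on {missing}")
        elif abs(src - tgt) > tolerance:
            alerts.append(f"{period}: source {src} != target {tgt}")
    return alerts
```

Run after each ingestion, an empty result confirms completeness; any alert can feed directly into the automated alerting described above.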

It is also important to provide clear visibility of the status of the data – data completeness (reconciled) and data freshness (last updated) – so that users know the point in time the reported data represents and can have confidence in any decisions based on it.

Ideally, given the sheer number of data ingestions, this should be automated and presented via an easy-to-read Data Health Dashboard made available via the Management Information Portal or within the respective reports. 
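Each row of such a dashboard boils down to combining the two signals above into a single status. A hypothetical sketch — the thresholds and RAG labels are illustrative choices, not a prescribed standard:

```python
from datetime import datetime, timezone, timedelta

def health_status(last_updated, reconciled,
                  max_age=timedelta(hours=24), now=None):
    """One dashboard cell: green if reconciled and fresh, amber if the
    feed is stale, red if the reconciliation failed or has not run."""
    now = now or datetime.now(timezone.utc)
    if not reconciled:
        return "RED: not reconciled"
    if now - last_updated > max_age:
        return f"AMBER: stale (last updated {last_updated.date()})"
    return "GREEN"
```

Surfacing this per data set on the Management Information Portal lets a user judge at a glance whether a report is safe to act on.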

New Data Sets & Opportunities  

There will inevitably be continued demand for new reports and insights that go beyond the core financial data that was the initial focus of the enterprise data lake and reporting program. New acquisitions will also present both challenges and opportunities to expand the enterprise data lake and the scope of global reporting. These will drive the need to ingest extended or additional data sets – such as HR/employee, operational, sales, and external data (e.g. economic data such as inflation and FX rates) – to enrich the enterprise data resource and create the opportunity to provide more valuable information and insights to the wider business and executive management. 

Security  

Security is a key consideration, not only for sensitive or proprietary financial and commercial data but also for employee and other PII data subject to external regulatory restrictions. This requires careful design of the data lake and reporting access and security models and, as new data is ingested into the enterprise data lake, continuous review and assessment of security and compliance risks. 

GenAI / AI / Machine Learning 

Emerging and evolving GenAI, AI, and machine learning technologies offer new ways to enhance reporting and the generation of insights, as well as the way data is ingested and managed and the way users interact with it. Capabilities such as “talk to the data” (chatbot-style interrogation) and predictive analytics (e.g. forecasting) can drive even greater value from the investment in a robust data and reporting ecosystem. They do not undermine the importance of standardized ERP, data management, and reporting capabilities – these are foundational – nor do they diminish the value of the enterprise data lake as a “single source of truth” and a platform for delivering ever more valuable business insights. 

As mentioned, these are all part of the ongoing and ever-evolving journey to continually provide the business with relevant, reliable, and actionable reports and insights, and Global Finance Services remains uniquely placed to provide that service to Global Finance and the wider business.  

