
OPTIMIZE your data governance program!

(This section is part of the GOVERN framework based on Morgan Templar’s “A Culture of Governance.” GOVERN videos and blogs can also be found on the First CDO Partners LinkedIn page.)


Image generated by DALL-E, 4-21-2024; prompt by Morgan Templar

 

The launch of AI-for-All that came with ChatGPT™ has made Data mainstream. This excitement provides an ideal growth environment to Optimize your data governance program.  Boards and C-Suites are more aware than ever of the need for the foundations of governance and quality to enable future AI/ML activities.

 

Optimizing our programs includes an evaluation of tools and business processes. While buying a new tool may be required, no tool on its own will solve the data issues. Business processes and protocols must be evaluated and modified if necessary. A new tool purchase forces us to review and agree on the approach that best fits the organization’s timelines, asking questions such as:

·         When will we be ready to use cutting-edge tools?

·         How much value will be gained by improving the business processes?

 

A serious examination of your current program is necessary to plan for this new AI-enabled future. Begin with a comprehensive Gap Analysis. There are tools to help you with this, or you could hire a company like First CDO Partners to provide an external perspective. Regardless of method, it’s important to start with a goal in mind. Ask yourself these questions:

 

·      What value does this bring to the business in achieving its goals?

·      Who from the business will sponsor the optimization program? Which executive, whether the CDO or another leader, will be accountable for aligning the program with the organization’s goals?

·      Does the organization have the appetite to optimize the entire Data Management function? Or would sticking to Data Governance be more appropriate?

·      Is the desire to optimize the Data program in response to other transformation efforts? If so, do you have a list of requirements from those other programs?

·      Is there a long-standing issue or defect that needs to be corrected but has never been funded?

 

 

“Optimize your Data Program” could mean very different things depending on the desired outcome. Whether you are headed toward advanced AI-driven automation or upgrading your data governance program with a suite of tools, start by looking at the basic operations of the data team.

 

Get back to the basics of data governance. The Data Governance program must have authority and responsibility to ensure that the data in the organization is maintained, organized, and available to use for business operations and as the basis for strategy and innovation.

 

Everything you optimize needs to serve the business purpose: better operations and improved transactions, a firmer basis for strategic planning, and a clearer understanding of information to drive innovation. That’s really it – improve the bottom line or reduce risk. This call to action drives Optimization.

 

 

Data Governance Tools

Data governance can be performed, to a limited degree, without any specific tools. The Microsoft or Google suite of tools can be used to get you started. But even using Excel or Sheets requires an understanding of what you are documenting and how it must be captured.
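
For teams starting in a spreadsheet, the discipline is in deciding the columns up front. The sketch below, in Python, writes a two-row starter glossary to a CSV file that Excel or Sheets can open; the column names and sample entries are illustrative assumptions, not a prescribed standard.

```python
import csv

# A starter glossary captured with explicit columns; names and rows are
# illustrative assumptions, not a prescribed standard.
glossary_rows = [
    {
        "term": "Customer ID",
        "definition": "Unique identifier assigned to a customer at account creation.",
        "data_owner": "VP, Customer Operations",
        "source_system": "CRM",
        "sensitivity": "Internal",
    },
    {
        "term": "Net Revenue",
        "definition": "Gross revenue minus returns, discounts, and allowances.",
        "data_owner": "Controller",
        "source_system": "ERP",
        "sensitivity": "Confidential",
    },
]

# Write to a CSV that Excel or Google Sheets can open directly.
with open("business_glossary.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(glossary_rows[0].keys()))
    writer.writeheader()
    writer.writerows(glossary_rows)
```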

 

Let’s review the common data governance and data management tools. You may have one or two of these already in place. But a fully optimized data program will have all these features covered and working together to create a secure and useful data environment.

 

1.   Data Catalog: A metadata (data about data) management tool that provides a centralized inventory of an organization's data assets, including their definitions, locations, owners, rules and access controls. It enables data discovery and understanding.

 

2.   Business Glossary: A repository of business terms and definitions used across the organization. It ensures consistent understanding and usage of data terminology. It begins with Key Data Elements and grows to include any terminology deemed appropriate.

3.   Data Lineage: Data lineage tools map the flow of data from its source systems through various transformations and processes, enabling traceability and impact analysis. Lineage is also a basis for modeling knowledge graphs and ontologies.

 

4.   Metadata Management: Metadata management tools capture, store, and maintain metadata (data about data) to enable data understanding, integration, and governance.

 

5.   Data Quality Management: These tools help the data quality team assess data quality, identify issues, and cleanse and standardize data to improve accuracy and consistency. Many of these tools offer automation of certain activities. The ability to monitor and report on Data Quality KPIs is necessary. These tools and processes rely on, but may not include, a standard Reference Data library (a minimal rule-based check is sketched after this list).

 

6.   Reference Data Library: A Reference Data Library is a centralized repository that stores and manages codes and definitions of specific data sets, which are the consistent and uniform sets of identifiers and attributes used to classify and structure other data within an organization. The Reference Data Library enables organizations to maintain a single source of truth for reference data, ensuring data consistency, quality, and accessibility across the enterprise. Reference data, such as clinical diagnosis codes or classification of products in a supply chain operation, must be standards that are consistent and agreed upon. They may be ISO codes, ICD-10 codes, or other industry standard values. 

 

7.   Policy Management: Policy management tools allow organizations to define, implement, and enforce data governance policies and rules across the data landscape. Data Lineage tools should include Policies and their relationship to data elements. Policies should support regulations and business strategies.

 

8.   Access Control and Data Masking: These tools manage data access privileges and mask sensitive data to ensure data security and compliance with regulations. They may be the province of Cybersecurity or Information Security & Risk Management (ISRM). The capabilities of these tools often already exist in organizations, particularly around masking or de-identifying data, but close examination of most of these manual processes reveals inconsistent application of the rules and standards. Obfuscation, a related technique, rarely meets the requirement of removing the connection to the sensitive data – it can be reverse engineered, revealing PII or PHI (see the masking sketch after this list).

 

9.   Workflow and Collaboration: Workflow tools facilitate coordination and collaboration among data stewards, owners, and stakeholders in data governance processes. Purchasing a Data Governance tool provides this toolbox for the data team to perform their duties effectively and efficiently across the entire life cycle of enterprise data.

 

10.  Reporting and Dashboards: Reporting and dashboard tools provide visibility into data governance metrics, key performance indicators (KPIs), and the overall health of the data governance program. The enterprise data reporting and data visualization tools should be used for these functions.

 

11.  Data Modeling: Data modeling tools help in designing, visualizing, and documenting the structure and relationships of data entities, enabling better data governance and management. Technical Data Modeling is necessary on the physical systems to ensure data engineering and others understand the relationships between data. Business Logical Data Modeling is also needed to show the relationships between data in business terms. This capability may already exist in one of the previous tools.
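
To make items 5 and 6 concrete, here is a minimal sketch of a rule-based quality check that validates records against a reference data library. The field names, the sample records, and the small set of ISO 3166-1 alpha-2 country codes are illustrative assumptions; a production tool would manage the reference library centrally and automate the monitoring and KPI reporting.

```python
# Hypothetical reference data library: a small subset of ISO 3166-1 alpha-2 codes.
REFERENCE_COUNTRY_CODES = {"US", "CA", "MX", "GB", "DE"}

def invalid_country_codes(records, field="country_code"):
    """Return the records whose country code is not in the reference library."""
    return [r for r in records if r.get(field) not in REFERENCE_COUNTRY_CODES]

# Illustrative records; in practice these would come from a source system extract.
records = [
    {"customer_id": 1, "country_code": "US"},
    {"customer_id": 2, "country_code": "USA"},  # fails: not a valid alpha-2 code
    {"customer_id": 3, "country_code": None},   # fails: missing value
]

failures = invalid_country_codes(records)
failure_rate = len(failures) / len(records)
print(f"Invalid country codes: {len(failures)} of {len(records)} ({failure_rate:.0%})")
```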
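
And to illustrate the caution in item 8, the sketch below shows how a naive, unsalted hash of a low-entropy identifier (here a hypothetical four-digit member number) can be reversed by brute force. Real de-identification typically requires salting or tokenization plus a consistently applied standard; the function and values here are illustrative only.

```python
import hashlib

def naive_mask(member_number: str) -> str:
    # Unsalted hash of a low-entropy identifier (hypothetical 4-digit member number).
    return hashlib.sha256(member_number.encode()).hexdigest()

masked = naive_mask("4821")

# An attacker who knows the identifier format can simply try every possible value.
recovered = next(
    candidate
    for candidate in (f"{i:04d}" for i in range(10_000))
    if naive_mask(candidate) == masked
)
print(recovered)  # prints "4821" - the masked value was reverse engineered
```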

 

These tools work together to establish a comprehensive data governance framework, enabling organizations to manage, secure, and derive value from their data assets effectively.

 

Once a good data toolset is in place, we can move on to our next responsibilities:

•      Create a data culture through data literacy training and change management processes.

•      Nurture your current workforce and provide them with tools to upskill in both hard skills (e.g., Python) and soft skills (e.g., communication).

•      Improve innovation by putting the workforce’s collective knowledge toward solving your big problems.

•      Design for scalability so that AI initiatives can accommodate increased data and user demands; the flexibility to scale requires well-defined guardrails.

•      Partner with Risk Management, Security, Internal Audit, and IT Governance to create a Data Control Environment that is specific to your industry and level of maturity.

 

A fully optimized data program is a journey with no end point. New technologies, and the regulations to control them, arrive in a never-ending stream. Optimization should be part of your culture – the organization’s culture, not just your team’s. As our friends at the University of Rochester say, “Ever Better” is a great motto to follow!
