
Data Governance and SSOT: Master Data Management Guide

Koray Çetintaş · 10 February 2026 · 7 min read

Most companies have a “customer list” maintained in three different places: in the CRM, in the accounting software, and in the sales team’s Excel spreadsheet. Each contains different phone numbers, different discount rates, different addresses. Which data is correct? Nobody knows. This chaos is a classic symptom of a lack of data governance.

In this guide, you will find answers to questions such as: What is data governance? How do you establish an SSOT (Single Source of Truth)? What are the dimensions of data quality? How is master data managed? How is data ownership defined?



What is Data Governance?


Data governance defines who manages the organization’s data and how.

Data governance is the set of policies and processes that define how data in an organization is created, stored, shared, and protected, and by whom, within what rules.

Data governance is not solely an IT responsibility. Business units, legal, finance, operations, and senior management share joint responsibility. This is because data accuracy, security, and usage impact the strategic decisions of the entire organization.

The 4 Pillars of Data Governance

1. Data Ownership
A responsible person or unit is designated for each data set. For example, customer data is the responsibility of sales, and product data is the responsibility of product management.

2. Data Standards
Rules are established for how data should be named, formatted, and entered. For example: phone numbers must be entered in the format (05XX) XXX-XX-XX.

3. Data Quality
The accuracy, completeness, consistency, and timeliness of data are continuously measured and improved.

4. Data Security
Rules are defined regarding which users can access which data, which data can be shared externally, and how personal data is protected.


The SSOT (Single Source of Truth) Concept


SSOT: A single authoritative source for every piece of data

SSOT (Single Source of Truth) is the principle that there should be a single authoritative source for any given piece of data within an organization. Other systems reference this source rather than keeping copies of their own; where copies are unavoidable, they are kept synchronized with it.

SSOT Principles

1. Single source for each data set
Customer data → CRM
Product data → PLM / ERP
Stock quantity → WMS / ERP
Price list → ERP
Supplier data → SRM / ERP

2. Other systems reference the source
An e-commerce site retrieves the customer address from the CRM via API; it does not store it in its own database. This ensures all systems are updated simultaneously when a customer’s address changes.
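The reference pattern described above can be sketched in a few lines of Python. The class and method names here are illustrative, not a real CRM API: the point is that the consumer system holds no copy of the address and always reads the current value from the source.

```python
# Minimal in-memory sketch of the SSOT reference pattern: a consumer system
# queries the authoritative store instead of keeping its own copy.
# Class and method names are illustrative, not a real CRM API.

class CustomerSSOT:
    """Single authoritative store for customer master data (e.g. the CRM)."""

    def __init__(self):
        self._customers = {}

    def upsert(self, customer_id, **fields):
        self._customers.setdefault(customer_id, {}).update(fields)

    def get(self, customer_id, field):
        return self._customers[customer_id][field]


class EcommerceSite:
    """Consumer system: references the SSOT instead of storing an address copy."""

    def __init__(self, ssot):
        self._ssot = ssot

    def shipping_address(self, customer_id):
        # Always read the current value from the source of truth.
        return self._ssot.get(customer_id, "address")


crm = CustomerSSOT()
crm.upsert("C-1001", address="Old Street 1")
shop = EcommerceSite(crm)

crm.upsert("C-1001", address="New Street 7")  # address changes in the CRM
print(shop.shipping_address("C-1001"))        # prints New Street 7
```

Because the site stores nothing locally, the address change in the CRM is visible everywhere immediately, with no synchronization step.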

3. Synchronization is mandatory if copies exist
In some cases, data copies are kept for performance or offline work needs. In such situations, a real-time or scheduled synchronization mechanism is established.
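A scheduled, timestamp-based one-way synchronization of such a copy might look like the following sketch. The record layout and field names are assumptions for illustration:

```python
# Hedged sketch: scheduled one-way synchronization of a data copy kept for
# performance. Records changed in the source after the last sync are pushed
# to the replica. Record layout and field names are illustrative.

def sync(source, replica, last_sync_ts):
    """Copy records updated in `source` since `last_sync_ts` into `replica`.

    Both stores are dicts of id -> {"updated_at": int, ...fields}.
    Returns the ids that were synchronized.
    """
    changed = [rid for rid, rec in source.items()
               if rec["updated_at"] > last_sync_ts]
    for rid in changed:
        replica[rid] = dict(source[rid])  # overwrite the stale copy
    return changed


source = {"C-1": {"updated_at": 100, "discount": 20},
          "C-2": {"updated_at": 90,  "discount": 15}}
replica = {"C-1": {"updated_at": 80, "discount": 10},
           "C-2": {"updated_at": 90, "discount": 15}}

print(sync(source, replica, last_sync_ts=95))  # prints ['C-1']
```

Only records changed after the last sync are transferred, which keeps scheduled runs cheap; a real-time variant would push each change as it happens instead of polling on a schedule.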

Benefits of SSOT

  • Eliminates data inconsistency: The sales and finance teams see the same discount rate.
  • Increases trust: Reports are fed from a single source, making results indisputable.
  • Reduces data update costs: Updates are made from a single location.
  • Simplifies auditing: Data history is tracked in one place.
  • Speeds up decision-making: Time is not wasted asking “Which data is correct?”

The Cost of Data Inconsistency


Data inconsistency incurs operational and strategic costs.

In companies lacking data governance, data inconsistency directly creates costs in daily operations and strategic decisions:

Operational Costs

Duplicate record cleanup: The same customer is registered in 3 different systems with 8 different variations. Manual cleanup takes hours.

Data validation time: The sales team requests discount approval from the finance team. Finance doesn’t trust the discount in the CRM and checks their own Excel. Extra communication, extra time for every transaction.

Incorrect shipments: A customer’s address was updated in the e-commerce system but not synchronized with the logistics software. The product is shipped to the wrong address. Return costs, customer dissatisfaction.

Report preparation time: Preparing the monthly sales report takes 2 days because data needs to be collected from 4 different sources, matched, and inconsistencies resolved.

Strategic Costs

Inaccurate forecasts: Supply planning is done incorrectly due to inconsistent stock quantities. This leads to either excess inventory costs or lost sales due to stockouts.

Loss of trust in reports: Senior management is unsure about the accuracy of reports. The question “Where does this number come from?” is asked in every presentation. The decision-making process slows down.

Deteriorated customer experience: A customer calls the call center and is directed to their old address. They call again, complain. Brand value is damaged.

Compliance risks: During a GDPR audit, it’s unknown which system contains which customer data. If a customer requests data deletion, it needs to be deleted from 5 different systems, but it might be overlooked.


Master Data Management


Master data defines the core assets of an organization.

Master Data refers to data that defines an organization’s core assets and changes infrequently. Transactional data, by contrast, records the frequent operations performed on those assets, such as orders, invoices, and stock movements.

Master Data Categories

1. Customer
Customer name, tax ID, address, contact information, payment terms, price group, sales representative.

2. Product
Product code, name, category, unit, cost, price, supplier, technical specifications, description.

3. Supplier
Supplier name, tax ID, payment terms, delivery times, quality rating, risk level.

4. Location
Warehouse, branch, facility information. Address, capacity, authorized person, region, active/inactive status.

5. Employee
Employee code, name, department, title, permissions, cost center, manager.

Principles of Master Data Management

Centralized definition: Master data is defined in a single system (usually the ERP or an MDM solution). Other systems are fed from this source.

Approval process: An approval process is followed before defining new customers, products, or suppliers. For example, finance approval is required to add a new customer.

Standard format: A format is determined for each field. Phone: (05XX) XXX-XX-XX, Tax ID: 10-digit numerical.
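Format rules like these can be enforced at data entry with simple pattern checks. The regular expressions below are an illustrative reading of the formats mentioned above, not an official specification:

```python
# Illustrative validation of master data entry formats using regular
# expressions. The patterns are assumptions based on the formats in the text:
# phone (05XX) XXX-XX-XX, tax ID as 10 numeric digits.
import re

PHONE_RE = re.compile(r"^\(05\d{2}\) \d{3}-\d{2}-\d{2}$")
TAX_ID_RE = re.compile(r"^\d{10}$")

def validate_customer(record):
    """Return a list of format violations for one customer record."""
    errors = []
    if not PHONE_RE.match(record.get("phone", "")):
        errors.append("phone: expected format (05XX) XXX-XX-XX")
    if not TAX_ID_RE.match(record.get("tax_id", "")):
        errors.append("tax_id: expected 10 numeric digits")
    return errors


print(validate_customer({"phone": "(0532) 123-45-67",
                         "tax_id": "1234567890"}))   # prints []
print(validate_customer({"phone": "0532 1234567",
                         "tax_id": "12345"}))        # two violations
```

Running such checks at the point of entry, rather than during later cleanup, is what keeps the format standard from eroding over time.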

Regular cleanup: Master data cleanup is performed at least once a year. Duplicate or unused records are deleted or deactivated.

Version control: Master data changes are logged. It is tracked who changed what and when. If necessary, it can be reverted to a previous version.
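A minimal version of such change logging might look like this sketch. The record structure and helper names are illustrative:

```python
# Minimal sketch of master data change logging (audit trail): every change
# records who changed what, when, and the previous value, so changes can be
# reviewed or reverted. Structure and helper names are illustrative.
from datetime import datetime, timezone

audit_log = []

def update_field(record, field, new_value, user):
    """Apply a change and append an audit entry holding the old value."""
    audit_log.append({
        "field": field,
        "old": record.get(field),
        "new": new_value,
        "user": user,
        "at": datetime.now(timezone.utc),
    })
    record[field] = new_value

def revert_last(record):
    """Undo the most recent logged change."""
    entry = audit_log.pop()
    record[entry["field"]] = entry["old"]


product = {"price": 100}
update_field(product, "price", 120, user="jsmith")
revert_last(product)
print(product["price"])  # prints 100
```

In practice this lives in the ERP or MDM tool rather than application code, but the principle is the same: no master data change without a who, what, and when.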


Dimensions of Data Quality: Accuracy, Completeness, Consistency, Timeliness


Data quality is measured and tracked across 4 dimensions.

Data quality is broader than the question “Is the data correct?” It is evaluated across 4 dimensions:

1. Accuracy

Definition: How well the data aligns with reality.

Example: Is the phone number recorded in the system, 0532-XXX-XX-XX, still the customer’s current number, or is it an old one?

Measurement: 100 customer phone numbers are randomly dialed. The number of correct and incorrect ones is checked. Accuracy rate = number of correct records / total number of records.

Target: 95%+ accuracy rate.

2. Completeness

Definition: The extent to which mandatory fields are filled.

Example: Tax ID, address, and phone number are mandatory for customer records. Out of 1000 customers, 850 have a tax ID, and 150 do not. Completeness rate = 85%.

Measurement: The completeness of mandatory fields is queried automatically. Measured separately for each data set.

Target: 100% completeness for critical fields, 90%+ for others.
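The automated completeness query can be sketched as follows, using the mandatory fields from the example above (the record layout is illustrative):

```python
# Sketch of the automated completeness query: the share of records in which
# every mandatory field is filled. Field names follow the text's example.

def completeness_rate(records, mandatory_fields):
    """Share of records in which every mandatory field is non-empty."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in mandatory_fields)
    )
    return complete / len(records)


customers = [
    {"name": "ABC", "tax_id": "1234567890", "address": "Istanbul"},
    {"name": "DEF", "tax_id": "", "address": "Ankara"},   # missing tax ID
]
print(completeness_rate(customers, ["tax_id", "address"]))  # prints 0.5
```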

3. Consistency

Definition: The same piece of data having the same value across different systems.

Example: Customer XYZ’s discount rate is 20% in CRM, 15% in ERP, and 18% on the e-commerce site. There is inconsistency.

Measurement: The same data set is retrieved from different systems and compared. Consistency rate = number of matching records / total number of records.

Target: 98%+ consistency rate. With SSOT in place, consistency approaches 100%.
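A cross-system consistency check along these lines might look like this sketch, where the system contents are illustrative:

```python
# Sketch of a cross-system consistency check: the same field is fetched from
# two systems and the share of matching records is computed.

def consistency_rate(system_a, system_b, field):
    """Share of shared record ids for which `field` matches in both systems."""
    shared = system_a.keys() & system_b.keys()
    if not shared:
        return 0.0
    matching = sum(1 for rid in shared
                   if system_a[rid][field] == system_b[rid][field])
    return matching / len(shared)


crm = {"XYZ": {"discount": 20}, "ABC": {"discount": 10}}
erp = {"XYZ": {"discount": 15}, "ABC": {"discount": 10}}
print(consistency_rate(crm, erp, "discount"))  # prints 0.5, XYZ disagrees
```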

4. Timeliness

Definition: How up-to-date the data is, and the delay time.

Example: A customer’s address changed. The CRM was updated within 5 minutes, but the logistics system was updated only after 2 days, a delay of nearly 48 hours.

Measurement: The difference between the time of change and the time of synchronization is measured. Average delay time is calculated.

Target: Real-time synchronization for critical data, <1 hour delay for others.
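Measuring the gap between change and synchronization timestamps can be sketched as follows (the timestamps are illustrative):

```python
# Sketch of the timeliness measurement: compare change timestamps in the
# source with synchronization timestamps in a downstream system and compute
# the average delay. Timestamps are illustrative.
from datetime import datetime, timedelta

def average_delay(changes):
    """changes: list of (changed_at, synced_at) datetime pairs."""
    delays = [synced - changed for changed, synced in changes]
    return sum(delays, timedelta()) / len(delays)


changes = [
    (datetime(2026, 1, 15, 9, 0), datetime(2026, 1, 15, 9, 5)),   # 5 minutes
    (datetime(2026, 1, 15, 9, 0), datetime(2026, 1, 17, 9, 0)),   # 2 days
]
print(average_delay(changes))  # average of 5 minutes and 48 hours
```

Averages can hide one badly lagging system, so per-system maximum delay is usually tracked alongside the average.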


Data Ownership and Responsibility Matrix

The most critical element of data governance is the clear definition of ownership for each data set. Data ownership resides with the relevant business unit, not IT. IT only provides the technical infrastructure.

Data Ownership Example Matrix

Data Set             | Data Owner         | Data Steward          | Technical Owner
Customer Master Data | Sales Director     | CRM Manager           | IT
Product Master Data  | Product Manager    | Product Specialist    | IT
Supplier Master Data | Purchasing Manager | Purchasing Specialist | IT
Price List           | Finance Director   | Pricing Specialist    | IT
Stock Quantity       | Logistics Manager  | Warehouse Manager     | IT
Employee Data        | HR Manager         | HR Specialist         | IT

Roles and Responsibilities

Data Owner: Makes decisions regarding data quality, standards, and access rights. Defines business rules. Example: “Customer tax ID is mandatory.”.

Data Steward: Performs daily data quality checks. Identifies and corrects errors, or ensures they are corrected. Provides data entry training.

Technical Owner (IT): Responsible for databases, integrations, backups, security, and performance. Technically implements business rules but does not define them.


Real-World Case Study: Multi-Channel Retail Company


Situation

A medium-sized retail company with 12 branches in Turkey, an e-commerce site, and a B2B order platform. Customer data was maintained in 4 different places: the store POS system, the e-commerce platform, the B2B portal, and the finance ERP. The same customer was registered with 4 different codes and different discount rates.

Problem

  • Customers saw different prices when ordering from a store and the website on the same day.
  • Loyalty points were only valid in-store and not online.
  • The finance team manually consolidated data from 4 sources to perform customer risk analysis (a 3-hour task).
  • Customers received campaign SMS messages twice (from different records).

Steps Taken

  1. Month 1: Data inventory was conducted. 28,000 customer records across 4 systems were analyzed. 18% were identified as duplicates.
  2. Month 2: The CRM system was designated as the SSOT. Customer master data standards were defined (format, mandatory fields, approval process).
  3. Month 3: Duplicate records were cleaned up, and the 4 systems were integrated with the CRM. POS, e-commerce, and B2B systems began retrieving customer information from the CRM.
  4. Months 4–6: A data ownership matrix was created. The Sales Director became the owner of customer data. Weekly data quality reports were initiated.

Result (Representative)

  • Customer data consistency: 64% → 97%
  • Duplicate record rate: 18% → 1.2%
  • Risk analysis preparation time: 3 hours → 10 minutes
  • Campaign SMS duplication: 12% → 0.5%
  • Loyalty points became usable across all channels.

7 Data Governance Mistakes

1. Viewing Data Governance as an IT Project

Data governance is not an IT responsibility but a business unit responsibility. IT only provides the infrastructure. Business units define the business rules.

2. Ignoring SSOT

The approach of “Each system keeps its own data, we’ll match them when needed.” Result: Constant inconsistency, loss of trust, manual matching costs.

3. Not Measuring Data Quality

The assumption that “data is generally correct.” Accuracy, completeness, consistency, and timeliness are never measured. Problems are only noticed when they become significant.

4. Not Defining Data Ownership

The approach of “Everyone can modify any data.” Result: Ambiguity of responsibility, no quality control, no one to hold accountable for errors.

5. Neglecting Master Data

Customer, product, and supplier data are not managed centrally. Each department maintains its own list. Synchronization is impossible due to lack of integration.

6. Not Enforcing Data Entry Rules

Phone numbers are entered as “0532…” in one place, “+90 532…” in another, and “532…” elsewhere. There is no format standard, leading to low data quality.

7. Not Performing Regular Cleanup

Duplicate records, unused records, and old data remain in the system for years. Data pollution increases, and query performance degrades.



Data Quality KPI Table

Key metrics used to measure the success of data governance:

Metric                                    | Baseline | Target      | Measurement Method
Data Accuracy Rate                        | 73%      | >95%        | Random sampling, manual check
Data Completeness Rate (Mandatory Fields) | 65%      | 100%        | Automated query (count of empty fields)
Data Consistency Rate (Across Systems)    | 58%      | >98%        | Cross-system comparison query
Data Synchronization Delay                | 24 hours | <15 minutes | Timestamp analysis
Duplicate Record Rate                     | 15%      | <2%         | Fuzzy matching algorithm
Master Data Update Time                   | 48 hours | <2 hours    | Approval process logging
Data Quality Incidents (Monthly)          | 120      | <15         | Incident tracking system
Data Access Control Compliance Rate       | 55%      | 100%        | Authorization matrix audit
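The fuzzy matching used to estimate the duplicate record rate can be illustrated with the standard library's difflib.SequenceMatcher. Production systems typically use dedicated matching tools; the threshold here is an assumption:

```python
# Sketch of duplicate detection via fuzzy matching on customer names.
# difflib.SequenceMatcher (standard library) is used for illustration; the
# 0.85 similarity threshold is an assumption, not a recommended value.
from difflib import SequenceMatcher

def likely_duplicates(names, threshold=0.85):
    """Return pairs of names whose normalized similarity meets `threshold`."""
    pairs = []
    normalized = [n.lower().strip() for n in names]
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            score = SequenceMatcher(None, normalized[i], normalized[j]).ratio()
            if score >= threshold:
                pairs.append((names[i], names[j], round(score, 2)))
    return pairs


customers = ["ABC Trading Ltd.", "ABC Trading Ltd", "XYZ Logistics"]
print(likely_duplicates(customers))  # the two ABC variants are flagged
```

Flagged pairs are candidates for review, not automatic merges: a data steward confirms which record survives before anything is deleted or deactivated.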

Data Governance Checklist

Items to check when establishing a data governance program:

Policy and Organization

  • Is the data governance policy documented?
  • Is there senior management sponsorship?
  • Has a data governance committee been established?
  • Has a data owner been assigned for each data set?
  • Have data stewards been identified?

Data Standardization

  • Have master data categories been defined (customer, product, supplier…)?
  • Has the SSOT been determined for each data set?
  • Have data entry formats been standardized?
  • Have mandatory fields been defined?
  • Has a data dictionary been created?

Data Quality

  • Are data quality dimensions (accuracy, completeness, consistency, timeliness) being measured?
  • Have data quality KPIs been defined?
  • Are periodic data quality reports being generated?
  • Has a data cleansing process been defined?
  • Is there a mechanism for detecting and deleting duplicate records?

Technical Infrastructure

  • Has data integration between systems been established?
  • Is the data synchronization mechanism active?
  • Have data access permissions been defined?
  • Is data change logging (audit trail) active?
  • Is there a data backup and restore procedure?

Process and Improvement

  • Has the master data approval process been defined?
  • Are data quality incidents being tracked?
  • Are regular data quality meetings being held (monthly)?
  • Are data governance training sessions being provided?
  • Is a continuous improvement mechanism active?

Frequently Asked Questions (FAQ)

What is SSOT (Single Source of Truth)?

SSOT (Single Source of Truth) is the principle that there should be a single authoritative source for any given piece of data within an organization. For example, customer addresses are maintained only in the CRM, and stock quantities only in the ERP. Other systems reference these sources but do not create copies. This prevents data inconsistency, increases trust, and reduces update costs.

What problems does data inconsistency cause?

Data inconsistency (the same data differing across systems) leads to decisions based on incorrect information, loss of trust in reports, duplicate records, operational time loss (manual data matching and validation), incorrect shipments and invoicing, and audit and compliance risks. For instance, the sales team might see a 20% discount in the CRM, while finance sees 15% in Excel.

What is the difference between master data and transactional data?

Master data rarely changes and defines the core assets of an organization (customers, products, locations, suppliers). Transactional data, on the other hand, is the result of frequent business processes (orders, invoices, payments, stock movements). Master data is managed centrally following the SSOT principle; transactional data is distributed across operations. Example: “ABC Company” is master data; “ABC Company’s order dated 01/15/2026” is transactional data.

How is data quality measured?

Data quality is measured across 4 dimensions: Accuracy (alignment with reality, checked by random sampling), Completeness (filling of mandatory fields, automated query), Consistency (alignment across systems, comparison query), and Timeliness (synchronization delay, timestamp analysis). Targets are set for each dimension (e.g., 95%+ accuracy, 100% completeness), and periodic measurements are taken.

Who should own the data, IT or the business units?

Data ownership should reside with the relevant business unit, not IT. For example, customer data is the responsibility of sales, product data of product management, and supplier data of purchasing. IT only provides the technical infrastructure; business units define the data rules (format, mandatory fields, approval process). The Data Owner makes decisions, and the Data Steward performs daily checks.

Can small companies also implement data governance?

Yes, but the scale is simplified. For a company of 10-50 people, a simple Excel Master Data template, data entry rules, and weekly quality checks are sufficient instead of a heavy MDM (Master Data Management) solution. The key is to apply the principles: define SSOT, assign data ownership, and perform regular cleanups. In large companies, this requires a full-time team; in small companies, it can be a part-time responsibility.


About the Author

Koray Çetintaş is an advisor specializing in digital transformation, ERP architecture, process engineering, and strategic technology leadership. He applies a "Strategy + People + Technology" approach shaped by hands-on experience in AI, IoT ecosystems, and industrial automation.

Get Support for Your Project

I can help guide your digital transformation initiative. Book a free preliminary call to discuss your priorities.