VNSG Blog

Maximise SAP S/4HANA value creation and support the AI revolution with Clean Master Data

Written by VNSG | 23-5-2024

Master Data Management (MDM) has evolved from being a unifier of basic business information to a critical differentiator, enabling transparency, agility, and sound decision-making in a dynamic business landscape. It provides a single, trusted view of a business by offering a common reference platform for key attributes of customers, suppliers, products, materials, locations, and pricing, among others. Business and IT need to work in tandem to ensure not only uniform standards and accountability but also robust governance and stewardship of this core enterprise data.


SAP’s Master Data Governance (MDG) application helps to pull master data together and manage it centrally using a master data management layer. In a very engaging discussion with Gerd Danner, VP EMEA COE for SAP Business Technology Platform at SAP with 20 years’ experience in data management, we explore the evolution and criticality of master data, especially in the context of an SAP S/4HANA transformation.

Gerd is responsible for EMEA solution advisory, demand generation, customer adoption and support for SAP Business Technology Platform (SAP BTP) with a focus on data management and master data management. One of his priority areas is ‘Rise with BTP’ which aims to help customers strategically combine their SAP S/4HANA transformation journey with SAP BTP. Gerd passionately urges enterprises to reap the full benefits of an S/4 transformation by treating this as an opportunity to get their master data in order. From his extensive experience in this field, he brings to life several practical examples and business scenarios to highlight the importance of master data as a foundational game changer.

 

The earned value of Master Data Management lies in driving agility via real-time business intelligence

 

Data is the starting point. Four out of ten transformation initiatives fail due to the absence of a robust data management strategy. Recalling his long history with master data since 2008, Gerd says ‘the importance of data management in today's world is probably bigger than it’s ever been... that's where my passion comes from and what drives me on.’ He goes on to quote an enterprise architect who described a not-so-distant future where business services will be modular and available on demand in the cloud. This would allow enterprises to freely compose end-to-end business processes without caring about the provider of those services. That vision works only if master data is aligned and harmonized.

Gerd goes on to illustrate the value and criticality of this basic but often overlooked area by drawing on instances from various parts of the business and translating them into tangible business value. He cites the example of a manufacturing company where the Chief Procurement Officer faced excess inventories in some factories and out-of-stock situations in others. Moreover, purchasing seemed to be ineffective, spending too much on certain spare parts and spare part categories. The hypothesis was that data was one of the underlying root causes, and they needed help to evaluate the situation and build a business case for action. The first step was to understand how the end-to-end procure-to-pay business process works. It starts with supplier self-registration, followed by supplier qualification, contract negotiation, buying and finally longer-term supplier evaluation. For this business process, SAP Ariba needs to talk about the same supplier as the SAP S/4HANA system. ‘While this sounds simple, in reality it is super complex,’ says Gerd. On exploring further, they found inadequate linkages between the two platforms. This resulted in a severe inability to negotiate future contracts due to a lack of information on historical spending with a given supplier, and ultimately in enormous maverick spending: a business case of 12 million euro in savings, part of which would recur over the years.

In another example, a company found multiple cases of duplicate and obsolete spare parts in the ERP, implying that they were also present in the data warehouse as well as in the physical warehouse. This directly meant too much blocked capital in their supply chain.

On the finance side, it can be similarly challenging if an enterprise runs its operational finance on a different platform than its consolidation platform, especially in a very diverse, heterogeneous organization. Central finance only works efficiently if all the legacy data in the operational finance systems is mapped to a common language. If not, manual reconciliation of the reporting data that feeds into the consolidated financial report can easily take hundreds of man-days each closing period. With robust data management, this non-value-adding effort can be eliminated and the time to close the books drastically reduced.
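The "common language" idea can be sketched in a few lines of Python. This is a deliberately simplified illustration (the account codes, system names and structures are invented, not SAP's actual data model): legacy chart-of-accounts codes from several operational ERPs are mapped to one group chart before consolidation, and anything unmapped falls out for manual review.

```python
# Illustrative sketch (invented account codes): mapping legacy
# chart-of-accounts codes from several operational ERPs to one
# group chart of accounts before consolidation.

account_map = {
    ("ERP_NL", "400100"): "G-REV-01",  # NL sales revenue
    ("ERP_DE", "8400"):   "G-REV-01",  # DE sales revenue, same group account
    ("ERP_DE", "4980"):   "G-EXP-07",  # DE operating expense
}

postings = [
    {"system": "ERP_NL", "account": "400100", "amount": 1200.0},
    {"system": "ERP_DE", "account": "8400",   "amount": 800.0},
    {"system": "ERP_DE", "account": "9999",   "amount": 50.0},  # unmapped
]

def consolidate(postings, account_map):
    """Sum postings per group account; unmapped lines go to manual review."""
    totals, unmapped = {}, []
    for p in postings:
        group = account_map.get((p["system"], p["account"]))
        if group is None:
            unmapped.append(p)
        else:
            totals[group] = totals.get(group, 0.0) + p["amount"]
    return totals, unmapped

totals, unmapped = consolidate(postings, account_map)
print(totals)          # {'G-REV-01': 2000.0}
print(len(unmapped))   # 1 line needing manual reconciliation
```

Every unmapped line in a real landscape is exactly the kind of manual reconciliation effort the article describes; the business case for mapping the data at source is the elimination of that review queue.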

Analytics is another example that showcases the increasing value of good master data. The biggest value S/4HANA brings over earlier ERPs is its embedded intelligence: analytics, reporting, and intelligent recommendations leveraging embedded AI and machine learning. Gerd brings the point home with a situation-handling case of goods receipt-invoice receipt reconciliation, where ideally the quantity of goods received on the loading bay should match the quantity stated in the invoice. In case of a difference, decisions must be made and appropriate actions initiated. Here embedded reporting can only be effective if there is high-quality data, as opposed to earlier days when ERP data was extracted, loaded into a data warehouse and fixed or validated via business rules in nightly batch jobs to produce a high-quality report. Now, with embedded reporting available within the S/4HANA system, the report needs to be real-time and first time right.
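The reconciliation logic itself is simple; the hard part is the data quality behind it. A minimal sketch, assuming a hypothetical data model (the purchase order numbers, field names and matching on PO plus material are invented for illustration, not S/4HANA's actual tables):

```python
# Minimal sketch of goods receipt vs. invoice receipt reconciliation:
# match lines on purchase order and material, flag quantity differences
# for follow-up instead of fixing them in a nightly batch job.

goods_receipts = [
    {"po": "4500001", "material": "M-100", "qty_received": 100},
    {"po": "4500002", "material": "M-200", "qty_received": 40},
]
invoices = [
    {"po": "4500001", "material": "M-100", "qty_invoiced": 100},
    {"po": "4500002", "material": "M-200", "qty_invoiced": 50},
]

def reconcile(receipts, invoices):
    """Return invoice lines whose quantity differs from the goods receipt."""
    received = {(r["po"], r["material"]): r["qty_received"] for r in receipts}
    mismatches = []
    for inv in invoices:
        key = (inv["po"], inv["material"])
        diff = inv["qty_invoiced"] - received.get(key, 0)
        if diff != 0:
            mismatches.append({"po": inv["po"], "difference": diff})
    return mismatches

print(reconcile(goods_receipts, invoices))
# [{'po': '4500002', 'difference': 10}]
```

Note what happens if master data is dirty: a duplicated material code or a supplier keyed differently in the two systems makes the lookup key miss, and every such line shows up as a false mismatch. Real-time embedded reporting has no nightly batch job to paper over that.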

Finally, on the customer side, businesses face the challenge of multiple codes for the same customer. Gerd emphatically notes ‘in every SAP system I've seen, the same customer is sitting in there 5, 10, 20 times and in an extreme case I saw it was 100 times! Now try to do customer credit risk management with this sort of data…difficult.’ Some of this duplication can be attributed to the way SAP was set up, where a customer could have only one address. To mitigate this, partner functions were introduced, enabling the same customer to have a ‘sell to’, ‘bill to’ or ‘ship to’ address. In essence, each of these partner functions has an individual customer code in the ERP, even though they represent the same customer. A comprehensive data management system takes care of these issues and is thus an essential gateway to clean data and to the visibility needed for efficient planning and decision-making.
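The core of duplicate detection is normalization before comparison. The sketch below is purely illustrative and not SAP MDG's actual matching logic (which uses more sophisticated fuzzy matching): it collapses case, punctuation and whitespace into a match key, so that ‘Acme GmbH’ and ‘ACME GmbH ’ group together.

```python
# Illustrative duplicate detection (not MDG's actual algorithm):
# normalise name and city into a match key, then group records
# that share a key.

import re
from collections import defaultdict

customers = [
    {"id": "C001", "name": "Acme GmbH",  "city": "Berlin"},
    {"id": "C002", "name": "ACME GmbH ", "city": "berlin"},
    {"id": "C003", "name": "Beta B.V.",  "city": "Utrecht"},
]

def normalise(record):
    """Collapse case, punctuation and whitespace into a comparison key."""
    name = re.sub(r"[^a-z0-9]", "", record["name"].lower())
    city = record["city"].strip().lower()
    return (name, city)

def find_duplicates(records):
    """Return groups of record ids that share the same normalised key."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalise(rec)].append(rec["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

print(find_duplicates(customers))  # [['C001', 'C002']]
```

Production matching tools add phonetic matching, address standardization and confidence scores on top of this idea, but the principle is the same: only after such a grouping can credit exposure be aggregated per real customer rather than per customer code.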

SAP S/4HANA transition - a once in a lifetime opportunity to get your Data Management in order

The longer enterprises wait to get master data right, the more difficult it gets, especially for those with a longer ERP history. Gerd cites the story of a customer with an ERP history of 15-20 years who had implemented only three material types in their material master. While 20 years ago this was sufficient to understand their inventory positions, today, since material valuation, and hence the worth of the inventory in the finance system, is directly linked to the material type in SAP, it no longer produces accurate results. As a solution, they wanted to introduce new material types and reassign existing materials to them. Given his experience, Gerd warns ‘again, while this sounds simple, it is impossible…you cannot realistically change certain things in a live production ERP’. Though SAP offers services like Data Management Landscape Transformation (DMLT) to address these types of situations, ‘it is often like open heart surgery with partial narcosis…the patient is not really asleep and you are doing crazy stuff in his body’.

He goes on to firmly reiterate ‘the move to S/4HANA is a once in a lifetime opportunity from a data perspective. Treat it like this’. He further urges companies to avoid seemingly quick and easy routes like in-place Brownfield conversions, where the data stays as-is and just gets adjusted to the new SAP S/4HANA tables. He advises ‘the minimum from a data management perspective should be a Bluefield approach to SAP S/4HANA, where an empty copy of the system is created followed by carefully selected data transfer, including a thorough data cleansing and harmonizing activity. But in most cases, it is probably better to go Greenfield’. While at first sight this is a difficult choice for customers who have already invested a lot of time in their current data, viewed from an end-to-end cost of ownership perspective, Gerd confidently argues for a positive return on that incremental investment after three years.

Evolution of Master Data Governance (MDG), SAP’s flagship product

Today, the cost of failure, be it process, analytics or even AI failure due to the absence of clean data, is extremely high. To address this, Gerd promotes a comprehensive approach: ‘you need to look at your data governance framework, define your strategy, set up your organization and your processes and get your tooling in place’. Zooming in on the tooling, MDG continues to be SAP’s flagship product. It has evolved over the last 15 years from laying the foundations of central data governance to now also providing master data consolidation and data quality management capabilities. Gerd goes on to highlight these capabilities with simple business examples.

Firstly, MDG is now more efficient, enabling master data to be created via a workflow as opposed to being created centrally in the ERP and then sent across the technology landscape. This not only enables multiple relevant people to enter attributes, but also offers automation, derivation, and auditing possibilities.

Secondly, MDG’s consolidation capabilities are very valuable, especially when external data needs to be infused into the master data landscape. One typical use case is a consumer company selling its products not only directly but also via channel partners, where the data of new customers provided by these partners needs to be loaded on a regular basis. Another case comes from the material master, where an external file upload brings in a catalog of thousands of items that can potentially be ordered from a supplier. In such cases, MDG’s consolidation capabilities prevent the creation of duplicates, ensuring clean, rationalized data at source.

Thirdly, MDG’s data quality management capabilities help to create data quality dashboards based on validations and business rules defined in the tool. These help to identify which records don't meet the standards and to trigger quality remediation measures via the workflow.
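The pattern behind such dashboards can be sketched simply. This is a hypothetical illustration (the material fields and rule names are invented, and MDG defines rules declaratively rather than in Python): each business rule flags records that fail a standard, and the per-rule pass rate is what a dashboard tile would display, with the failing record ids feeding the remediation workflow.

```python
# Hypothetical sketch of rule-based data quality checks: each rule
# flags records that fail a standard; the pass rate feeds a dashboard
# and the failing ids feed a remediation workflow.

materials = [
    {"id": "M1", "material_type": "FERT", "base_unit": "PC"},
    {"id": "M2", "material_type": "",     "base_unit": "PC"},
    {"id": "M3", "material_type": "ROH",  "base_unit": ""},
]

rules = {
    "material_type_filled": lambda m: bool(m["material_type"]),
    "base_unit_filled":     lambda m: bool(m["base_unit"]),
}

def quality_report(records, rules):
    """Per rule: pass rate plus the offending record ids for remediation."""
    report = {}
    for name, check in rules.items():
        failing = [r["id"] for r in records if not check(r)]
        report[name] = {
            "pass_rate": 1 - len(failing) / len(records),
            "failing": failing,
        }
    return report

for rule, result in quality_report(materials, rules).items():
    print(rule, result)
```

The design point is that rules return the failing records, not just a score: a dashboard number alone doesn't improve data, but a worklist of records routed into a governance workflow does.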

Finally, SAP MDG is moving to the public cloud, driven by a strong customer push towards cloud. A BTP version of MDG, the MDG Cloud edition, is already available and under further development; it offers a subset of the capabilities of the MDG that runs on S/4HANA in private cloud or on premise. Most recently, with MDG Cloud Ready Mode launched in Q4 2023, SAP is bringing these existing code lines for private and public cloud together into a common code line. This implies a customer will eventually be able to select MDG private or public cloud based on the type of cloud they are on.

In addition to these base MDG capabilities, SAP partners have developed extension solutions for MDG, sold via the SAP Store. For example, to cater to customers’ special needs for governance of reference data, Itego has built business content on top of MDG. To explain further, Gerd outlines three layers of data. At the bottom is reference data, which doesn't normally change much: country codes, currency codes, payment terms and the like. It is configured once during the S/4 implementation and then only periodically checked, perhaps once a year. The second layer is master data, comprising materials, customers, suppliers and so on, which changes more often. The top layer is transaction data, like sales or purchase orders, which changes most frequently. Even the reference data often needs fixing: Gerd distinctly recalls a customer operating across 13 ERP systems, with every ERP system having different country codes. Here Itego’s reference data management solutions would be very useful. In other cases, where the data quality issues concern an object requiring less governance, SAP partners with Mendix to offer a faster no-code, low-code solution.

Another capability is CDQ First Time Right, a solution that connects to publicly available reference registers, such as tax office or chamber of commerce data, as well as commercial data providers. Based on pre-built interfaces to SAP MDG, it accelerates customer or vendor creation in SAP MDG while improving productivity and data quality.

Adopting a transformational approach to Data Management

Organizations can deploy frameworks like Gartner's 7 building blocks of a Data Governance Initiative to improve data management and governance. SAP has also collaborated with the Business Engineering Institute, a spin-off of the University of St. Gallen, to develop best practices and approaches from a project methodology perspective, covering the organization, data strategy, processes, tooling, and architecture for data and data integration. One approach is to start with a maturity assessment to analyze where the organization stands on those key dimensions compared to industry peers; this is a good indicator of where to start investing.

However, one of the biggest challenges when starting is getting the business case approved. Data quality issues usually surface at the IT department as help desk inquiries when something breaks down. Citing his experience, Gerd says ‘in 99.9% of cases, it is IT coming to us and the biggest problem is if IT thinks they can solve it themselves without the business involved, then they are most likely going to fail’. Good data management and data quality rules require business involvement, and change management needs business buy-in, so identifying a business sponsor is imperative. Gerd points out ‘usually horizontal functions like purchasing or finance are a good place to start, where you need to make sense of all the data that is coming from your organization. And that is also where you very soon realize that your data is not that good. If you can build a compelling value story, CFOs and CPOs get it’.

Another way to identify issues is through business process analysis tools that detect variants of and deviations from a process, which often occur due to bad master data. SAP’s Signavio Process Insights and the business scenario recommendation report help customers make a decision on SAP S/4HANA by showing the value potential of the investment. To begin, they analyze the process and compare its performance with industry benchmarks on key metrics and their root causes; for example, the time to close the books is directly impacted by the efficiency of accounts payable or accounts receivable. A deep dive quite often shows poor-quality data as a root cause. Further, MDG QuickStart can be used to analyze the quality of data from different sources and along different dimensions to get both a qualitative and quantitative assessment.

A smart way to garner business support and sponsorship is to leverage the S/4 transformation as an opportunity to accelerate the master data and data management strategy agenda. ‘With the business sitting at the table, use that attention to also drive the data management goals with linkage to success of the transformation journey’ advises Gerd.

Generally, customers want to move quickly to SAP S/4HANA while keeping costs low and disruption to the business minimal. Essentially, it is often a dilemma between a quick technical ‘lift and shift’ followed by incremental data improvements, and a comprehensive transformational approach that starts out with clean data at source and rethinks how key business processes run.

Pragmatically, the guidance is to take a parallel two-track approach. On one track, strategize, narrow down a business problem to be solved and set up the organization accordingly. On the second track, get a turnkey MDG system (e.g. MDG QuickStart) into the landscape, where in as little as three weeks a fully functional MDG with a standard configuration is in place, though not yet customized to specific needs. This provides a sandbox for testing, experimenting and figuring out how things really work in practice before initiating a data migration.

The cost of a speedy but incorrect data migration keeps increasing exponentially, making it ever more difficult to correct in the future. If, due to time and business constraints, it is not possible to correct the data at source during the migration, Gerd strongly recommends at a minimum using master data consolidation or the master data quality dashboards to improve data quality for analytics and downstream processes, fixing the issues after the fact.

The future of Master Data Governance powered by Generative AI

Talking about the future and what is coming with the advent of AI, Gerd is quick to point out that the use of AI in MDG is not new: SAP has already deployed AI in data quality management to suggest sensible business rules for MDG. On generative AI, however, Gerd says ‘we are at the beginning of one of the biggest revolutions in SAP, this is going to massively change how you work and interact with SAP systems’. New use cases with MDG are being explored and planned for implementation in the next 12-15 months, essentially around faster creation and updating of rules, faster data entry and automation of record creation. Examples include SAP’s generative AI-powered digital assistant Joule working together with MDG; automatically populating and validating a data entry form for a new customer when a salesperson simply types in a customer name and address; and combining data quality validations in SAP MDG with Signavio Process Insights reports to produce a business case for an SAP MDG investment. Gerd concludes ‘that’s a key reason why I think data, especially in S/4HANA transformations, is of utmost importance…and with the whole AI revolution that we're seeing ahead of us we will also soon notice that what an AI engine generates or predicts is only as good as the data that sits in the system.’

Tips and information sources for Master Data

Master data being such a wide topic, Gerd advises starting at the SAP website and zooming in on SAP MDG. There is also an active community with blogs and practical information for users. Moreover, user groups like the VNSG, DSAG and ACG have MDG interest sub-groups with active discussions and shared resources. Finally, external organizations like the Business Engineering Institute of the University of St. Gallen run competency centers around data management. Being Europe-based, all the members of this centre are SAP customers, and most of them use MDG. Hence, it is a good platform for exchanging experiences and learnings with peers.

_____________________________________________________________________

  • Interviewee: Gerd Danner, VP EMEA COE for SAP Business Technology Platform, SAP Germany (SAP Deutschland SE & Co. KG)
  • Interviewer: Frans van Hoesel, VNSG
  • Images: Daniela Gaschler, VNSG
  • Article: Radhika Gupta, Business and Digital Transformation Advisor & Writer

_____________________________________________________________________

Join VNSG Focus Group Data Management and elevate your Master Data Governance to the next level!

The VNSG Data Management (DM) focus group aims to enhance Data Management within organizations by concentrating on three main areas: Data Quality, Data Management, and Master Data Governance. The group provides a platform for VNSG members to elevate their Data Management practices through:

  • Exchanging ideas and information
  • Sharing and enriching knowledge
  • Offering best practices from external parties
  • Support through networking
  • Acting as a liaison and sparring partner with SAP (Walldorf)

Professionals involved in creating, managing, consolidating, and distributing data are welcome to join this focus group. This includes those from user organizations, architects, data managers, information managers, and functional application managers.