Challenges
Master data organisation at Hagos eG
The master data of Hagos eG was partially redundant and spread across different systems, and the existing maintenance processes were time-consuming and inefficient.
New master data, including changes and updates, was forwarded to the next processor as lists in email attachments. This process was error-prone and opaque, and it frequently caused issues in the downstream databases and systems. Data changes were difficult to track because logging was neither standardised nor automated. Data was not distributed in real time, which delayed the maintenance process and created the risk of working with stale data.
Solution
To provide the data to the different sub-systems in a suitable form, a central data model was designed.
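The case description does not specify the actual schema; as an illustration only, a central data model can be sketched in Python as one authoritative record per master data object, from which each sub-system derives its own view (all names and attributes here are hypothetical):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical canonical record: one authoritative entry per master data
# object, from which every sub-system (ERP, shop, logistics, ...) is supplied.
@dataclass
class MasterRecord:
    record_id: str                # single unique key across all systems
    attributes: dict[str, str]    # canonical attribute values
    version: int = 1              # incremented on every approved change
    updated_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def view_for_subsystem(record: MasterRecord, fields: list[str]) -> dict[str, str]:
    """Derive a sub-system-specific view from the central model so each
    consumer receives exactly the attributes it needs, from one source."""
    return {name: record.attributes[name] for name in fields if name in record.attributes}

# Usage: supply a (fictitious) ERP with the subset of attributes it uses.
item = MasterRecord("MAT-0001", {"name": "Flue pipe", "unit": "pcs", "ean": "4001234567890"})
print(view_for_subsystem(item, ["name", "unit"]))  # {'name': 'Flue pipe', 'unit': 'pcs'}
```

The point of such a model is that redundant copies in the sub-systems become derived views rather than independently maintained data.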
A multi-stage supplementation and release workflow was developed to standardise the maintenance processes. Stakeholders defined key figures for the data quality criteria, and the corresponding checks were integrated directly into the maintenance process.
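The source does not describe how the workflow was implemented; as a minimal sketch, a multi-stage release workflow can be modelled as a small state machine in which a record must pass every stage before it is distributed (stage names assumed for illustration):

```python
from enum import Enum

class Stage(Enum):
    """Stages of a hypothetical multi-stage supplementation and release workflow."""
    DRAFT = "draft"                      # new or changed record captured
    SUPPLEMENTED = "supplemented"        # enriched by the responsible department
    QUALITY_CHECKED = "quality_checked"  # automated quality rules passed
    RELEASED = "released"                # approved and distributed to sub-systems

# Allowed forward transitions; a record cannot skip a stage.
NEXT_STAGE = {
    Stage.DRAFT: Stage.SUPPLEMENTED,
    Stage.SUPPLEMENTED: Stage.QUALITY_CHECKED,
    Stage.QUALITY_CHECKED: Stage.RELEASED,
}

def advance(stage: Stage) -> Stage:
    """Move a record to the next workflow stage; released records are final."""
    if stage is Stage.RELEASED:
        raise ValueError("record is already released")
    return NEXT_STAGE[stage]

# Usage: a draft record must pass every stage before distribution.
s = Stage.DRAFT
while s is not Stage.RELEASED:
    s = advance(s)
print(s)  # Stage.RELEASED
```

Enforcing the transitions in code is what replaces the earlier email-based hand-offs: a record cannot reach the sub-systems without passing the quality gate.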
The master data maintenance process is now workflow-supported within the system, and manual hand-offs between systems have been eliminated. To keep data quality consistently high, evaluation options and data cleansing functionalities have been introduced. Based on the requirements concept (specifications), a master data management tool was selected and introduced.
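The evaluation options are not detailed in the source; one common data quality key figure is attribute completeness, sketched below with fictitious sample data and field names:

```python
def completeness(records: list[dict[str, str]], required: list[str]) -> float:
    """Share of records in which every required attribute is filled in.
    One typical key figure for a data quality report (names hypothetical)."""
    if not records:
        return 1.0
    filled = sum(
        all(rec.get(name) not in (None, "") for name in required)
        for rec in records
    )
    return filled / len(records)

# Usage: one of the two sample records lacks a unit.
sample = [
    {"name": "Flue pipe", "unit": "pcs"},
    {"name": "Chimney brush", "unit": ""},
]
print(f"completeness: {completeness(sample, ['name', 'unit']):.0%}")  # completeness: 50%
```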
Key Points
- The design of central data management and the definition of the data generation and maintenance processes have led to high data quality.
- Redundancies and inconsistencies have been eliminated.
- A system of key figures in the management report provides information on data quality.
- Downstream processes reliant on up-to-date master data were accelerated by approx. 15%.
- Costs for data entry and correction were reduced, with a 30% time saving.
- The investment paid for itself within 2.5 years.