Master data has become company capital. Its high value is acknowledged across the company, and there is a growing wish to anchor data quality in the company strategy in the form of a Key Performance Indicator (KPI). The challenge is to measure and evaluate data quality, and to monitor it continuously.
But how is quality evaluated in concrete terms? How can areas with a need for optimization be identified? The key is the Data Quality Scorecard from Uniserv. It shows how well company data really supports analytical and operational processes, as well as other data-driven projects. Company master data is regularly checked for compliance with individually defined business rules. The result of this check is a Data Quality Score. Based on this score, rule-based optimization measures are selected and carried out. Negative trends in data quality can thus be identified at an early stage, and counter-measures implemented quickly – before any damage occurs.
The Uniserv Data Quality Scorecard is a fundamental aid to successful data quality management in your company. It delivers KPIs by regularly measuring the quality of your data against pre-defined business rules over a given period of time.
In contrast to simple monitoring at the level of individual data sets, the Data Quality Scorecard measures the entire data stock against individual business rules. The test results are aggregated at different levels and can be weighted per rule. The final result is the Data Quality Score: the KPI used to evaluate the quality level of the company's entire master data stock.
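To make the idea concrete – this is an illustrative sketch, not Uniserv's actual implementation – a weighted score over a set of hypothetical business rules (all names, weights, and sample records below are invented for the example) could be computed like this:

```python
# Sketch: aggregate per-record rule checks into one weighted
# Data Quality Score. Rules and weights are hypothetical examples.

def dq_score(records, rules):
    """Return the weighted share of passed rule checks, in percent.

    rules: list of (name, weight, check) where check(record) -> bool.
    """
    total_weight = sum(weight for _, weight, _ in rules) * len(records)
    passed_weight = sum(
        weight
        for record in records
        for _, weight, check in rules
        if check(record)
    )
    return 100.0 * passed_weight / total_weight

# Invented sample master data: one clean record, one flawed record.
customers = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "postcode": "75179"},
    {"name": "", "email": "invalid", "postcode": "75179"},
]

# Invented business rules; a higher weight makes a rule count more.
rules = [
    ("name_present", 2.0, lambda r: bool(r["name"])),
    ("email_has_at", 1.0, lambda r: "@" in r["email"]),
    ("postcode_len", 1.0, lambda r: len(r["postcode"]) == 5),
]

print(dq_score(customers, rules))  # prints 62.5
```

Weighting lets the business express that, say, a missing name is twice as serious as a malformed postcode, so the single KPI reflects business priorities rather than a flat pass count.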
But the Data Quality Scorecard offers more. Its drill-down functionality makes it possible to see exactly which rule, or which entity, has led to a weak Data Quality Score – and where the underlying weak points lie. This enables targeted measures for optimizing data quality to be established. The effect of those measures is then reflected in the next scorecard run.
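A drill-down of this kind can be sketched as a per-rule pass rate, sorted so the worst offenders surface first (again a hypothetical illustration, with invented rule names and sample data, not the product's API):

```python
# Sketch: drill down from the overall score to per-rule pass rates,
# so the rule dragging the score down is immediately visible.

def pass_rate_by_rule(records, rules):
    """Return {rule_name: pass rate in percent}, worst rule first.

    rules: list of (name, check) where check(record) -> bool.
    """
    rates = {
        name: 100.0 * sum(1 for r in records if check(r)) / len(records)
        for name, check in rules
    }
    return dict(sorted(rates.items(), key=lambda item: item[1]))

# Invented sample records and rules for the illustration.
customers = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "postcode": "75179"},
    {"name": "", "email": "invalid", "postcode": "75179"},
]
rules = [
    ("name_present", lambda r: bool(r["name"])),
    ("email_has_at", lambda r: "@" in r["email"]),
    ("postcode_len", lambda r: len(r["postcode"]) == 5),
]

for rule, rate in pass_rate_by_rule(customers, rules).items():
    print(f"{rule}: {rate:.0f}%")
```

The same breakdown could be grouped by entity (customer, supplier, material) instead of by rule, which is what makes targeted optimization measures possible.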
The decision about whether optimization measures are worthwhile therefore rests with the business, supported by IT. Whether, and where, software or other measures are needed can be decided on an objective basis. Data quality becomes transparent and controllable.