
Planning data integration - 5 steps to success


Our Data Integration Tool combines data quality and data integration functions into a comprehensive solution for data migration, data warehousing, master data management and data quality projects.


Uniserv Data Integration offers you numerous connectors for direct read and write access to databases, applications and files, as well as a user-friendly graphical editor for defining your data flows, transformation functions and connections to the proven Uniserv data quality tools. You integrate all of these elements into your processes simply by drag & drop. This makes it easy to implement complex processes in your data migration, data integration and data quality projects.
 

Automated data integration workflows through the integrated workbench


The central component of the Data Integration Tool is an integrated workbench that provides optimal support for defining automated data management processes. With the help of the graphical user interface, you can:
 

  • extract the descriptive metadata from the source and target systems
  • use drag & drop to insert and link the individual processing steps in the flow
  • define transformations of the data structures and formats wherever necessary
  • store the flow definition in the central repository
  • test and analyze your flow definition, eliminate any errors that may occur and optimize the flow
  • finally execute the job


Process descriptions created with the graphical user interface of the Data Integration Tool can be executed manually (ad hoc) or automatically via the integrated scheduling functionality: at any desired time, on a recurring basis (e.g. once a week or once a month), or in response to an event (e.g. a file being copied to a directory). A job can also be exported as an executable JAR file, so that its execution can easily be integrated into your own IT-supported process systems.
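As a rough illustration of that last point, the sketch below shows how an exported job might be started from your own Java-based process control. The JAR name, its parameter and the exit-code convention are assumptions made for this example, not the actual interface of the tool.

```java
import java.io.IOException;

// Hypothetical launcher that starts an exported integration job (customer-import-job.jar is an
// assumed file name) and reports whether it finished successfully.
public class JobLauncher {

    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "java", "-jar", "customer-import-job.jar",   // exported job JAR (assumed name)
                "--config", "crm-to-dwh.properties");        // assumed parameter, for illustration only
        pb.inheritIO();                                      // forward the job's console output

        Process job = pb.start();
        int exitCode = job.waitFor();

        if (exitCode == 0) {
            System.out.println("Integration job finished successfully.");
        } else {
            System.err.println("Integration job failed with exit code " + exitCode);
        }
    }
}
```

Wrapped in a cron entry or called from an enterprise scheduler, a launcher of this kind is typically all that is needed to embed the job in an existing batch chain.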
 

Flexible use of the Data Integration Tool


The Data Integration Tool is suitable both for local use at a single workstation and for company-wide handling of your data management tasks. In the client/server variant, jobs registered with the scheduler can be executed on any server in your company network on which a runtime environment for the Data Integration Tool is installed. The integrated load balancer automatically distributes new jobs to free server nodes and, if a server node fails, restarts the affected job on another server.

With the Data Integration Tool, we provide you with a contemporary system that helps you organize your data management processes comprehensively.
 

Features of the Uniserv Data Integration Tool


The following features make the Data Integration Tool indispensable in all your projects related to data warehousing, data migration and master data management:
 

  • user-friendly graphical editor for creating flow descriptions
  • a central repository where your flow descriptions are versioned and securely available for access by multiple users
  • access components for a wide range of databases, applications and file formats
  • easy integration of the created jobs into existing automated processes
  • a wide range of data quality and transformation functions that can be directly integrated into your process descriptions via drag & drop
  • a rich collection of control elements for the design of complex processes
  • high-performance execution through the generation of Java code
  • integrated load balancing and failover mechanisms in the client/server variant

The 5 phases of data integration


The consolidation of all relevant data in a central data warehouse or another business application makes it possible to gain important insights for successful corporate management. In addition, integrating all decisive information on a business partner is the basis for effective and targeted work in day-to-day operations. We show you in five steps how integration can be successfully realized for customer master data.
 

1. Initialization


A rough project plan defines where exactly, and by whom, the integrated data will later be used. In addition, the roles of the stakeholders are defined, and an initial overview of the data to be integrated and of the data management systems is drawn up together with the business units involved.
 

2. Concept


This is where the detailed procedure for the individual project phases is defined. For each data system to be integrated, a profile covering the data volume, technical specifications and the content of the data should be created, and data owners should be named. In addition, initial data quality criteria are developed that must be met for the data integration to succeed.
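To make such criteria tangible, the following sketch expresses a few hypothetical data quality rules for customer master data as simple named checks. The field names and rules are illustrative assumptions; in a real project they would be replaced by the criteria agreed with the data owners.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Predicate;

// Illustrative data quality criteria for a customer record, modelled as named checks.
public class QualityCriteria {

    // A customer record is reduced to a map of field name -> value for the sake of the example.
    static boolean notBlank(Map<String, String> record, String field) {
        String value = record.get(field);
        return value != null && !value.isBlank();
    }

    public static void main(String[] args) {
        Map<String, Predicate<Map<String, String>>> criteria = new LinkedHashMap<>();
        criteria.put("name is filled",        r -> notBlank(r, "name"));
        criteria.put("postal code is filled", r -> notBlank(r, "postalCode"));
        criteria.put("country is a 2-letter code",
                     r -> notBlank(r, "country") && r.get("country").length() == 2);

        // Invented sample record; the real values come from the profiled source system.
        Map<String, String> sample = Map.of("name", "Muster GmbH", "postalCode", "75179", "country", "DE");

        criteria.forEach((name, check) ->
                System.out.println(name + ": " + (check.test(sample) ? "fulfilled" : "violated")));
    }
}
```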
 

3. Data Cleansing Stream


This phase checks the extent to which the data quality of the source systems meets the requirements of the business. For this purpose, the data is extracted from the source systems, prepared to the required quality and temporarily stored in an initial staging area.
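A simplified sketch of this stream, with the staging area reduced to an in-memory list and only a handful of invented cleansing rules (trimming, case normalization); real projects would use the data quality functions of the tool instead.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

// Simplified cleansing step: normalize a few fields before the records reach the staging area.
public class CleansingStream {

    record Customer(String name, String email, String country) {}

    static Customer cleanse(Customer c) {
        return new Customer(
                c.name() == null ? "" : c.name().trim(),
                c.email() == null ? "" : c.email().trim().toLowerCase(Locale.ROOT),
                c.country() == null ? "" : c.country().trim().toUpperCase(Locale.ROOT));
    }

    public static void main(String[] args) {
        List<Customer> source = List.of(                       // records extracted from a source system
                new Customer("  Muster GmbH ", " Info@Muster.DE ", "de"),
                new Customer("Example Ltd", "sales@example.com", " gb "));

        List<Customer> staging = new ArrayList<>();            // stand-in for the initial staging area
        source.forEach(c -> staging.add(cleanse(c)));

        staging.forEach(System.out::println);
    }
}
```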
 

4. Integration Stream


At the latest at the beginning of the integration stream, it must be clear how the target application is designed and exactly which information is to be represented in it. All source systems must be harmonized with each other, structure and value mappings must be created, and the complex transformation rules must be specified. The goal: the "single view of customer".
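Value mappings are among the most concrete artifacts of this stream. The following sketch harmonizes salutation codes from two source systems to a common target code list; the systems, codes and target values are invented for illustration.

```java
import java.util.Map;

// Illustrative value mapping: different salutation codes from two source systems are
// harmonized to one target code list before the records are merged into the single customer view.
public class ValueMapping {

    static final Map<String, String> CRM_SALUTATION  = Map.of("1", "MR", "2", "MS", "3", "COMPANY");
    static final Map<String, String> SHOP_SALUTATION = Map.of("Herr", "MR", "Frau", "MS", "Firma", "COMPANY");

    static String mapSalutation(String sourceSystem, String sourceValue) {
        Map<String, String> mapping = "CRM".equals(sourceSystem) ? CRM_SALUTATION : SHOP_SALUTATION;
        return mapping.getOrDefault(sourceValue, "UNKNOWN");   // unmapped values are flagged, not silently dropped
    }

    public static void main(String[] args) {
        System.out.println(mapSalutation("CRM", "2"));       // MS
        System.out.println(mapSalutation("SHOP", "Firma"));  // COMPANY
    }
}
```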
 

5. Build, Test, Go Live


In the technical implementation of the data integration, the data is read from the source systems, loaded into a staging area and adapted to the quality requirements of the business areas using various transformation modules. Finally, the data is transferred to the target system.
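Conceptually, this phase is a fixed sequence of steps. The sketch below strings a few hypothetical transformation modules together in that order; the module logic is invented and merely stands in for the real transformation rules specified in the integration stream.

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Skeleton of the technical implementation: extract -> staging -> transformation modules -> target.
public class BuildAndLoad {

    public static void main(String[] args) {
        List<String> extracted = List.of("raw record 1", "raw record 2");   // read from the source systems
        List<String> staging   = List.copyOf(extracted);                    // loaded into the staging area

        // Transformation modules applied in sequence (contents are purely illustrative).
        List<UnaryOperator<String>> modules = List.of(
                r -> r.trim(),                       // basic normalization
                r -> r.toUpperCase(),                // format harmonization
                r -> r + " [validated]");            // data quality marker

        List<String> target = staging.stream()
                .map(r -> { for (var m : modules) r = m.apply(r); return r; })
                .toList();                           // finally transferred to the target system

        target.forEach(System.out::println);
    }
}
```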
 

Monitoring


So that all phases of the data integration can be tracked, it is advisable to set up monitoring. It covers the following areas: data extraction, compliance with the data quality, transformation and matching rules, and data loading into the target application.
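At its simplest, such monitoring is a reconciliation of record counts across the phases, as in the following sketch; the figures are purely illustrative.

```java
// Minimal monitoring sketch: reconcile record counts across the integration phases.
public class IntegrationMonitor {

    public static void main(String[] args) {
        long extracted = 10_000;   // records read from the source systems (illustrative figures)
        long rejected  = 120;      // records that violated data quality or matching rules
        long loaded    = 9_880;    // records written to the target application

        if (extracted == rejected + loaded) {
            System.out.println("Reconciliation OK: every extracted record is accounted for.");
        } else {
            System.err.printf("Reconciliation gap: %d record(s) unaccounted for.%n",
                    extracted - rejected - loaded);
        }
    }
}
```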

 

Data Quality Firewall – sustainable data quality assurance


To ensure high data quality in the continuous process of data integration, it is also advisable to implement a data quality firewall at the source system, the "point of entry".
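Such a firewall can be thought of as a validation step that runs before a record is accepted at the point of entry. The sketch below shows the idea with invented field names and rules; a real firewall would draw on the full range of Uniserv data quality checks instead of these simple patterns.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative "point of entry" check: reject or flag a record before it enters the source system.
public class DataQualityFirewall {

    static List<String> validate(String name, String email, String postalCode) {
        List<String> violations = new ArrayList<>();
        if (name == null || name.isBlank())                       violations.add("name is missing");
        if (email == null || !email.matches(".+@.+\\..+"))        violations.add("email looks invalid");
        if (postalCode == null || !postalCode.matches("\\d{5}"))  violations.add("postal code is not 5 digits");
        return violations;
    }

    public static void main(String[] args) {
        List<String> result = validate("Muster GmbH", "info@muster", "7517");
        if (result.isEmpty()) {
            System.out.println("Record accepted.");
        } else {
            System.out.println("Record rejected: " + result);
        }
    }
}
```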

 

Conclusion


A successfully implemented integration provides a uniform, high-quality database that enables companies to analyze data quickly and, for example, react to market trends in a timely manner.

First Time Right – High data quality from the beginning

 

Customer data is a valuable asset. A lot is invested in bringing it up to date. And then? Often nothing happens. Poor-quality data spreads further and further throughout the company and hinders the entire business operation.

Implement your data quality firewall – up-to-dateness, completeness and accuracy right from data entry. Download the Uniserv white paper on First Time Right now, free of charge. Perfect, high-quality data right from the start!


