Business Intelligence (B.I.), Digitalisation and Digital Transformation are just some of the buzzwords so commonly used today that it is sometimes difficult to work out what these terms actually mean.
First of all, Business Intelligence (B.I.) is a term from computer science, or more precisely business informatics, and describes procedures and processes for the systematic analysis of data in a company. Business intelligence is a technology-driven process that analyses data, prepares it and presents it to management in presentations or so-called dashboards as a basis for decision-making.
Business Intelligence usually comprises a variety of tools that enable companies to collect data from their I.T. systems or from external sources. Queries are derived from the requirements of management decisions, data sources are "tapped", so to speak, and the data is processed into reports, dashboards and data visualisations. These visualisations can range from financial information to transaction data from a value chain. Process key figures, so-called key performance indicators (KPIs), as well as data from competitor and market evaluations are of course also included.
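To make this concrete, the following is a minimal sketch of how transaction data from a source system could be condensed into a simple KPI for a dashboard. The file name and the column names (order_date, region, net_revenue) are assumptions made purely for illustration and do not refer to any specific BI product or ERP export.

```python
import pandas as pd

# Hypothetical export of transaction data from an ERP or logistics system.
# The column names (order_date, region, net_revenue) are assumed for this example.
transactions = pd.read_csv("erp_transactions.csv", parse_dates=["order_date"])

# Condense raw transactions into a simple KPI: monthly net revenue per region.
monthly_revenue = (
    transactions
    .assign(month=transactions["order_date"].dt.to_period("M"))
    .groupby(["month", "region"], as_index=False)["net_revenue"]
    .sum()
)

# This aggregated table is the kind of result a report or dashboard would visualise.
print(monthly_revenue.head())
```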
From our practical experience in consulting on the design of business intelligence systems and the creation of specifications for BI selection, we repeatedly find that essentially three reasons are given for a business intelligence system:
In many companies, the classic I.T. structures have developed over time into complex, often very different systems. These different corporate I.T. landscapes, from the classic ERP system to CRM systems and logistics software, make it difficult to implement a consistent application for management reporting due to the large number of interfaces.
The scope of a B.I. system can range from paper-based reports and lists generated in the classic way from legacy systems to modern cockpits and specialised tools for data analysis, projection and simulation.
We also repeatedly see specialised, PC-based evaluation tools in use at the level of individual employees. These systems are often based on the ideas of trainees and students and typically rely on small databases and spreadsheets. It goes without saying that it is difficult to check the quality of the calculations and the consistency of the data, especially with larger evaluations of this kind. It is equally clear that this cannot be the future, above all because such tools cannot provide reliable connections to ERP systems or product lifecycle systems. At this point, those responsible in companies quickly steer towards a professional solution in order to maintain or restore competitiveness.
The fundamental advantage that BI systems make possible is management decisions based on data.
Of course, the quality of the data sources is crucial. The poorer the source data, the poorer or less reliable the evaluation results, even if they are, as is often the case, nice to look at. The qualification of master data and of business processes is therefore part of every business intelligence initiative, even a simple presentation in a management dashboard. Data quality is a decisive lever in whether or not your investment in a BI tool pays off. In most cases, the data sources are the company's own ERP systems and/or article databases, whose contents must first be qualified.
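As an illustration of what "qualifying" master data can mean in practice, here is a minimal sketch of automated plausibility checks on an article master. The file name and the columns (article_no, description, weight_kg) are hypothetical and chosen only for this example.

```python
import pandas as pd

# Hypothetical article master exported from an ERP system.
articles = pd.read_csv("article_master.csv")

# A few typical data quality checks before the data feeds a BI evaluation:
issues = {
    # Duplicate article numbers point to inconsistent master data.
    "duplicate_article_no": int(articles["article_no"].duplicated().sum()),
    # Missing descriptions make reports hard to read.
    "missing_description": int(articles["description"].isna().sum()),
    # Implausible weights (zero or negative) distort logistics KPIs.
    "implausible_weight": int((articles["weight_kg"] <= 0).sum()),
}

for check, count in issues.items():
    print(f"{check}: {count} affected rows")
```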
A clear functional and technical architecture of the I.T. landscape is a mandatory prerequisite for a B.I. system that can be used throughout the company and gains the necessary user acceptance.
In a heterogeneously grown I.T. landscape, the total costs for Business Intelligence (BI) are usually higher than in a more homogeneous I.T. landscape, because coordination efforts, interface programming and interface monitoring are greater.
High data quality is the prerequisite for trust in the results of business intelligence software. Here, too, we repeatedly find that heterogeneous I.T. structures tend to be responsible for lower data quality.
However, data quality does not of course depend solely on the infrastructure and a heterogeneous I.T. landscape, but also on the people involved who enter, monitor or define this data. We therefore always try to avoid system breaks (i.e. the manual transfer of data by people from one I.T. system to another). Likewise, if data is entered manually, the system must check it for semantic plausibility and correctness at the point of entry. This prevents a system from receiving, processing and passing on data that was not correct to begin with. Turning incorrect data back into correct data in a system is usually a rather costly project.
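The following is a minimal sketch of what such a check at the point of data entry could look like. The record structure (customer number, name, e-mail, credit limit) and the rules are hypothetical examples, not a prescription for any particular system.

```python
import re

def validate_customer_entry(record: dict) -> list[str]:
    """Check a manually entered customer record before it is saved.

    Returns a list of error messages; an empty list means the entry is accepted.
    """
    errors = []

    # Required fields must not be empty.
    for field in ("customer_no", "name", "email"):
        if not record.get(field):
            errors.append(f"Missing required field: {field}")

    # Simple semantic check: the e-mail address must at least look like one.
    email = record.get("email", "")
    if email and not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", email):
        errors.append("E-mail address is not plausible")

    # Range check: a negative credit limit makes no business sense.
    if record.get("credit_limit", 0) < 0:
        errors.append("Credit limit must not be negative")

    return errors

# Example: this entry would be rejected before it can pollute downstream systems.
print(validate_customer_entry({"customer_no": "C-100", "name": "", "email": "invalid"}))
```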
In our experience, lack of transparency and individual, manual procedures for data evaluation lead to inefficiency and are cost-intensive. That is why we recommend that all our customers set up a set of rules for data quality, a so-called BI governance, in which the procedure for data generation, its further processing and its storage is regulated and thus comprehensible. Once this has been created, it only remains to implement it throughout the company.
What are the reasons for introducing a Business Intelligence (B.I.) system?