A Brief History of the Future of Business Intelligence

Business intelligence is twenty years old. Software publisher Business Objects helped democratize BI in the 1990s, before being acquired by German software giant SAP in 2007. Since then, many technology players have worked to rejuvenate BI by renaming it Data Discovery (Qliktech), Data Exploration (SAS), Analytics (Tableau, IBM, Oracle), Data Stories (Jolicharts), and so on.

Nevertheless, BI, which is still full of innovation potential, is undergoing a major transformation of the form we know. In the past it was mostly descriptive: relying on the corpus of data they were fed, BI tools clarified the sample assigned to them by grouping records into logical, synthetic views, chiefly for one use: reporting. However, the colossal market of descriptive analytics (roughly a hundred billion) is today threatened with obsolescence by a new kind of wave: that of players who rely on a winning triptych:

– Cloud computing turned into an essential utility, like electricity or highways; the revolution here is that start-ups have unlimited computing power from their inception. These start-ups are reinventing the manipulation of massive amounts of data, previously accessible only to behemoths too cumbersome to innovate;

– The Open Data movement, or rather "the exposure of third-party data", which goes beyond Open Gov to reach businesses and even civil society. Together they are rediscovering the concept of the ecosystem through the provision of microservices that make data available to third parties, whether for a fee or not: this is the API economy, and the social networks, or rather the GAFAs, are eager to play a leading role in it;

– Artificial intelligence, which revisits 30-year-old statistical algorithms such as random forests or neural networks, building on cloud computing capabilities that have become affordable, on this famous third-party data, and finally on the realization of the Cartesian project of quantifying the world: by placing sensors everywhere (smartphones, tablets, IoT, …), we allow the machine to learn faster and to continuously improve its relevance.

The Hadoop minotaur dismantled

We can say without pretense that the assembly of this triptych defines what Big Data is. And once we finally managed to dismantle the Hadoop minotaur, the contenders for the reinvention of BI could rely on the components of the deceased beast to resume the march forward by inventing predictive analytics. The latter simply means that the tools anticipate what the data will look like in a few seconds, a few minutes, a few hours, a few days, a few months. It is the myth of the Trojan prince Helenus, but in reality: by crossing internal signals with past and present external signals, the machine manages to approximate the future with a lower risk of error than an individual could achieve. Predictive analytics succeeds descriptive analytics by taking a big step forward!

Let’s stay in business: predictive analytics allows a point-of-sale manager to anticipate traffic in his store and adjust his staffing needs accordingly; predictive analytics makes it possible to estimate the number of employees who will be on sick leave the next day, from the history, the calendar, data on the spread of epidemics in the areas of interest, and so on. In short, predictive analytics will soon be everywhere, and that is a good thing, because it already sharpens our intuition, without being able to guarantee results by decreeing the laws of nature: predictive analytics helps us grow in our decision-making functions.
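To make the idea concrete, here is a minimal sketch of the kind of model described above, using the random forests the article mentions: internal signals (last week's traffic) are crossed with external ones (calendar, an external index such as epidemic spread). All data and parameters here are synthetic and purely illustrative, and the example assumes scikit-learn is available.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic training data: internal signal (last week's traffic) plus
# external signals (day of week, an external index such as epidemic spread).
n = 500
day_of_week = rng.integers(0, 7, n)
external_index = rng.random(n)
last_week_traffic = 100 + 20 * (day_of_week >= 5) + rng.normal(0, 5, n)
traffic = last_week_traffic + 30 * external_index + rng.normal(0, 5, n)

X = np.column_stack([day_of_week, external_index, last_week_traffic])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, traffic)

# Anticipate tomorrow's store traffic: a Saturday, high external index,
# 125 visitors the same day last week.
tomorrow = np.array([[5, 0.8, 125.0]])
prediction = model.predict(tomorrow)[0]
print(f"Expected visitors tomorrow: {prediction:.0f}")
```

The point is not the model itself but the workflow: the forecast feeds directly into a staffing decision, which is exactly the "decision-support" role the article assigns to predictive analytics.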

Prescriptive analytics beyond predictive analytics

Prescriptive analytics goes beyond predictive analytics: it not only informs decision-making, it even automates it. For example, the restaurateur whose artificial intelligence predicts 100 covers will see his delivery of fresh produce automatically adjusted to minimize food waste, preserving natural resources as well as operating margins. Or an artificial intelligence monitors networks of sensors in buildings to automatically control equipment, maximizing user comfort while optimizing energy and fluid expenditure. Prescriptive analytics is here today: all the ingredients are on the workbench, and thousands, even millions, of organizations write recipes with them every day. And who writes them? Data scientists, you know, the ones our companies fight over because they are better programmers than statisticians, and better mathematicians than developers.
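The restaurateur example above can be sketched in a few lines: the prescriptive step takes a prediction (100 covers) and turns it into an automatic action (an order quantity). The function name and all quantities below are hypothetical, chosen only to illustrate the prediction-to-action loop.

```python
def prescribe_order(predicted_covers: int, stock_kg: float,
                    kg_per_cover: float = 0.4,
                    safety_margin: float = 0.1) -> float:
    """Quantity of fresh produce (kg) to order automatically: cover the
    predicted demand plus a small safety margin, minus what is already in
    stock, without ever over-ordering (to minimize food waste)."""
    needed = predicted_covers * kg_per_cover * (1 + safety_margin)
    return round(max(needed - stock_kg, 0.0), 1)

# The AI predicts 100 covers; 12 kg of produce are already in stock.
print(prescribe_order(100, 12.0))  # → 32.0
```

The design choice worth noticing is the clamp at zero: when stock already exceeds predicted demand, the prescribed order is nothing, which is precisely how automation preserves both resources and operating margins.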

Cognitive analytics embodies a future in which machines and humans interact in symbiosis. To illustrate, consider the handover between an autonomous vehicle and its driver: research shows that this stage is complex, for example because the driver's senses and muscles are numbed, so this return loop puts the occupants and surrounding vehicles in danger, for instance during an emergency manoeuvre. Let's be clear: we are not there yet with cognitive intelligence. Remember, however, that cognitive analytics aims to enable the machine to understand and, in our case, to adapt its behaviour not only to human behaviour but also to human feelings and emotions. And in France, we have the ecosystem and the economic, political and technological dynamism needed to bring the future of BI, that is, cognitive analytics, from the science-fiction novel to the 8 p.m. news. We did it for descriptive analytics.
