Peering over the event horizon
Ram Menon
Friday, February 1, 2008
As human beings we intuitively understand the importance of events. But in an IT environment, understanding business events – and analysing what actions should follow from a particular train of events – is a new development. The logical pathways by which an application interprets meaning are still being formed, and our understanding of which activities should ensue is still growing.

A more complete understanding of the event stream comes at the intersection of the real-time event-driven architecture and the traditional world of business intelligence. It’s a border between two territories that we are only just beginning to cross, but it’s clear that business intelligence is going to play a key role in the predictive business. And it requires a completeness of vision and a level of usability that few solutions offer today.

While event processing vendors have concentrated on looking for recognized patterns in the real-time event stream, BI vendors have tended to focus on historical data – an approach often likened to driving a car by looking in the rear-view mirror. Bring those two worlds together, though, and you have a powerful new capacity for exploratory analysis in real time. Make that functionality easy enough for business users to pick up and play with, and, perhaps for the first time, you are handing the tools to the people who really understand what's happening, can react accordingly and can plan for what may happen next.

Traditional BI products have often been used as a data source, with business users running reports, then dumping the data into Excel for further analysis in an environment they feel comfortable with. Once business people are given an intuitive environment that quickly spots patterns and relationships, and learn more about the application they want to monitor, then they will be in a position to interact directly with the data and build the rules themselves.

The next generation of business management must focus on this vision: not just reacting to today's pressures, but understanding what's coming next and planning accordingly. There are fundamental flaws in the approaches businesses use to build applications that spot patterns in the events and activities of their customers, employees and suppliers. These stem, first, from an inability to combine the right historical data with real-time events and, second, from the fact that most BI tools lack the rich and rapid discovery capabilities business users need.

Next-generation business intelligence sits well with the power to predict because it can improve the quality and richness of what ought to be monitored, integrated and connected, and the patterns IT and business systems should be looking for. And when particular conditions show that things are happening, it can be used to discover and diagnose why they occurred and offer a path to corrective action. Used correctly, it can increase our understanding of what has happened, then feed back into the model to form a virtuous circle.

Fighting fraud


Take the example of a fraud application for a bank. A typical event-driven approach today monitors activity and, if it spots a set of activities that matches a known pattern of fraudulent behaviour, kicks off an exception process to discover manually whether the activity was in fact illegal. The trouble is that fraudsters quickly change their patterns of operating to stay ahead of this kind of detection. And as fraudulent activity increases, so does the number of instances requiring manual sorting at the end of the trading period, which quickly becomes prohibitive.
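The brittleness of that fixed-pattern approach can be sketched in a few lines. This is a minimal illustration, not a real fraud system – the event names and the pattern are hypothetical – but it shows why a hard-coded sequence is easy for a fraudster to sidestep: vary the sequence even slightly and nothing fires.

```python
from collections import deque

# Hypothetical event names and pattern, for illustration only: flag an
# account when a fixed, hard-coded sequence of activities appears in its
# event stream.
FRAUD_PATTERN = ("small_test_charge", "small_test_charge", "large_withdrawal")

def matches_fixed_pattern(events, pattern=FRAUD_PATTERN):
    """Return True if the pattern occurs as a contiguous run of events."""
    window = deque(maxlen=len(pattern))
    for event in events:
        window.append(event)
        if tuple(window) == pattern:
            return True  # would kick off a manual exception process
    return False

# Matches the known pattern...
matches_fixed_pattern(
    ["login", "small_test_charge", "small_test_charge", "large_withdrawal"])
# ...but a trivially varied sequence slips straight through.
matches_fixed_pattern(
    ["login", "small_test_charge", "large_withdrawal"])
```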
With a much deeper level of functionality, an application can go one step further. But that requires:

* Business intelligence that takes data and maps different relationships to spot particular scenarios
* An integration infrastructure where multiple sources of data can be pulled together, including traditional data warehouses
* A complex event processing architecture to look for sophisticated patterns of activities
* Business process management to kick off both machine- and human-oriented exception processes
* Rich analytics capabilities to empower end users to ask and answer any question.

Business intelligence plays a vital role in surfacing new insights, opportunities and risks across various data sources, then determining which of those models appear to be of value and, downstream of the application, pooling the data from multiple sources to see what has actually happened.

Analytic applications can also determine which scenarios are worth modelling and, by integrating many different data sources, come up with a set of hypotheses – relationships between data, and recurring patterns, that might indicate fraudulent activity. That analysis can be fed into a series of rules running in production, which are monitored as trading activity happens. When certain conditions are met, the system can trigger a number of exception processes. One of these might initiate a series of analysis sessions that examine particular situations to determine whether they were actually fraudulent, and so further inform the model.
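The feedback loop just described can be sketched as rules held as data rather than code: offline analysis distils hypotheses into conditions, each live event is evaluated against them, and any match hands off to an exception process. The rule conditions, event fields and the `on_exception` hand-off below are all hypothetical, a sketch under the assumption that rules arrive as simple threshold tests.

```python
# Hypothetical rules distilled from offline analysis, held as data so the
# analysis layer can add or retire them without redeploying the monitor.
rules = [
    {"name": "velocity", "test": lambda e: e["tx_per_min"] > 20},
    {"name": "geo_jump", "test": lambda e: e["km_from_last"] > 5000},
]

def on_exception(rule_name, event):
    # Stand-in for the BPM hand-off: queue the case for machine or
    # human review, which in turn feeds findings back into the model.
    print(f"exception [{rule_name}] account={event['account']}")

def monitor(event):
    """Evaluate every rule against one live event; fire matches."""
    fired = [rule["name"] for rule in rules if rule["test"](event)]
    for name in fired:
        on_exception(name, event)
    return fired

monitor({"account": "A-17", "tx_per_min": 42, "km_from_last": 12})
```

Keeping the rules as data is the design point: it is what lets the discovery side of the house revise what production is watching for without touching the monitoring code itself.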

Intelligent business


The trouble is that traditional business intelligence applications are rarely up to the job. They tend to rely on data warehouses or data cubes, with routines run nightly to crunch data so that it can be served up quickly, in supposedly on-line fashion, as preformatted reports.

That’s fine in a production reporting environment, where users drill into a particular path in the data, following logical drill-down paths. Using a sales example, the user might look at national sales, then regional, territory and rep sales, and finally drill into an individual’s performance. But if at any point in that analysis the user wanted to know what was affecting sales and to look at variables that hadn’t been prepared for them, they would need to go back to the IT department and ask for data to be added or a new data cube to be configured – in which case the user is held up for hours, if not days or weeks.

Traditional BI systems are architected this way because they assume that users are interested in a certain set of predefined variables and are looking at known issues and questions. But take this into a more complex environment, and users don’t necessarily know the questions they want to ask, nor the data sources they want to interrogate. They want to be able to pull different data sources into their analytic application on the fly and easily visualise the data that the system is presenting to them. And, needless to say, they want to do it fast. In addition, the predictive business wants to put these tools into the hands of business users in their day-to-day work without having recourse to the IT department whenever they want to add a new data source.

We’re entering a new era of pervasive business intelligence which will see BI everywhere as it relates to users – with experiences as simple and intuitive as using Google Earth or iTunes – and BI everywhere as it is built into processes, systems and applications. That’s a powerful combination.
