Archive for September, 2009

In the 1990s, very soon after the widespread expansion of standalone business intelligence (BI) software implementations, professionals began talking about extending this power to operational applications.  Operational applications were front-office, back-office, horizontal, ERP: they created purchase orders, instructed plant machines to start and stop, and automated customer communications.  All of these software applications that automated companies and made them run suddenly became the loci for future instantiations of BI.  And the promise was that once BI was integrated into the very business processes this software operationalized, the optimization and smarts would happen automatically.

The promise was not immediately realized.  At first, some believed it was because the operational software providers didn’t ‘know BI’.  Over time, however, these business process software modules came to include a spectrum of BI modalities: everything from hard-coded SQL, to high-science algorithms, to OEM’ing top BI platforms such as Business Objects or Cognos.  No one could say these tools had not stumbled on the ‘right’ way of integrating BI into their workflows.  Yet even today we have not seen the promise realized: operational software is helping companies act, but without organized, data-based intelligence beyond the pre-coded ‘rules’ of its human operators.

Have we finally gotten it right?  Indeed, after all this time, in discussing their 2009 BI Magic Quadrant, Gartner said:

“Areas that have traditionally been under corporate performance management (CPM), such as business planning and forecasting, are increasingly being embedded with BI capabilities. This, together with a trend of embedding analytics into business processes, will drive further investment in BI.”

Gartner went further, with a vision that by 2012:

“…business units will increase spending on packaged analytic applications, including corporate performance management (CPM), online marketing analytics and predictive analytics that optimize processes, not just report on them” (emphasis mine).

If we’re still not there after all this time, and the vision has been obvious for more than a decade, then there must be one or more significant barriers in the way.  Clearly the barrier is not simply technical.  My hypothesis is that the largest barriers are:

a) Managers and analysts like the idea of a black box giving them an answer, but not the idea that the answer will be used, without their intervention, to do their job for them.  Professionals would still rather type in the number of widgets to be ordered or destroyed, the amount to budget for x, y, or z, and the number of spin cycles to run the toothbrush vat.  Workers love the idea of BI laying out the answer for them, but it’s as if we still want to copy the right answer onto the test rather than letting the robot take the test for us.  Maybe that’s because it’s us, not the robot, who gets stuck with the grade.

b) Our communication culture at work, and a continuing dedication to transparency at a low level of grain, mean there is a very real chance the professional will have to answer the question ‘Why?’.  One can imagine being in a meeting and being asked: ‘We always order one truckload of tomatoes.  This week you ordered two, and we had so many left over.  Why did you do that?’  Imagining our answer as ‘I let the computer decide’ doesn’t have a nice ring to it.

What does work in these situations is having a crib sheet at hand, BI output that influences the worker to make better decisions.  And this is what some of the more innovative BI applications have migrated toward.  Applications are beginning to appear that support business processes.  They don’t hand you a BI palette and ask you to envision and create a picture; instead they say: you have to paint this room in your house, here are the questions you will have (or ought to have), and here are their answers.  Instead of giving you a servant asking ‘What should I do?’, they give you a vacuuming robot, a coffee-making robot, and a fire-starting robot.

An example of a software application with this approach is QuantiSense (fair warning: QuantiSense is a partner of Netezza).  QuantiSense invented what they call ‘Playbooks’.  Playbooks are simply the specific workflows that any retail merchandiser, planner, or allocator is, or should be, doing anyway.  A Playbook defines the workflow and the points within it that could be optimized if a BI process were inserted at exactly that point.  As a result, the regular analyst can run the ‘plays’ you might see on Monday Night Football, rather than something closer to a kid quarterback whose entire vision of the right play is ‘Go long…’.  This sitting of a standalone BI app next to the professional and their operational app, with the BI tool coaching them through a workflow rather than just handing them some output, seems to be the only method that will get today’s workers around the barriers that have plagued knowledge workers for a decade.
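QuantiSense doesn’t publish the internals of its Playbooks, but the idea of a workflow with BI recommendations inserted at defined decision points is easy to sketch.  Here is a minimal, hypothetical illustration in Python; the step names, the markdown rule, and the numbers are all my own assumptions, not QuantiSense’s actual design.

```python
# Hypothetical sketch of a "playbook": an ordinary merchandising workflow
# with a BI recommendation attached to specific decision points.
# Step names, the markdown rule, and all numbers are illustrative only.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Step:
    name: str
    action: Callable[[dict], dict]                      # the operational task itself
    recommend: Optional[Callable[[dict], str]] = None   # optional BI "coaching" at this point


def markdown_recommendation(ctx: dict) -> str:
    """Toy stand-in for a BI model: suggest a markdown based on weeks of supply."""
    weeks_of_supply = ctx["on_hand"] / max(ctx["weekly_sales"], 1)
    return "Take a 20% markdown" if weeks_of_supply > 8 else "Hold price"


def review_item(ctx: dict) -> dict:
    ctx["reviewed"] = True
    return ctx


playbook = [
    Step("Review slow sellers", review_item, markdown_recommendation),
    Step("Confirm and place order", lambda ctx: ctx),   # the analyst still makes the final call
]

context = {"on_hand": 900, "weekly_sales": 75}
for step in playbook:
    if step.recommend:
        print(f"{step.name}: BI suggests -> {step.recommend(context)}")
    context = step.action(context)
```

The point of the structure is that the recommendation never replaces the analyst’s step; it sits beside it, which is exactly the crib-sheet posture described above.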

Any lover of 1980s music (or anyone who listens to the radio) can work their way through this flow chart.  Guessing what’s going to happen tomorrow or next year looks a little more like this.  What they have in common is movement: iterations of events occurring in time, represented graphically.  For data visualization this continues to be a struggle: how can one show the impact of particular variables without a video?  There may be a long wait for improvements toward ‘the perfect dashboard’.

Why is this important?  One example: the Federal Stimulus Package of 2009 has a significant focus on workforce development, and the current approach might surprise someone not in the field.  The granularity in working with individuals is heartening: student information system users want to be able to watch students (in this case workers, often in a community college or other re-training program) longitudinally.  Imagine a graph auto-scrolling to the right, showing lines for income, education, hours worked per week, employment or other benefits received, and language skills; then add lines for macro events such as employment rates in that county, state, or country, and other data.  While a more quantitative analyst might want to group students together to look for trends or for leading and lagging indicators, the individual educators, program managers, Department of Labor professionals, and others want to be able to see time passing: a dashboard ‘case history’ or ‘life story’ of helping an individual succeed in transitioning from the old economy to the new, from an older life to a newer one.
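To make the ‘case history’ idea concrete, here is a small sketch of what such a longitudinal view might look like in code.  The data are entirely made up, and the chart library (matplotlib) and scaling choices are my own assumptions; the point is simply one individual’s indicators on a shared timeline, with a macro series overlaid.

```python
# Minimal sketch (illustrative data only) of the "case history" idea:
# one person's indicators plotted longitudinally, with a macro series overlaid.
import matplotlib.pyplot as plt

months = list(range(24))                               # two years of monthly observations
income = [1800 + 40 * m for m in months]               # hypothetical monthly income
hours_worked = [20 + m if m < 15 else 35 for m in months]
county_unemployment = [9.5 - 0.1 * m for m in months]  # macro context for the same period

fig, ax1 = plt.subplots(figsize=(9, 4))
ax1.plot(months, income, label="Monthly income ($)")
ax1.plot(months, [h * 50 for h in hours_worked], label="Hours/week (scaled)")
ax1.set_xlabel("Months since program entry")
ax1.set_ylabel("Individual indicators")

ax2 = ax1.twinx()                                      # second axis for the macro line
ax2.plot(months, county_unemployment, linestyle="--", color="gray",
         label="County unemployment (%)")
ax2.set_ylabel("County unemployment (%)")

ax1.legend(loc="upper left")
ax2.legend(loc="upper right")
plt.title("Hypothetical worker 'case history' with macro context")
plt.tight_layout()
plt.show()
```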

How will it look?  While there are amazing folks out there doing static knowledge visualization, this is different, and it’s harder to find folks doing it well.

– There’s nothing wrong with using the old clock metaphor.  We see clocks so many times that it’s surprising more analysts and presenters don’t leverage the icon burn-in.

– Traditional bar graphs can be functional, but again they force one to move their eyes and mind to the right to express the flow of time.  Tufte made this representation famous; while its time aspect is similar to bar graphs in which one imagines time progressing, we can also see relative changes in space and other variables over time.

– Proper respect goes to my old company MicroStrategy for using the latest Web 2.0 technologies to produce cool dashboards like this (including a longitudinal scroll bar of sorts in the upper right), which, because of their large partner base, are commonly seen in analytic software now.

– Imagine a static print view of the image seen in this fun video, at about 1 minute 8 seconds in, with its cascading, serial flow from the printers.  The cascade concept, whether here or elsewhere, leverages the natural burn-in from our experience with gravity and entropy: things tend to go down, and they tend to move from the start in whatever direction they can flow (in a graph, the y axis effectively becomes a wall, and moving across x is the only direction anything can flow).

– We’ll probably have to rely on manual scrolling for a while, as one can do at the bottom of this dashboard (see the sketch after this list for the basic idea).

– A really nice solution can be seen here at 2:15 in the video (even better with tracers at 3:20), which is brought to you by, guess who, Google.
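For the manual-scrolling idea mentioned in the list above, here is a rough sketch of a fixed-width window panned across a longer time series with a slider.  The data and the 60-day window are illustrative assumptions; any charting toolkit with a slider widget would do.

```python
# Sketch of the "manual scrolling" idea: a slider that pans a fixed-width
# window across a longer time series.  Data are made up for illustration.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.widgets import Slider

t = np.arange(0, 365)                        # a year of daily observations
series = np.cumsum(np.random.randn(t.size))  # hypothetical metric

window = 60                                  # days visible at once
fig, ax = plt.subplots()
plt.subplots_adjust(bottom=0.2)              # leave room for the slider
line, = ax.plot(t[:window], series[:window])
ax.set_xlim(t[0], t[window - 1])

slider_ax = plt.axes([0.15, 0.05, 0.7, 0.04])
slider = Slider(slider_ax, "Start day", 0, t.size - window, valinit=0, valstep=1)

def update(start):
    """Slide the visible window to begin at the chosen day."""
    start = int(start)
    line.set_data(t[start:start + window], series[start:start + window])
    ax.set_xlim(t[start], t[start + window - 1])
    ax.relim()
    ax.autoscale_view(scalex=False, scaley=True)
    fig.canvas.draw_idle()

slider.on_changed(update)
plt.show()
```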