Archive for the ‘BI’ Category

The Healthcare IT market is expected to grow at a CAGR of about 24% from 2012 to 2014. What other sector can boast such a rate in this economy? It is anticipated to be a $40 billion industry by the end of 2011. Why?

In a time when most spending is being frozen or sharply reduced, there can only be one reason for increased investment in HIT. It’s the same reason that has always driven strong spending: the expectation of a significant return on investment.

The urgency for significant HIT adoption is unavoidably clear:

  • The cost of healthcare is rising too fast for traditional containment approaches.
  • Even with healthcare reforms, too many people cannot afford health insurance.
  • Federal and state programs cannot absorb the increased cost of the uninsured in their existing aid programs.
  • Hospitalization costs continue to skyrocket.
  • While costs are rising, the quality of care is not.

While many debate how to implement HIT, virtually no one debates the need for its adoption throughout the health sector. The current system has been broken for quite some time and is still hemorrhaging billions of tax dollars.

The latest health information technologies hold the promise of a truly transformed future…a future that was impossible just a decade ago:

  • EHRs can reduce the costs of information management
    • Current, uniform patient data becomes accessible in real time wherever the patient is being treated.
    • Duplicate testing is minimized.
    • Analytics on data-rich patient information yields
      • better-informed care decisions
      • improved outcomes
      • lower treatment costs
    • Sophisticated analysis of massive patient databases yields
      • superior management of drug safety and effectiveness
      • rapid identification of expensive, less-likely-to-succeed treatment techniques, and more

These are just a few areas of improved care and reduced cost.

It’s not a question of whether we adopt state-of-the-art HIT, but of how aggressively we pursue deployment now.

Faculty researchers at Harvard Medical School (HMS), practicing in the Division of Pharmacoepidemiology and Pharmacoeconomics at Brigham & Women’s Hospital (BWH), have chosen Netezza’s TwinFin™ data warehouse appliance as their platform for advanced analytics. Their choice of this technology is especially important at a time when many other stakeholders in drug safety and effectiveness (DSE) are planning to upgrade technology. Harvard and BWH have been leaders in pharmacoepidemiological and pharmacoeconomic research since the 1990s. The lab chief, Dr. Jerry Avorn, is the well-known author of “Powerful Medicines: The Benefits, Risks and Costs of Prescription Drugs”.

Dr. Sebastian Schneeweiss is the Director for Drug Evaluation and Outcomes Research and Vice Chief of the Division of Pharmacoepidemiology and Pharmacoeconomics at Brigham and Women’s Hospital. The Harvard team of researchers is considered an industry bellwether. Here are some of the needs evaluated by the technical lead, Dr. Jeremy Rassen, and the division’s sophisticated data-mining faculty:

  • Computationally intense, rapid analysis of claims data—and, in the future, EHR data—that keeps pace with expanding data input
  • Capabilities for in-database analytics (a brief sketch of the idea follows this list)
  • Ability for accelerated testing of new algorithms
  • A system that facilitates automation of continuous drug safety and effectiveness monitoring
  • Simplicity of use that minimizes the need for IT support and database administration, which often becomes a bottleneck
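The in-database analytics item above deserves a concrete illustration. The following is a minimal sketch of the general pattern only, not the Harvard team’s actual workflow or any Netezza-specific API: the claims table, its columns, and the use of sqlite3 as a stand-in for an appliance connection are all assumptions. The point is that the aggregation runs where the data lives, and only a small summary travels back to the analyst.

```python
# A minimal sketch of the in-database analytics pattern: push the aggregation into
# the database with SQL instead of pulling patient-level rows back to the client.
# sqlite3 stands in for an appliance connection; the claims table and its columns
# are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (patient_id INTEGER, drug TEXT, adverse_event INTEGER);
INSERT INTO claims VALUES
  (1, 'drug_a', 0), (2, 'drug_a', 1), (3, 'drug_a', 0),
  (4, 'drug_b', 1), (5, 'drug_b', 1), (6, 'drug_b', 0);
""")

# The database does the heavy lifting; only a small summary travels back.
query = """
SELECT drug,
       COUNT(*)           AS patients,
       AVG(adverse_event) AS event_rate
FROM claims
GROUP BY drug
ORDER BY event_rate DESC;
"""
for drug, patients, event_rate in conn.execute(query):
    print(f"{drug}: n={patients}, adverse-event rate={event_rate:.2f}")
```

On an appliance holding billions of claims rows, the same pattern lets researchers test a new cohort definition or algorithm without first staging an extract, which is where much of the IT and DBA bottleneck normally arises.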

Simplicity of use is especially critical since other technologies often require significant setup and technical-support time, both of which can seriously delay the outflow of much-needed DSE information to groups involved in Health Economics and Outcomes Research (HEOR), Pharmacovigilance, and Epidemiology.

Dr. Schneeweiss is particularly interested in:

“…comparative safety and effectiveness of pharmaceuticals and biotech products, drug policy and risk management program evaluation, and epidemiologic methods using electronic healthcare databases.”

As such, he expects the use of Netezza technology will help expedite the delivery of timely DSE data and ultimately enhance the ability of care providers to act more quickly and effectively on behalf of patients.

We at Netezza are excited that our collaboration with these notable HMS faculty researchers has already led to leveraging IBM research and development efforts and existing products toward revolutionizing computational pharmacoepidemiology. Advanced research tools for pharmacoepidemiology carry with them the prospect of improved drug safety and effectiveness on a global scale.

Outcomes analysis is nothing new to healthcare payers. Like any other business, health insurers need to know whether their expenditures yield the best possible results for the health of policyholders and the ‘health’ of their ongoing corporate enterprise. Unlike other businesses, however, health payers operate in the volatile, reform-focused arena of medical care. Treatment decisions are, quite naturally, not in their hands but in the hands of physicians, whose diagnoses and applied responses to symptoms still vary dramatically. In addition, payers will now have to deal with new federal guidelines governing medical loss ratios (MLR):

“CMS is supposed to work with state insurance regulators to make sure that insurers spend at least 85% of the premiums collected from large groups, and at least 80% of the premiums collected from small groups and individual insureds, on medical costs.”
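Expressed as arithmetic, the rule is simple: medical costs paid divided by premiums collected must clear 85% for large-group business and 80% for small-group and individual business. A minimal sketch of that check follows; the thresholds come from the guideline above, while the dollar figures are invented purely for illustration.

```python
# Minimal sketch of the ACA medical loss ratio (MLR) thresholds described above.
# The threshold percentages come from the text; the premium and claims figures
# below are invented purely for illustration.
MLR_FLOOR = {"large_group": 0.85, "small_group": 0.80, "individual": 0.80}

def mlr_meets_floor(market: str, premiums: float, medical_costs: float) -> bool:
    """Return True if the share of premiums spent on medical costs clears the floor."""
    return (medical_costs / premiums) >= MLR_FLOOR[market]

# Hypothetical books of business:
print(mlr_meets_floor("large_group", premiums=100_000_000, medical_costs=87_500_000))  # True  (87.5% >= 85%)
print(mlr_meets_floor("individual", premiums=20_000_000, medical_costs=15_000_000))    # False (75% < 80%)
```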


Last year’s healthcare reform bill (ACA) included the formation of the Patient-Centered Outcomes Research Institute (PCORI). The new institute has a lofty mission:

“To assist decision makers by advancing the quality and relevance of evidence concerning the manner in which diseases, disorders, and other health conditions can effectively and appropriately be prevented, diagnosed, treated, monitored, and managed through research and evidence synthesis that considers variations in patient subpopulations, and the dissemination of research findings with respect to the relative health outcomes, clinical effectiveness, and appropriateness of the medical treatments, services, and other items”


Dr. Scott Ramsey of the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) points out that only very large organizations such as Kaiser Permanente and Geisinger Health can make, and have made, the significant investments required to carry out the sort of in-depth, IT-intensive outcomes analysis needed to match the mission. Indeed, for the majority of payers, some method of public/private funding to carry out outcomes research will be essential, and it is built into the plan:

“The law stipulates that PCORI will be funded initially through federal subsidy and over time by a mixture of federal funds and a tax on public and private health insurers. By 2014, available funds will equal $150 million plus an annual $2 fee per Medicare beneficiary transferred from the Medicare Trust Fund, and an annual $2 fee per covered life assessed on private health plans, adjusted for health expenditure inflation…”
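Reading that formula as arithmetic: a $150 million base, plus $2 per Medicare beneficiary from the Trust Fund, plus a $2 per-covered-life fee on private plans, with the fees adjusted for health-expenditure inflation. The sketch below is only a back-of-envelope reading; the enrollment counts and inflation factor are placeholder assumptions, not official figures.

```python
# Back-of-envelope reading of the PCORI funding formula quoted above.
# The $150M base and the $2 fees come from the quote; the enrollment counts and
# inflation factor are placeholder assumptions, not official figures.
def pcori_funding(medicare_beneficiaries: int,
                  private_covered_lives: int,
                  inflation_factor: float = 1.0) -> float:
    base = 150_000_000
    fee_per_person = 2.00 * inflation_factor  # the $2 fee, inflation-adjusted
    return base + fee_per_person * (medicare_beneficiaries + private_covered_lives)

# Hypothetical enrollments: ~47M Medicare beneficiaries, ~170M privately covered lives.
print(f"${pcori_funding(47_000_000, 170_000_000):,.0f}")  # roughly $584,000,000
```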


We know from an increasing number of public & private research partnerships (such as those between pharma and universities) that improved patient treatment outcomes are already emerging. But are there financial benefits to be gained by payers who pursue outcome studies? After all, private healthcare insurance must remain a profitable business if it is to remain at all.

The answer is yes. Performing outcomes analysis to find the providers with the most effective treatments gives payers a strong incentive to steer their insured clients to those providers. Doing so bolsters the economic assurance that payer MLRs will meet the guidelines set by the ACA while simultaneously delivering the most successful patient care. Higher rates of treatment success ultimately emerge in the form of reduced payer expenses. The mechanism for realizing such benefits rests in the application of the broad range of healthcare IT software and services that are gradually transforming virtually every aspect of delivering medical care.

We are in the midst of a complex merger…a merger that revolves around dramatic improvements in IT and analytics. These advances are being applied with increasing intensity, from drug safety and effectiveness to more personalized medicine using EHRs…from pharmacoeconomics to fraudulent-claims discovery, and in every other sector of healthcare performance.

Whether seen from the perspective of the patient, the physician/provider, or the payer, the superior treatment results that emerge from diverse, steadily pursued outcomes research can only yield benefits across the healthcare sector.

The expectation of a reformed healthcare system driven by new technology is firmly established.  The recently released IT Industry Business Confidence Index cites advances in healthcare as a key force behind the industry’s optimism.

The foundation for realizing IT-based reform is the adoption of EHRs. The CMS has an incentive program in place motivating caregivers to adopt certified EHR technologies. The CMS is by far the nation’s largest payer, distributing approximately $800 billion in benefits.

Dr. David Blumenthal, the outgoing National Coordinator for Health IT, says we have now “officially entered the age of ‘meaningful use’”. As of February 8, some 18,000 providers had registered to apply for the incentive program aimed at achieving ‘meaningful use’ status. Thousands more are expected to register throughout the year. Meeting the meaningful use criteria is a significant challenge, with some critics claiming the process is too complicated and others finding ambiguity in its directives.

According to the HHS, “meaningful use means providers need to show they’re using certified EHR technology in ways that can be measured significantly in quality and in quantity.”  In stage one (2011 & 2012) this means establishing a baseline for data capture and information sharing. The HHS is reaching out to providers to inform them of the detailed steps.

State-of-the-art healthcare IT, like all technologies capable of bringing about revolutionary change, requires much more than the simple acquisition of a new IT system. As Dr. Joel Berman, CMIO of Concord Hospital (NH) and someone already committed to the journey, explains:

“Other essential factors include unwavering senior administrative support, engaged clinical champions, dedicated physician and nurse informaticists, effective change management, familiarity with lean principles and practices, enlistment of patients, and commitment to rapid cycle improvement tools and techniques. Eighty percent of the challenges are about people, processes, psychology, and sociology; only 20 percent are about technology…”

As more and more providers commit to the technology and the process of change that comes with it, IT based healthcare reform gains significant momentum. There is no question that everyone wants the benefits that fully applied healthcare IT can ultimately bring:

  • Improved quality of patient care
  • Superior treatment outcomes
  • Increased efficiencies from enhanced data management
  • Cost reduction throughout the healthcare system
  • Anti-fraud management through advanced analytics

The only question is how rapidly providers will adopt the new healthcare IT and fully commit to achieving meaningful use. The indicators so far in 2011 point to steadily increasing commitments.

2010 was a lean year for new drug approvals. The FDA granted permission for only 21 new medications, below both the 2009 and 2008 levels, and some of the most eagerly awaited approvals have been delayed.

[Chart: annual new drug approvals. Courtesy of the WSJ.]

Many see the low numbers as a trend reflecting an increased focus on safety concerns, but the FDA states there has been “no systemic change…” in the process of granting approval. Nonetheless, change is most certainly underway when it comes to evaluating safety and effectiveness for proposed and existing medications. 2010 also saw a significant number of drug recalls by the FDA, and the agency used its enhanced authority to address issues of labeling and advertising throughout the year as well.

Armed with more legal power through the passage of the FDA Amendments Act (FDAAA) and moving towards improved data management technologies, the agency is targeting changes aimed at improving the drug approval process, responses to adverse events and other key safety segments with its Sentinel Initiative.

The FDA’s Deputy Commissioner, Dr. Joshua Sharfstein (a strong advocate of more stringent safety regulations who recently announced his departure from the agency), gave a significant update to the House Subcommittee on Health in spring 2010.

“…FDAAA requires the HHS Secretary to develop methods to obtain access to disparate data sources and to establish a postmarket risk identification and analysis system to link and analyze health care data from multiple sources. On May 22, 2008, FDA launched the Sentinel Initiative with the ultimate goal of creating and implementing the Sentinel System—a national, integrated, electronic system for monitoring medical product safety. The Sentinel System…will enable FDA to actively gather information about the postmarket safety and performance of its regulated products—a significant step forward from our current, primarily passive safety surveillance systems.   The law sets a goal of access to data from 25 million patients by July 1, 2010, and 100 million patients by July 1, 2012…”

These are lofty but achievable goals when the right data management technologies are brought to the task. The challenges are significant, but with almost half of the population taking at least one prescription drug, the value of transforming the FDA from an agency that reacts to adverse events into one that proactively prevents them cannot be overstated.

Obamacare Helps the Middle Man?

The Wall Street Journal reports that the ratings agency Fitch has increased the ratings of McKesson (MCK), Cardinal Health (CAH), and AmerisourceBergen (ABC). “Fitch Ratings announced on Thursday increased optimism about drug distributors, noting low debt levels and ongoing growth in the industry, which is expected to be further helped by the federal health-care overhaul.”

Fitch sees “moderate increases” in prescription volumes when Obamacare goes into effect in 2014, and concomitant leverage for MCK, ABC and CAH.  Like others, MCK is keenly aware of the disruptive power and shift in buying due to ObamaCare.  McKesson has been giving monthly updates on the Federal Stimulus, especially as it relates to McKesson’s Electronic Health Record offering.

McKesson has had a particularly good run over the last few months. Successes include winning the coveted Walmart Supplier of the Year Award, increasing its shareholder dividend by 50%, and posting increased revenue for the quarter ending June 30, 2010. McKesson cited strength in its Distribution Solutions Group, the same group that grabbed the Walmart Supplier of the Year Award. That Distribution Solutions Group is the one that will be able to leverage new buying contracts and volume when ObamaCare takes effect.

Consistent with the patient ‘land-grab’ happening among electronic health record providers, including Eclipsys, GE Health, Siemens, Cerner, Sage, PracticeFusion, and Epic, McKesson took a hit on profit margins as it chose to invest more in developing its EHR offering. McKesson, like the others, is keenly aware of, and caught up in, the race to position its software as uniquely designed to qualify providers as achieving ‘meaningful use’. Said its CEO:

“We are pleased that the Department of Health and Human Services released the final rules for meaningful use and certification standards of electronic health record (EHR) systems under the Medicare and Medicaid incentive programs,” said Hammergren. “Providers now have the flexibility to achieve meaningful use by taking different paths to implementing an EHR system based on their needs and priorities, which we believe is critical to broad-based adoption. We remain focused on working with our customers to make sure they have the right resources in place to qualify for stimulus money and improve the quality of care for their patients.”

The future seems to hold a nice mix for McKesson: new, as yet inefficient Rx markets in existing distribution channels, increased Rx volume, and an existing clinical solution driven by ObamaCare stimulus and regulation.  Maybe this time the ratings agencies are getting it right.

It’s an amazing day, the beginning of something great! Netezza Corporation’s (www.netezza.com) (NYSE: NZ) new Health & Life Sciences division is launching our voice on Facebook, Twitter, LinkedIn, and more. As the premier analysis engine for all healthcare-oriented data, Netezza will be a leading voice in what is now possible as we move toward truly predictive drugs, care, and outcomes.

My friend and colleague Bill Zanine, our Business Solutions Executive for Health & Life Sciences, will be the broadcast presence for where technology meets science meets business in his blog Rx for Analytics.  Bill has spent years as feet on the street and in the elite ivory towers of global healthcare data vision, provision, and utilization.

Where will you find Netezza this year, next year, and beyond?

  • When insurance payers are trying to identify fraud and abuse in order to pass those savings on to you, the consumer, and to do it before the fraudster is paid, not after, you’ll find Netezza.
  • When drug companies look to post-process analysis of gene sequencing to identify which drugs your personalized medicine profile contraindicates, you’ll find Netezza.
  • When providers want to query a centralized electronic health record for aggregate analytics on a vector, symptom, drug, or outcome, you’ll find Netezza.
  • When pharmaceutical companies want to write smarter contracts, distribute more effectively, and penetrate the opinion ‘cloud’ of influential doctors and academicians to get better drugs to you, you’ll find Netezza.

We can make your doctor smarter.  We can make the next drug better.  We can make your insurance cheaper.  We can help them cure cancer faster.  Netezza can do all this.  Netezza can.

In the introduction to Tom Peters’ new business book The Little Big Things, he mentions, perhaps unwittingly, a trend in what management consultants focused on by decade. He writes about being passionate about a number of things, including “…scintillating customer service (I pretty much had that “space” all to myself in the mid-1980s—believe it or not—“everybody” was doing quality, I was doing service)…”. This tiny mention speaks to a macro trend, one that supports the argument that our businesses will focus heavily on customer analytics in the coming decade. Where businesses have focused on product analytics in the past, tomorrow’s analytics will predominantly focus on the customer.

Consider the following assumption: what if management consultants were a leading indicator of, or even a causal factor in, how managers decided to analyze their businesses? (N.B. One can even ignore causality and say, ‘It doesn’t matter whether widely published and widely heard consultants and visionaries talking about a concept drive use of the concept, or whether the concept is right and its wide usage drives consultants to talk about it.’) The fact is that Peters marked a trend: the topic of the ’80s was, for the most part, quality. Perhaps the mass of consultants talking quality, a product attribute, drove companies to develop analytics around the product…but that implementation took ten years. Thus the 1990s was the decade of quality analytics. Fast forward to the 2000s. The 2000s were the decade of management consultants talking customers. Which means the 2010s will be the decade of customer analytics.

As an aside, if you want to know what analytics most companies will be doing in the 2020s, they’ll be about interactions. Today, innovation has only just begun in collecting huge amounts of interaction and social-networking data. The bleeding-edge ‘interactions companies’ (mostly Internet companies, since they have the easiest access to a paradigm that drives the most interactions per second and produces recordable data on those interactions) are just starting to think about how to analyze interactions as a marker of the health of their business and offerings. It follows that once the ‘interactions companies’ work out best practices by the middle of the 2010s, and those practices start to seep into leading-edge companies, by the 2020s you’ll see early- and late-majority companies investing heavily in recording and analyzing interactions. These interactions will be employee-to-employee, management-to-employee, customer-to-employee, customer-to-customer, influencer-to-customer, and on and on. More ought to be considered on this topic. And one should not forget that if you take Tom Peters’ passion for service as a proxy for interaction, you can see he’s about 20 years ahead of his peers….

Back to customer analytics: an interesting piece of anecdotal evidence showing the rising focus on customer analytics follows. Searching Amazon.com for books on the subject of ‘analytics’ shows steady growth by decade: 2,400 published in the 1980s, 4,000 in the 1990s, and 5,500 in the 2000s. However, change the search term to ‘customer analytics’ and it brings back 0 results for the 1980s, 1 result for the 1990s, and 109 results for the 2000s. The curve of acceleration for books mentioning or about customer analytics is exponential.

Assuming the focus on customers of the 2000s will lead to customer analytics in the 2010s, what does this mean for the workers, managers, and executives focused on using, investing in, or building these analytics? It means:

a) There’s a good chance you’ve had some exposure to customer analytics, although it likely failed, given that the majority of business intelligence projects fail—unless they make use of a purpose-built data warehouse appliance.

b) There’s a good chance that more of the dashboards and leading indicators you have been building include the customer as part of a key performance indicator, measure, or attribute.

c) The momentum toward predictive analytics, mixed with this emerging focus on customers, should be driving a rise in interest in behavioral economics…which it is, meaning you’ll be exposed to more service offerings and more interest in predicting customer behavior.

d) Your analytic technology will need to grow and perform with much more data. It’s a rare company that has fewer customers than products and stores. The business intelligence technologies of the past were about analyzing combinations of products and stores; it’s typical for the number of customers to greatly surpass that combination.

Many other events, trends, and factors will emerge from this focus on customer analytics. The chief driver of success will be to embrace it and be excellent at it, which means one’s methods of collection, strategy, and, most importantly, delivery will be paramount.

In the mid-1990s, hearing about someone with a 1-terabyte data warehouse (DWH) was a sort of mystical, illusory event, engendering doubt or even the suspicion of a ‘fish that got away’ story. The person telling the story was never the one who actually built the DWH; they were just exposed to it in some way, and they threw the story around as if it was nothing, loving the awed look on the faces of their audience. Invariably this would be someone from the Information Technology (IT) field, since business users would be unlikely to know, care, or be surprised that a very large amount of data is needed to answer their questions. So the IT person would also carelessly throw out a rejoinder such as, ‘You know, at that size, you can’t simply [insert technique IT people routinely perform on a “normal” large DWH].’

Fast forward a decade. Today, terabyte+ warehouses are common. However, one hears the same stories with one small difference: replace the word terabyte with petabyte. A petabyte, at 1,000 terabytes, is a seemingly unreachable stretch of data. However, as we all witness the increasing power of processing and the decreasing cost of storage, we are seeing enough examples of PB+ warehouses to say, “yesterday’s terabyte is today’s petabyte”.

Before you get a petabyte DWH, you need a petabyte of operational data. Only when a petabyte of data is present to ‘run’ your business can someone say ‘we need to analyze all this data’. Today’s petabyte-operational business is much more likely to be communication- or information-based. For example, AT&T reported one year ago that “AT&T currently carries about 16 petabytes of total IP and data traffic on an average business day”. (With the exponential growth in storable communication, presumably it’s on its way to doubling…) Other companies with petabyte businesses include Google, all the major telecommunications companies, and all the major web businesses—digital media and telecommunications. It’s nice to know that the exception that proves the rule is the PB+ data collection at the Large Hadron Collider.

At a recent conference, a member of the Facebook team revealed the accelerating growth of their DWH. They reported that in March 2008 they were collecting 200 gigabytes (GB) of data per day. By April 2009, they were collecting 2+ TB of data per day, and by October 2009, 4+ TB per day. If you chart this, you see something approaching a classic exponential curve. While Facebook reports its DWH is closing in on 5 PB today, by the time a reader absorbs this sentence, it has likely long surpassed that.
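A quick check of the arithmetic behind those three reported points (the dates and volumes are as cited above; the compounding calculation is the only thing added here):

```python
# Implied compound monthly growth in Facebook's reported daily collection volume,
# using the three figures cited above (terabytes per day).
def monthly_growth(v0: float, v1: float, months: int) -> float:
    """Compound monthly growth rate that takes v0 to v1 over `months` months."""
    return (v1 / v0) ** (1 / months) - 1

print(f"Mar 2008 -> Apr 2009: {monthly_growth(0.2, 2.0, 13):.1%} per month")  # ~19% per month
print(f"Apr 2009 -> Oct 2009: {monthly_growth(2.0, 4.0, 6):.1%} per month")   # ~12% per month
```

The percentage rate cools off, but because it compounds on an ever-larger base, the absolute daily volume keeps accelerating, which is what gives the chart its upward bend.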

Does this mean that in 2020 more than half of the Fortune 100 will have petabyte-size data warehouses? Probably not. However, they’ll all have TB+ warehouses, and a herd of businesses will be PB+:

• All large and mid-size digital media, social media, and web businesses
• Large and mid-size telecommunication firms, driven by their Call Detail Record databases
• Financial market companies (think of tracking all stock market transactions at microsecond granularity), and more and more brick-and-mortar companies (e.g. banks) that have done as little as dip a toe into financial markets, social media, streaming communication, and the like
• Large energy companies recording all seismic and atmospheric ‘communications’ down to a very specific latitude/longitude
• The energy grid will be getting close. It’s likely that cars will be talking to the grid to reduce congestion and to enable metered driving in the fast lane, so chances are the cars will be talking to each other as well, spitting out signals every second. Just like that, we’ve added another 100 million four-wheeled ‘people’ to our country communicating, and someone will want to analyze it.
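A rough back-of-envelope on that last bullet suggests why. The fleet size comes from the text; the message size and rate are assumptions made only for this sketch.

```python
# Rough, illustrative estimate of connected-car telemetry volume.
# The 100 million vehicles figure comes from the text; one message per second at
# roughly 200 bytes per message is an assumption made only for this sketch.
cars = 100_000_000
bytes_per_message = 200
seconds_per_day = 24 * 60 * 60

bytes_per_day = cars * bytes_per_message * seconds_per_day
petabytes_per_day = bytes_per_day / 1e15
print(f"~{petabytes_per_day:.1f} PB per day, ~{petabytes_per_day * 365 / 1000:.2f} EB per year")
```

Even under these modest assumptions, a year or two of retained telemetry lands in exabyte territory, which is the scale the next sentence alludes to.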

And, you know, when your car’s antenna is a source for an exabyte data warehouse, you can’t just change the wiper blades, you have to……

In the 1990s, very soon after the widespread expansion of standalone business intelligence (BI) software implementations, professionals began talking about extending this power to operational applications. Operational applications were front-office, back-office, horizontal, ERP—they created purchase orders, instructed plant machines to start and stop, and automated customer communications. All of the software applications that automated companies and made them run suddenly became the loci for future instantiations of BI. And the promise was that once BI was integrated into the very business processes operationalized by this software, the optimization and smarts would happen automatically.

The promise was not immediately realized. At first, some believed it was because the operational software providers didn’t ‘know BI’. Over time, however, these business-process software modules came to include a spectrum of BI modalities: everything from hard-coded SQL, to high-science algorithms, to OEM’ing top BI platforms such as Business Objects or Cognos. No one could say these tools had not stumbled on the ‘right’ way of integrating BI into their workflow. Even today, we have yet to see that latent promise fulfilled: operational software is helping companies act, but without organized, data-based intelligence beyond the pre-coded ‘rules’ of the human operators.

Have we finally gotten it right?  Indeed, after all this time, Gartner in 2009 said in discussing their BI Magic Quadrant:

“Areas that have traditionally been under corporate performance management (CPM), such as business planning and forecasting, are increasingly being embedded with BI capabilities. This, together with a trend of embedding analytics into business processes, will drive further investment in BI.”

Gartner went further, with a vision that by 2012:

“…business units will increase spending on packaged analytic applications, including corporate performance management (CPM), online marketing analytics and predictive analytics that optimize processes, not just report on them” (emphasis mine).

If we’re still not there after all this time, and the vision was obvious more than a decade ago, then there must be one or more significant barriers in the way. Clearly the barrier is not simply technical. My hypothesis is that the largest barriers are:

a) Managers and analysts like the idea of a black box giving them an answer, but not the idea that the answer will be used—without their intervention—to do their job for them. Professionals would still rather type in the number of widgets to be ordered or destroyed, the amount to budget for x, y, or z, and the number of spin cycles to run the toothbrush vat. Workers love the idea of BI laying out the answer for them, but it’s as if we still want to copy the right answer onto the test rather than letting the robot take the test for us. Maybe it’s because it’s us—not the robot—who gets stuck with the grade.

b) Our communication culture at work, and a continuing dedication to transparency at a fine level of grain, mean the chance that the professional will have to answer the question ‘Why?’ is very real. One can imagine being in a meeting and being asked: ‘We always order one truckload of tomatoes. This week you ordered two. And we had so many left over. Why did you do that?’ Imagining our answer as ‘I let the computer decide’ doesn’t have a nice ring to it.

What does work in these situations is having a crib sheet at hand—BI output—that influences the worker to make better decisions. And this is what some of the more innovative BI applications have migrated toward. Applications are beginning to appear that support business processes. They don’t hand you a BI palette and ask you to envision and create a picture; instead they say: you have to paint this room in your house, and here are the questions you will have, or ought to have, and their answers. Instead of giving you a servant asking ‘What should I do?’, they give you a vacuuming robot, a coffee-making robot, and a fire-starting robot.

An example of a software application that takes this approach is QuantiSense (fair warning: QuantiSense is a partner of Netezza). QuantiSense invented what they call ‘Playbooks’. Playbooks are simply specific workflows that any retail merchandiser, planner, or allocator is, or should be, carrying out anyway. The Playbooks define the workflow, the process within it, and the points at which it could be optimized if a BI process were inserted at just that spot. As a result, the regular analyst can run the kinds of ‘plays’ you might see on Monday Night Football, rather than something closer to a kid quarterback whose entire vision of the right play is ‘Go long…’. Sitting a standalone BI app next to the professional and their operational app, with the BI tool coaching them through a workflow rather than simply handing them output, seems to be the only method that will get today’s workers around the barriers that have plagued knowledge workers for a decade.
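QuantiSense’s actual Playbooks are proprietary, so the sketch below is only a generic illustration of the pattern described above: a predefined workflow whose steps name the questions the analyst should be asking, with an analytic routine attached at the points where one helps. The step names, the ‘slow seller’ scenario, and the model are all hypothetical.

```python
# Generic illustration of a 'playbook'-style workflow: predefined steps for a retail
# allocator, each of which can have an analytic routine attached at the point where
# it helps. This is a hypothetical sketch of the pattern, not QuantiSense's product.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class PlayStep:
    question: str                                  # what the analyst should be asking
    analytic: Optional[Callable[[], str]] = None   # BI routine inserted at this step

def markdown_timing_model() -> str:
    # Placeholder for a real model; it simply returns a canned recommendation here.
    return "take the first markdown in week 6 at 20%, based on sell-through trend"

slow_seller_playbook = [
    PlayStep("Which styles are selling below plan?"),
    PlayStep("Is the shortfall chain-wide or concentrated in a few stores?"),
    PlayStep("When and how deep should the markdown be?", analytic=markdown_timing_model),
]

for step in slow_seller_playbook:
    answer = step.analytic() if step.analytic else "(analyst judgment)"
    print(f"{step.question} -> {answer}")
```

The design point is that the worker still makes the call; the BI output arrives as a coached answer inside a step they were already going to perform.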