Archive for the ‘Data Mining’ Category

The Healthcare IT market is expected to grow at a CAGR of about 24% from 2012 to 2014. What other sectors can boast such a rate in this economy? It is anticipated to be a $40 billion industry by the end of 2011. Why?

In a time when most spending is being frozen or sharply reduced there can only be one reason for increased investment in HIT. It’s the same reason that has always driven strong spending: the expectation of a significant return on investment.

The urgency for significant HIT adoption is unavoidably clear:

  • The cost of healthcare is rising too fast for traditional containment approaches.
  • Even with healthcare reforms, too many people cannot afford health insurance.
  • Federal and state programs cannot absorb the increased cost of the uninsured in their existing aid programs.
  • Hospitalization costs continue to skyrocket.
  • While costs are rising, the quality of care is not.

While many debate how to implement HIT, virtually no one debates the need for its adoption throughout the health sector. The current system has been broken for quite some time and is still hemorrhaging billions of tax dollars.

The latest health information technologies hold the promise of a truly transformed future…a future that was previously impossible just a decade ago:

  • EHRs can reduce the cost of information management
    • Current, uniform patient data becomes accessible in real-time wherever the patient is being treated.
    • Duplicate testing is minimized
    • Analytics on data-rich patient information yields
      • better informed care decisions
      • improved outcomes
      • lower treatment costs
    • Sophisticated analysis of massive patient databases yields
      • Superior management of drug safety and effectiveness
      • Rapid identification of expensive, less-likely-to-succeed treatment techniques, and more

These are just a few areas of improved care and reduced cost.

It’s not a question of whether we adopt state-of-the-art HIT, but of how aggressively we pursue deployment now.

Faculty-researchers at Harvard Medical School (HMS) practicing in the Brigham & Women’s Hospital (BWH) Division of Pharmacoepidemiology and Pharmacoeconomics have chosen Netezza’s TwinFin™ data warehouse appliance as their platform for advanced analytics. Their choice of this technology is especially important at a time when many other stakeholders in drug safety and effectiveness (DSE) are planning to upgrade technology. Harvard and BWH have been leaders in pharmacoepidemiological & pharmacoeconomic research since the 1990s. The lab chief, Dr. Jerry Avorn, is the well-known author of “Powerful Medicines: The Benefits, Risks and Costs of Prescription Drugs”.

Dr. Sebastian Schneeweiss is the Director for Drug Evaluation and Outcomes Research and Vice Chief of the Division of Pharmacoepidemiology and Pharmacoeconomics at Brigham and Women’s Hospital. The Harvard team of researchers is considered an industry bellwether. Here are some of the needs evaluated by the technical lead, Dr. Jeremy Rassen, and the sophisticated data mining faculty:

  • Computationally intense rapid analysis of claims data—and, in the future, EHR data—that keeps pace with expanding data input
  • Capabilities for in-database analytics
  • Ability for accelerated testing of new algorithms
  • A system that facilitates automation of continuous drug safety and effectiveness monitoring
  • Simplicity of use that minimizes the often bottlenecking need for IT support and database administration

Simplicity of use is especially critical since other technologies often require significant setup and technical support time, both of which can seriously delay the outflow of much-needed DSE information to groups involved in Health Economics and Outcomes Research (HEOR), Pharmacovigilance, and Epidemiology.

Dr. Schneeweiss is particularly interested in:

“…comparative safety and effectiveness of pharmaceuticals and biotech products, drug policy and risk management program evaluation, and epidemiologic methods using electronic healthcare databases.”

As such, he expects the use of Netezza technology will help expedite the delivery of timely DSE data and ultimately enhance the ability of care providers to act more quickly and effectively on behalf of patients.

We at Netezza are excited that our collaboration with these notable HMS faculty-researchers has already led to leveraging IBM research and development efforts and existing products toward revolutionizing computational pharmacoepidemiology. Advanced research tools for pharmacoepidemiology carry with them the prospect of improved Drug Safety and Effectiveness on a global scale.

Outcomes Analysis is nothing new to healthcare payers. Like any other business, health insurers need to know whether their expenditures yield the best possible results for the health of policy holders and the ‘health’ of their ongoing corporate enterprise. Unlike other businesses, however, health payers are operating in the volatile, reform-focused arena of medical care. Treatment decisions are quite naturally not in their hands but in the hands of physicians whose diagnoses and applied responses to symptoms still vary dramatically. In addition, payers will now have to deal with new federal guidelines governing medical loss ratios (MLR):

“CMS is supposed to work with state insurance regulators to make sure that insurers spend at least 85% of the premiums collected from large groups and at least 80% of the premiums collected from small groups and individual insureds on medical costs.”
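As a rough illustration of the arithmetic behind that guideline, the sketch below computes a medical loss ratio and checks it against the 85%/80% thresholds. The function name and dollar figures are hypothetical, chosen only to make the example concrete.

    # Minimal sketch of the medical loss ratio (MLR) arithmetic described above.
    # The premium and claims figures below are hypothetical, for illustration only.

    def meets_mlr_threshold(medical_costs, premiums, large_group=True):
        """Return the MLR and whether it meets the ACA guideline (85% large group, 80% small/individual)."""
        mlr = medical_costs / premiums
        threshold = 0.85 if large_group else 0.80
        return mlr, mlr >= threshold

    # Example: a large-group plan collecting $200M in premiums and paying $172M in medical claims
    mlr, compliant = meets_mlr_threshold(172_000_000, 200_000_000, large_group=True)
    print(f"MLR = {mlr:.1%}, meets the 85% guideline: {compliant}")   # MLR = 86.0% -> True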


Last year’s healthcare reform bill (ACA) included the formation of the Patient Centered Outcomes Research Institute (PCORI). The new institute has a lofty mission:

“To assist decision makers by advancing the quality and relevance of evidence concerning the manner in which diseases, disorders, and other health conditions can effectively and appropriately be prevented, diagnosed, treated, monitored, and managed through research and evidence synthesis that considers variations in patient subpopulations, and the dissemination of research findings with respect to the relative health outcomes, clinical effectiveness, and appropriateness of the medical treatments, services, and other items”


Dr. Scott Ramsey of the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) points out that only very large organizations such as Kaiser Permanente & Geisinger Health can make, and have made, the significant investments required to carry out the sort of in-depth, IT-intensive outcomes analysis needed to match the mission. Indeed, for the majority of payers some method of public/private funding to carry out outcomes research will be essential, and it is built into the plan:

“The law stipulates that PCORI will be funded initially through federal subsidy and over time by a mixture of federal funds and a tax on public and private health insurers. By 2014, available funds will equal $150 million plus an annual $2 fee per Medicare beneficiary transferred from the Medicare Trust Fund, and an annual $2 fee per covered life assessed on private health plans adjusted for health expenditure inflation…”


We know from an increasing number of public & private research partnerships (such as those between pharma and universities) that improved patient treatment outcomes are already emerging. But are there financial benefits to be gained by payers who pursue outcome studies? After all, private healthcare insurance must remain a profitable business if it is to remain at all.

The answer is yes. Performing outcomes analysis to find the providers with the most effective treatments gives payers a strong incentive to migrate their insured clients to those providers. Doing so bolsters the economic assurance that payer MLRs will meet the guidelines set by the ACA while simultaneously delivering the most successful patient care. Higher rates of treatment success ultimately emerge in the form of reduced payer expenses. The mechanism for realizing such benefits rests in the application of the broad range of healthcare IT software and services that are gradually transforming virtually every aspect of delivering medical care.

We are in the midst of a complex merger…a merger which revolves around dramatic improvements in IT and analytics. These advances are being applied with increased intensity, from drug safety and effectiveness to more personalized medicine using EHRs…from pharmacoeconomics to fraudulent claims discovery and in every other sector of healthcare performance.

Whether seen from the perspective of the patient, the physician/provider or the payer, superior treatment results that emerge from diverse, steadily pursued outcomes research can only result in benefits for all aspects of the healthcare sector.

A December interview in The Economist with IBM’s Global Director of Healthcare Marketing, Andrea Cotter, highlighted the pivotal role of EHRs in transforming healthcare. It comes at the same time as a study this month showing better financial performance for providers using EHRs.


Ms. Cotter explains: “…access to critical health information must be simplified, streamlined and automated to reduce costs and improve service. Electronic health records are the foundation of this transformation, the basic building blocks of health-care efficiency. When standardised and shared, EHRs provide a powerful means of increasing accuracy and speeding the delivery of patient information to the point of care. They enable stronger collaboration, more complete records and better service. And they serve as the enabler of other health-care IT, such as analytics and predictive modeling…”


The current healthcare structure is debilitated by fragmented data residing in uncoordinated and inefficient systems. These systems cannot integrate to assist caregivers in their efforts to deliver improved treatment or more efficient cost management. Instead, these disconnected segments in the present structure can become breeding grounds for fraudulent billing schemes, waste, duplication and error at multiple levels.

The good news is that collecting, storing and analyzing high quality patient data is now a fully achievable task and the crucial process of developing/deploying a unified system for sharing the data is well underway. Connected medicine, an impossibility a decade ago, is now healthcare’s imminent destination. To get there the system needs to:

  • Fully digitize patient data and create EHRs
  • Develop a universal vocabulary for data exchange
  • Standardize methods for sharing & protecting the data
  • Apply state of the art analytics for interpreting data

As the digitizing of patient data comes together in detailed EHRs and a common language for health information exchange is developed, a new and encouraging view of healthcare comes into focus.

More and more significant collaborative efforts are following examples like that of UPMC’s $8 billion global health enterprise and its Center for Connected Medicine. The corrective potential of a fully integrated healthcare system is enormous. With patient-care at its core, connected medicine holds the promise of redefining how treatment is managed, delivered & advanced, all within a cost efficient structure.

It’s been a year of steady and encouraging progress in the fight against fraud and waste in the healthcare sector. The Inspector General’s office reports an expectation of significant total savings in many sectors for fiscal 2010, with a noteworthy $3.8 billion coming from investigative receivables.


“…We are particularly encouraged by the success of our partnerships with HHS and the Department of Justice through the Health Care Fraud Prevention and Enforcement Action Team (HEAT),” Inspector General Daniel Levinson said in a news release.


Use of advanced predictive analytics in preventing fraud & waste is gaining a prominent role in the battle. At a recent healthcare fraud prevention summit, HHS Secretary Kathleen Sebelius and Attorney General Eric Holder announced that CMS will be acquiring new analytic tools. CMS is soliciting “state of the art, fraud-fighting analytic tools to help the agency predict and prevent potentially wasteful, abusive, or fraudulent payments before they occur.” It’s the “before they occur” aspect of the analytics that brings the greatest potential for arresting and reducing the sharply rising costs in the current healthcare system.

CMS, with new and expanded authority, will be able to take anti-fraud action before a claim is paid.


“By using new predictive modeling analytic tools we are better able to expand our efforts to save the millions – and possibly billions – of dollars wasted on waste, fraud and abuse,” said CMS Administrator Donald Berwick, M.D.


Customizable analytics software & services such as IBM’s Fraud and Abuse Management System (FAMS) are providing the state-of-the-art solutions now needed. FAMS is capable of sorting through information on tens of thousands of providers and tens of millions of claims in a matter of minutes…creating suspicion indices using almost 1,000 behavioral patterns across a wide range of specialties. The highly customizable system yields rapid results as the analytic modeling tools reveal potentially fraudulent activity, waste, mismanagement of funds and other sources of loss. The software, tools and services needed to combat fraud and plug many other fiscal leaks in our ailing healthcare system are ready for frontline deployment.
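FAMS’s internals are proprietary, but the general idea behind a suspicion index can be sketched: measure each provider’s billing behavior, compare it to peers in the same specialty, and flag the statistical outliers for investigation. The toy example below (hypothetical providers, invented metrics, simple peer z-scores) illustrates only that general pattern, not IBM’s actual implementation.

    # Simplified, hypothetical sketch of a "suspicion index": compare each provider's
    # billing behavior to peers in the same specialty and surface statistical outliers.
    # This illustrates the general approach only, not IBM's FAMS implementation.
    from statistics import mean, pstdev

    # Hypothetical per-provider measures: specialty, claims per patient, average billed per claim
    providers = {
        "P001": ("cardiology", 4.1, 310.0),
        "P002": ("cardiology", 3.8, 295.0),
        "P003": ("cardiology", 9.7, 880.0),   # unusually high on both measures
        "P004": ("cardiology", 4.4, 325.0),
    }

    def suspicion_scores(providers):
        scores = {}
        by_specialty = {}
        # Group providers so each one is compared only to peers in its own specialty
        for pid, (spec, *metrics) in providers.items():
            by_specialty.setdefault(spec, []).append((pid, metrics))
        for members in by_specialty.values():
            for i in range(len(members[0][1])):
                values = [m[1][i] for m in members]
                mu, sigma = mean(values), pstdev(values) or 1.0
                for pid, metrics in members:
                    # Crude suspicion index: sum of peer z-scores across the behavioral measures
                    scores[pid] = scores.get(pid, 0.0) + (metrics[i] - mu) / sigma
        return scores

    for pid, score in sorted(suspicion_scores(providers).items(), key=lambda kv: -kv[1]):
        print(pid, round(score, 2))   # P003 surfaces at the top for manual review

A production system would of course combine hundreds of behavioral patterns, richer claim detail and specialty-specific peer groups; the z-score sum here is only the simplest possible stand-in.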

The nascent trend of five years ago is rapidly becoming the model of today. More & more pharma research is focused on joining forces with universities. The rationale is simple and brilliant: with the ever-escalating costs of R&D and the ‘patent cliff’ fast approaching, the merger of resources is a natural wellspring of mutual benefits.

Big pharma needs new drug discoveries and more cost-effective ways of discovering them. 80% of all FDA-approved drugs have generic counterparts, according to the 2010 Kaiser Foundation report on prescription drug trends. Add to this the fast-approaching edge of the “patent cliff” (2011-2015), when dozens of brand-name drugs go off patent, including six of the ten largest medicines in the U.S., and you get a good idea of the challenges facing the ‘business’ of big pharma. Viagra, Actos, Symbicort, Crestor and Avandia are just a few major brands to face generic competition soon.

Partnering with universities will offer big pharma alternate approaches to their ongoing research…access to new and experimental technologies, creative thought processes indigenous to the university environment…more cost-efficient continued development of in-licensed drug candidates…fresh stimulus for stalled projects…the potential of discovering multiple new applications for existing drugs…all in an arena that could offer new, more rapid research platforms for the discovery and release of better medicines.

In this win-win collaboration, universities will be able to analyze pharma’s extensive and diverse data…and data is what it’s all about in the research and development of new drugs. An example of this trend is Sanofi’s recent announcement that it will collaborate with Harvard on diabetes and cancer research. As pharma gleans new & improved information from institutional partners, so too do those institutions gain precious access to pharma’s previously locked treasure chest of health science research.

It’s a natural collaboration, taking place on a global scale. A marriage of necessity expected to bring forth a new generation of blockbuster progeny.

The most populated country in the world is fully engaging the issues of healthcare reform. The cornerstone of the newly emerging system is patient data. The organization, storage and management of health records will ultimately maximize the benefits of patient care and cost efficiency.
The WSJ underscores the importance: “China’s health-care IT market will see remarkable growth in the next five years, triggered partly by China’s three-year health-care reform program,” said Janet Chiew, analyst for research firm IDC. IDC estimates the market will reach $2.4 billion in 2013 and grow at an average of 19.9% per year.

Data storage and analysis capabilities will be further driven as the influx of new medical devices and diagnostic equipment continues to surge ahead. According to one report, China’s overall medical equipment market is expected to double between now and 2015 to reach over $53 billion.  This steady increase of new medical equipment will generate massive amounts of new patient data and a concurrent need for real-time access. It is not inconceivable that patient data could more than double in a decade.

Before the information from new diagnostic equipment can be transformed into cost-saving health-care analytics, the existing systems of paper-based patient records must become electronic records. This is the first step toward eliminating expensive redundancies and delays in gathering patient data. Database technology and storage solutions from IBM and others are already being deployed throughout China and have been for some time.

In Guangdong province, a group of high volume hospitals are implementing a program called CHAS, or Clinical and Health Records Analytics and Sharing. One such hospital focused on traditional Chinese medicine has more than 10,000 patient visits per day. Deployment of the new health-care analytics technology in this hospital is expected by the year-end.

China’s health-care reform offers data storage and analysis technology its largest opportunity yet for developing and deploying solutions…solutions that will have global impact on reducing costs and improving patient care.

The day when a cost efficient technology for reading and sequencing DNA is available may be closer at hand. Intense research is focused on refining the most promising techniques. While IBM’s DNA Transistor Technology is in the forefront, efforts at Oxford Nanopore, Sandia National Labs and elsewhere are validating the trend.

The goal is to reduce costs to the point where an individual’s complete genome can be sequenced for between $100 and $1,000. Once this becomes a reality, the impact could be significant enough to create a brand new generation of health care capabilities. While there is no way to predict exactly how fast this technology will become available to the average researcher, Manfred Baier, head of Roche Applied Science, maintains an optimistic position:

“We are confident that this powerful technology…will make low-cost whole genome sequencing available to the marketplace faster than previously thought possible”

The technology involves the creation of nanometer-sized holes in silicon-based chips and then drawing strands of DNA through them. The key rests with forcing the strand to move through slowly enough for accurate reading and sequencing. In this case researchers have developed a device that utilizes the interaction of discrete charges along the backbone of a DNA molecule with a modulated electric field to trap the DNA in the nanopore. By turning these so-called ‘gate voltages’ on and off, scientists expect to be able to slow the DNA and move it through the nanopore at a readable rate. The effort combines the work of experts in nanofabrication, biology, physics and microelectronics.

A clarifying cross-section of this transistor technology has been simulated on the Blue Gene supercomputer. It shows a single-stranded DNA moving in the midst of (invisible) water molecules through the nanopore.

No matter how long it takes for the technology to become a cost-effective reality, it will be a true game-changer when achieved. While researchers express both optimism and caution on the timing, there is one inevitable result for which keen observers in related fields are preparing. When people’s individual genetic codes can be economically deciphered and stored, the amount of data generated will be massive. The consequent demands on data storage, mining and analytics will in turn generate their own new challenges.
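A back-of-the-envelope estimate suggests the scale. A human genome runs roughly three billion base pairs, and even storing the bare sequence at two bits per base works out to nearly a gigabyte per person; the sketch below simply multiplies that out for large populations.

    # Back-of-the-envelope estimate of raw storage for whole-genome data.
    # Assumes ~3.2 billion base pairs per genome and 2 bits per base for the bare
    # sequence; real sequencing pipelines (reads, coverage, quality scores) store far more.
    BASE_PAIRS = 3.2e9
    BITS_PER_BASE = 2

    bytes_per_genome = BASE_PAIRS * BITS_PER_BASE / 8          # roughly 0.8 GB per bare genome
    for people in (1e6, 1e8):                                  # one million, then one hundred million genomes
        petabytes = bytes_per_genome * people / 1e15
        print(f"{people:,.0f} genomes ≈ {petabytes:,.1f} PB of raw sequence data")

Add read redundancy, quality scores and clinical annotations, and the multiplier grows by one to two orders of magnitude, which is exactly the storage and analytics challenge described above.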

Using the huge influx of new data to make more informed life science decisions is a key, long-range benefit of the current research efforts in sequencing technology. In health science alone revolutionary new approaches are expected to allow:

  • Early detection of genetic predisposition to diseases
  • Customized medicines and treatments
  • New tools to assess the application of gene therapy
  • The emergence of DNA based personal health care

An equally critical benefit is the potential cost savings expected when sequencing technology, data storage and advanced predictive analytics combine, allowing truly preventive medicine to take its place as the new foundation of health care.

A recent article in the WSJ once again highlights the steadily growing list of applications for predictive analytics in health/life science. This time the goal is identifying patients likely to stop their medication regimen before they do so.

According to a report from the nonprofit New England Healthcare Institute, an estimated one-third to one-half of Americans don’t take their medications as prescribed by their doctors, contributing to about $290 billion a year in avoidable medical spending, including excess hospitalization.
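The article does not detail the models being used, but the general approach is familiar predictive modeling: train a classifier on historical refill behavior and patient attributes, then score current patients for their risk of abandoning a prescription so that outreach can be targeted. The sketch below is a minimal, hypothetical illustration using logistic regression; the features, data and threshold logic are invented for the example.

    # Minimal, hypothetical sketch of scoring medication non-adherence risk.
    # The features, training data and labels are invented for illustration; a real
    # model would be trained on actual claims/refill history and validated clinically.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical features per patient: days late on last refill, monthly copay ($), comorbidity count
    X_train = np.array([
        [0, 10, 1], [2, 15, 0], [30, 60, 2], [45, 80, 3],
        [1, 20, 1], [25, 55, 1], [5, 10, 0], [40, 90, 4],
    ])
    # 1 = patient eventually stopped the medication, 0 = stayed on therapy (invented labels)
    y_train = np.array([0, 0, 1, 1, 0, 1, 0, 1])

    model = LogisticRegression().fit(X_train, y_train)

    # Score a new patient: 20 days late on the last refill, $70 copay, 2 comorbidities
    risk = model.predict_proba(np.array([[20, 70, 2]]))[0, 1]
    print(f"Estimated risk of discontinuation: {risk:.0%}")   # patients above a chosen cutoff get outreach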

Is it any wonder, with that level of cost at stake along just one vector of health care science, that the demand for the best in predictive analytics is becoming more and more critical?

Significant cost savings across the entire spectrum of health care science, as well as more individualized service options for patients, are expected as inevitable results of the steady powering-up now seen in predictive analytics. There are many opportunities to review the positive results in applied case studies. These provide just a glimpse at the successes ahead.

Take, for instance, the results of SPSS technology and software at Texas Health Resources, where the challenge was to limit health care costs without reducing the quality of service provided to patients. As Texas Health Resources relates:


“With IBM SPSS Statistics Base, Texas Health Resources has greatly enhanced its ability to support its process-improvement initiative. Today, it not only detects process variations, but can determine the underlying causes such as a sicker-than-normal patient population.”


Data quality was improved while data mining costs were reduced by 50 percent.

These high-yield results make it clear why developments in predictive analytics have become front-and-center news for so many segments of the health care industry. Other endeavors are focused on identifying those individuals most likely to develop specific illnesses such as diabetes or cancer. Ingenix, a unit of UnitedHealth Group and a customer of Netezza, has already successfully launched an effort to mine and analyze the underlying risk factors in patient data before an illness develops or advances.

As health care science, quality patient care and advancing medical technology struggle with cost factors, the value of predictive analytics in lowering costs increases exponentially.

In the Introduction of Tom Peters’ new business book The Little Big Things, he mentions, perhaps unwittingly, a trend in what management consultants were focused on by decade. He writes about being passionate on a number of things, including “…scintillating customer service (I pretty much had that “space” all to myself in the mid-1980s—believe it or not—“everybody” was doing quality, I was doing service)…”. This tiny mention speaks to a macro trend that supports an argument that our businesses will focus heavily on customer analytics in the coming decade. Where businesses have focused on product analytics in the past, tomorrow’s analytics will predominantly focus on the customer.

Consider the following assumptions. First, what if management consultants were a leading indicator of, or even a causal factor in, how managers decided to analyze their businesses. (N.B. One can even ignore causality and say ‘It doesn’t matter if widely published and listened-to consultants and visionaries talking about a concept drives use of the concept, or whether the concept is right and its wide usage drives consultants to talk about it.’) The fact is Peters marked a trend—the topic of the 1980s was, for the most part, quality. Perhaps the mass of consultants talking quality—a product attribute—drove companies to develop analytics around product…but this implementation took ten years. Thus the 1990s was the decade of quality analytics. Fast forward to the 2000s. The 2000s were the decade of management consultants talking customers. Which means the 2010s will be the decade of customer analytics.

As an aside, if you want to know what analytics will occur in the 2020s for most companies, they’ll be about interactions. Today, innovation has just begun in collecting huge amounts of interaction and social networking data. The bleeding-edge ‘interactions companies’ (which happen to be mostly Internet companies, as they have the easiest access to a paradigm that drives the most interactions per second and have recordable data on those interactions) are just starting to think about how to analyze interactions as a marker of the health of their business and offerings. It follows that once the ‘interactions companies’ work out best practices by the middle of the 2010s and those practices start to seep into leading-edge companies, by the 2020s you’ll see early- and late-majority companies investing heavily in recording and analyzing interactions. These interactions will be employee-to-employee, management-to-employee, customer-employee, customer-customer, influencer-customer, and on and on. More ought to be considered on this topic. And one should not forget that if you translate Tom Peters’ passion for service as a proxy for interaction, you can see he’s about 20 years ahead of his peers…

Back to customer analytics: an interesting piece of anecdotal evidence shows the rising focus on customer analytics. Performing a search on Amazon.com for books with the subject of ‘analytics’ shows a steady rise by decade: 2,400 come up as published in the 1980s, 4,000 as published in the 1990s, and 5,500 as published in the 2000s. However, change the search term to ‘customer analytics’ and it brings back 0 results for the 1980s, 1 result for the 1990s, and 109 results for the 2000s. The curve of acceleration for books mentioning or about customer analytics is exponential.

Assuming the focus on customers of the 2000s leads to customer analytics in the 2010s, what does this mean for the workers, managers and executives focused on using, investing in, or building these analytics? It means:

a) There’s a good chance you’ve had some exposure to customer analytics, although it likely failed, given that the majority of business intelligence projects fail—unless they make use of a purpose-built data warehouse appliance,

b) There’s a good chance that more of the dashboards and leading indicators you have been building include the customer as part of a key performance indicator, measure, or attribute,

c) The momentum toward predictive analytics, mixed with this emerging focus on customers, should be driving a rise in interest in behavioral economics…which it is, meaning you’ll be exposed to more service offerings and more interest in predicting customer behavior,

d) Your analytic technology will need to grow and perform with much more data. It’s a rare company that has fewer customers than products and stores. The business intelligence technologies of the past were about analyzing combinations of products and stores; the number of customers typically far surpasses that combination.

There are many other events, trends, and factors that will emerge from this focus on customer analytics. The chief drivers of success will be embracing it and being excellent at it, which means one’s methods of collection, strategy and, most importantly, delivery will be paramount.