The Healthcare IT market is expected to grow at a CAGR of about 24% from 2012 to 2014. What other sector can boast such a rate in this economy? The market is anticipated to reach $40 billion by the end of 2011. Why?

In a time when most spending is being frozen or sharply reduced, there can be only one reason for increased investment in HIT. It is the same reason that has always driven strong spending: the expectation of a significant return on investment.
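
To put those figures in perspective: compounding at roughly 24% per year nearly doubles the market over three years. A quick back-of-the-envelope sketch in Python, using only the numbers quoted above:

```python
market_2011 = 40e9  # anticipated ~$40B market at the end of 2011
cagr = 0.24         # ~24% compound annual growth, 2012-2014

market = market_2011
for year in (2012, 2013, 2014):
    market *= 1 + cagr
    print(f"{year}: ~${market / 1e9:.0f}B")
# -> roughly $50B, $62B, $76B: nearly double in three years
```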

The urgency for significant HIT adoption is unavoidably clear:

  • The cost of healthcare is rising too fast for traditional containment approaches.
  • Even with healthcare reforms, too many people cannot afford health insurance.
  • Federal and state programs cannot absorb the increased cost of the uninsured in their existing aid programs.
  • Hospitalization costs continue to skyrocket.
  • While costs are rising, the quality of care is not.

While many debate how to implement HIT, virtually no one debates the need for its adoption throughout the health sector. The current system has been broken for quite some time and is still hemorrhaging billions of tax dollars.

The latest health information technologies hold the promise of a truly transformed future, one that was impossible just a decade ago:

  • EHRs can reduce the costs of information management.
    • Current, uniform patient data becomes accessible in real time wherever the patient is being treated.
    • Duplicate testing is minimized (see the sketch following this list).
    • Analytics on data-rich patient information yields
      • better-informed care decisions
      • improved outcomes
      • lower treatment costs
    • Sophisticated analysis of massive patient databases yields
      • superior management of drug safety and effectiveness
      • rapid identification of expensive treatment techniques that are less likely to succeed, and more
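
How might duplicate testing actually be flagged once records are unified? Here is a minimal sketch, assuming a consolidated table of lab orders; the column names and the 30-day window are illustrative assumptions, not any real system's schema:

```python
import pandas as pd

# Illustrative consolidated lab orders; in practice these would come
# from an integrated EHR repository spanning providers.
orders = pd.DataFrame({
    "patient_id": [101, 101, 101, 202],
    "test_code":  ["CBC", "CBC", "LIPID", "CBC"],
    "order_date": pd.to_datetime(
        ["2011-01-03", "2011-01-20", "2011-01-05", "2011-01-04"]),
})

orders = orders.sort_values(["patient_id", "test_code", "order_date"])
# Days since the same patient last received the same test
gap = orders.groupby(["patient_id", "test_code"])["order_date"].diff()

# Repeats of the same test within 30 days are flagged as potential duplicates
print(orders[gap <= pd.Timedelta(days=30)])
```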

These are just a few areas of improved care and reduced cost.

It is not a question of whether we adopt state-of-the-art HIT, but of how aggressively we pursue its deployment now.

Faculty-researchers at Harvard Medical School (HMS), practicing in the Division of Pharmacoepidemiology and Pharmacoeconomics at Brigham and Women’s Hospital (BWH), have chosen Netezza’s TwinFin™ data warehouse appliance as their platform for advanced analytics. Their choice of this technology is especially important at a time when many other stakeholders in drug safety and effectiveness (DSE) are planning to upgrade technology. Harvard and BWH have been leaders in pharmacoepidemiological and pharmacoeconomic research since the 1990s. The lab chief, Dr. Jerry Avorn, is the well-known author of “Powerful Medicines: The Benefits, Risks and Costs of Prescription Drugs”.

Dr. Sebastian Schneeweiss is the Director for Drug Evaluation and Outcomes Research and Vice Chief of the Division of Pharmacoepidemiology and Pharmacoeconomics at Brigham and Women’s Hospital. The Harvard team of researchers is considered an industry bellwether. Here are some of the needs evaluated by the technical lead, Dr. Jeremy Rassen, and the data-mining faculty:

  • Computationally intense, rapid analysis of claims data (and, in the future, EHR data) that keeps pace with expanding data input
  • Capabilities for in-database analytics (see the sketch after this list)
  • Accelerated testing of new algorithms
  • A system that facilitates automation of continuous drug safety and effectiveness monitoring
  • Simplicity of use that minimizes the need for IT support and database administration, which is often a bottleneck
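
The phrase “in-database analytics” is worth unpacking. Rather than extracting millions of claim rows into a separate statistics package, the computation is pushed down into the database, and only a small summary comes back. A minimal sketch, in which the table and column names are hypothetical and sqlite3 stands in for the appliance connection:

```python
import sqlite3  # stands in here for a connection to the warehouse appliance

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (drug_code TEXT, adverse_event_flag INT)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [("d1", 0), ("d1", 1), ("d2", 0), ("d2", 0)])

# In-database analytics: the heavy lifting (scanning millions of claims)
# happens inside the engine; only the small aggregate crosses the wire.
query = """
    SELECT drug_code,
           COUNT(*)                AS exposures,
           AVG(adverse_event_flag) AS event_rate
    FROM claims
    GROUP BY drug_code
"""
for drug_code, exposures, event_rate in conn.execute(query):
    print(drug_code, exposures, event_rate)
```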

Simplicity of use is especially critical since other technologies often require significant setup and technical-support time, both of which can seriously delay the outflow of much-needed DSE information to groups involved in Health Economics and Outcomes Research (HEOR), pharmacovigilance, and epidemiology.

Dr. Schneeweiss is particularly interested in:

“…comparative safety and effectiveness of pharmaceuticals and biotech products, drug policy and risk management program evaluation, and epidemiologic methods using electronic healthcare databases.”

As such, he expects that the use of Netezza technology will help expedite the delivery of timely DSE data and ultimately enhance the ability of care providers to act more quickly and effectively on behalf of patients.

We at Netezza are excited that our collaboration with these notable HMS faculty-researchers is already leveraging IBM research efforts and existing products toward revolutionizing computational pharmacoepidemiology. Advanced research tools for pharmacoepidemiology carry with them the prospect of improved drug safety and effectiveness on a global scale.

Outcomes analysis is nothing new to healthcare payers. Like any other business, health insurers need to know whether their expenditures yield the best possible results for the health of policyholders and the ‘health’ of their ongoing corporate enterprise. Unlike other businesses, however, health payers operate in the volatile, reform-focused arena of medical care. Treatment decisions are, quite naturally, not in their hands but in the hands of physicians, whose diagnoses and applied responses to symptoms still vary dramatically. In addition, payers will now have to deal with new federal guidelines governing medical loss ratios (MLR):

“CMS is supposed to work with state insurance regulators to make sure that insurers spend at least 85% of the premiums collected from large groups, and at least 80% of the premiums collected from small groups and individuals, on medical costs.”
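
In practice the MLR test is simple arithmetic. A minimal sketch, with dollar figures invented purely for illustration:

```python
def meets_mlr(premiums: float, medical_spend: float, large_group: bool) -> bool:
    """Check a plan against the medical-loss-ratio floor:
    85% for large groups, 80% for small groups and individuals."""
    threshold = 0.85 if large_group else 0.80
    return medical_spend / premiums >= threshold

# A hypothetical large-group plan: $500M in premiums, $410M in medical spending
print(meets_mlr(500e6, 410e6, large_group=True))  # 0.82 < 0.85 -> False
```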

 

Last year’s healthcare reform bill, the Affordable Care Act (ACA), included the formation of the Patient-Centered Outcomes Research Institute (PCORI). The new institute has a lofty mission:

“To assist decision makers by advancing the quality and relevance of evidence concerning the manner in which diseases, disorders, and other health conditions can effectively and appropriately be prevented, diagnosed, treated, monitored, and managed through research and evidence synthesis that considers variations in patient subpopulations, and the dissemination of research findings with respect to the relative health outcomes, clinical effectiveness, and appropriateness of the medical treatments, services, and other items”

 

Dr. Scott Ramsey of the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) points out that only very large organizations, such as Kaiser Permanente and Geisinger Health, have been able to make the significant investments required to carry out the sort of in-depth, IT-intensive outcomes analysis needed to match that mission. Indeed, for the majority of payers some method of public/private funding to carry out outcomes research will be essential, and it is built into the plan:

“The law stipulates that PCORI will be funded initially through federal subsidy and over time by a mixture of federal funds and a tax on public and private health insurers. By 2014, available funds will equal $150 million plus an annual $2 fee per Medicare beneficiary transferred from the Medicare Trust Fund, and an annual $2 fee per-covered-life assessed on private health plans adjusted for health expenditure inflation…”
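
To get a feel for the scale of that formula, here is a rough illustration; the enrollment figures are ballpark assumptions, not official numbers:

```python
base_funding = 150e6           # statutory base: $150 million
medicare_beneficiaries = 48e6  # assumed ~48 million Medicare beneficiaries
private_covered_lives = 190e6  # assumed ~190 million privately covered lives

total = base_funding + 2 * medicare_beneficiaries + 2 * private_covered_lives
print(f"Approximate annual PCORI funding: ${total / 1e6:.0f} million")
# -> $626 million under these assumptions, before the inflation adjustment
```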

 

We know from an increasing number of public and private research partnerships (such as those between pharma and universities) that improved patient treatment outcomes are already emerging. But are there financial benefits to be gained by payers who pursue outcomes studies? After all, private healthcare insurance must remain a profitable business if it is to remain at all.

The answer is yes. Outcomes analysis identifies the providers with the most effective treatments, giving payers a strong incentive to migrate their insured clients to those providers. Doing so bolsters the economic assurance that payer MLRs will meet the guidelines set by the ACA while simultaneously delivering the most successful patient care. Higher rates of treatment success ultimately emerge in the form of reduced payer expenses. The mechanism for realizing such benefits rests in the application of the broad range of healthcare IT software and services that are gradually transforming virtually every aspect of delivering medical care.

We are in the midst of a complex merger, one that revolves around the dramatic improvements in IT and analytics. These advances are being applied with increasing intensity, from drug safety and effectiveness to more personalized medicine using EHRs, from pharmacoeconomics to fraudulent-claims discovery, and in every other sector of healthcare performance.

Whether seen from the perspective of the patient, the physician/provider or the payer, superior treatment results that emerge from diverse, steadily pursued outcomes research can only result in benefits for all aspects of the healthcare sector.

The expectation of a reformed healthcare system driven by new technology is firmly established.  The recently released IT Industry Business Confidence Index cites advances in healthcare as a key force behind the industry’s optimism.

The foundation for realizing IT-based reform is the adoption of EHRs. The CMS has an incentive program in place motivating caregivers to adopt certified EHR technologies. The CMS is by far the nation’s largest payer, distributing approximately $800 billion in benefits.

Dr. David Blumenthal, the outgoing National Coordinator for Health IT, says we have now “officially entered the age of ‘meaningful use’”. As of February 8, some 18,000 providers had registered to apply for the incentive program aimed at achieving ‘meaningful use’ status. Thousands more are expected to register throughout the year. Meeting the meaningful-use criteria is a significant challenge, with some critics claiming the process is too complicated and others finding ambiguity in its directives.

According to the HHS, “meaningful use means providers need to show they’re using certified EHR technology in ways that can be measured significantly in quality and in quantity.” In stage one (2011 and 2012) this means establishing a baseline for data capture and information sharing. The HHS is reaching out to providers to inform them of the detailed steps.

State-of-the-art healthcare IT, like all technologies capable of bringing about revolutionary change, requires much more than the simple acquisition of a new IT system. As Dr. Joel Berman, CMIO of Concord Hospital (NH) and one of those already committed to the journey, explains:

“Other essential factors include unwavering senior administrative support, engaged clinical champions, dedicated physician and nurse informaticists, effective change management, familiarity with lean principles and practices, enlistment of patients, and commitment to rapid cycle improvement tools and techniques. Eighty percent of the challenges are about people, processes, psychology, and sociology; only 20 percent are about technology…”

As more and more providers commit to the technology and the process of change that comes with it, IT based healthcare reform gains significant momentum. There is no question that everyone wants the benefits that fully applied healthcare IT can ultimately bring:

  • Improved quality of patient care
  • Superior treatment outcomes
  • Increased efficiencies from enhanced data management
  • Cost reduction throughout the healthcare system
  • Anti-fraud management through advanced analytics

The only question is how rapidly providers will adopt the new healthcare IT and fully commit to achieving meaningful use. The indicators so far in 2011 point to steadily increasing commitments.

2010 was a lean year for new drug approvals. The FDA granted permission for only 21 new medications, below both the 2009 and 2008 levels, and some of the most eagerly awaited approvals have been delayed. Many see the low numbers as a trend reflecting an increased focus on safety concerns, but the FDA states there has been “no systemic change…” in the process of granting approval. Nonetheless, change is most certainly underway when it comes to evaluating safety and effectiveness for proposed and existing medications. 2010 also saw a significant number of drug recalls by the FDA, and the agency used its enhanced authority to address issues of labeling and advertising throughout the year as well.

[Chart courtesy of the WSJ]

Armed with more legal power through the passage of the FDA Amendments Act (FDAAA), and moving toward improved data management technologies, the agency is targeting changes aimed at improving the drug approval process, responses to adverse events, and other key safety areas through its Sentinel Initiative.

The FDA’s Deputy Commissioner, Dr. Joshua Sharfstein (a strong advocate of more stringent safety regulations who recently announced his departure from the agency), gave a significant update to the House Subcommittee on Health in spring 2010:

“…FDAAA requires the HHS Secretary to develop methods to obtain access to disparate data sources and to establish a postmarket risk identification and analysis system to link and analyze health care data from multiple sources. On May 22, 2008, FDA launched the Sentinel Initiative with the ultimate goal of creating and implementing the Sentinel System—a national, integrated, electronic system for monitoring medical product safety. The Sentinel System…will enable FDA to actively gather information about the postmarket safety and performance of its regulated products—a significant step forward from our current, primarily passive safety surveillance systems. The law sets a goal of access to data from 25 million patients by July 1, 2010, and 100 million patients by July 1, 2012…”

These are lofty but achievable goals when the right data management technologies are brought to the task. The challenges are significant, but with almost half of the population taking at least one prescription drug, the value of transforming the FDA from an agency that reacts to adverse events into one that proactively prevents them cannot be overstated.

A December interview in The Economist with IBM’s Global Director of Healthcare Marketing, Andrea Cotter, highlighted the pivotal role of EHRs in transforming healthcare. It comes alongside a study this month showing better financial performance for providers using EHRs.

 

Ms. Cotter explains: “…access to critical health information must be simplified, streamlined and automated to reduce costs and improve service. Electronic health records are the foundation of this transformation, the basic building blocks of health-care efficiency. When standardised and shared, EHRs provide a powerful means of increasing accuracy and speeding the delivery of patient information to the point of care. They enable stronger collaboration, more complete records and better service. And they serve as the enabler of other health-care IT, such as analytics and predictive modeling…”

 

The current healthcare structure is debilitated by fragmented data residing in uncoordinated and inefficient systems. These systems cannot integrate to assist caregivers in their efforts to deliver improved treatment or more efficient cost management. Instead, these disconnected segments in the present structure can become breeding grounds for fraudulent billing schemes, waste, duplication and error at multiple levels.

The good news is that collecting, storing, and analyzing high-quality patient data is now a fully achievable task, and the crucial process of developing and deploying a unified system for sharing the data is well underway. Connected medicine, an impossibility a decade ago, is now healthcare’s imminent destination. To get there the system needs to:

  • Fully digitize patient data and create the EHRs
  • Develop a universal vocabulary for data exchange (sketched after this list)
  • Standardize methods for sharing and protecting the data
  • Apply state-of-the-art analytics for interpreting data
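
What might a unit of that shared data look like? Below is a minimal sketch of a digitized, vocabulary-coded patient record; the field names are illustrative, and real exchange standards such as HL7 are far richer:

```python
import json

# A toy patient record using standard vocabularies so any system can
# interpret it: ICD-9 for diagnoses, LOINC for lab tests.
record = {
    "patient_id": "example-123",
    "diagnoses": [
        {"code": "250.00", "system": "ICD-9", "label": "Type 2 diabetes"},
    ],
    "labs": [
        {"code": "4548-4", "system": "LOINC", "label": "Hemoglobin A1c",
         "value": 7.2, "unit": "%"},
    ],
}

# Serialization is the easy part; the hard parts are the shared vocabulary
# and the standards for protecting the data in transit.
print(json.dumps(record, indent=2))
```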

As the digitizing of patient data comes together in detailed EHRs, and as a common language for health information exchange is developed, a new and encouraging view of healthcare comes into focus.

More and more significant collaborative efforts are following examples like that of UPMC’s $8 billion global health enterprise and its Center for Connected Medicine. The corrective potential of a fully integrated healthcare system is enormous. With patient care at its core, connected medicine holds the promise of redefining how treatment is managed, delivered, and advanced, all within a cost-efficient structure.

It’s been a year of steady and encouraging progress in the fight against fraud and waste in the healthcare sector. The Inspector General’s office reports an expectation of significant total savings in many sectors for fiscal 2010, with a noteworthy $3.8 billion coming from investigative receivables.

 

“…We are particularly encouraged by the success of our partnerships with HHS and the Department of Justice through the Health Care Fraud Prevention and Enforcement Action Team (HEAT),” Inspector General Daniel Levinson said in a news release.

 

The use of advanced predictive analytics in preventing fraud and waste is gaining a prominent role in the battle. At a recent healthcare fraud prevention summit, HHS Secretary Kathleen Sebelius and Attorney General Eric Holder announced that the CMS will be acquiring new analytic tools. The CMS is soliciting for “state of the art, fraud-fighting analytic tools to help the agency predict and prevent potentially wasteful, abusive, or fraudulent payments before they occur.” It is the “before they occur” aspect of the analytics that brings the greatest potential for arresting and reducing the sharply rising costs in the current healthcare system.

The CMS, with new and expanded authority, will be able to take anti-fraud action before a claim is paid.

 

“By using new predictive modeling analytic tools we are better able to expand our efforts to save the millions – and possibly billions – of dollars wasted on waste, fraud and abuse,” said CMS Administrator Donald Berwick, M.D.

 

Customizable analytics software and services, such as IBM’s Fraud and Abuse Management System (FAMS), are providing the state-of-the-art solutions now needed. FAMS is capable of sorting through information on tens of thousands of providers and tens of millions of claims in a matter of minutes, creating suspicion indices using almost 1,000 behavioral patterns across a wide range of specialties. The highly customizable system yields rapid results as the analytic modeling tools reveal potentially fraudulent activity, waste, mismanagement of funds, and other sources of loss. The software, tools, and services needed to combat fraud and plug many other fiscal leaks in our ailing healthcare system are ready for frontline deployment.
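
FAMS’s own models are proprietary, but the general idea of a suspicion index can be sketched simply: score each provider by how far its billing behavior deviates from its specialty peer group. A minimal illustration with invented data and a single behavioral pattern (a real system combines hundreds):

```python
from statistics import mean, stdev

# Hypothetical annual claim counts per provider within one specialty.
claims = {"A": 120, "B": 135, "C": 118, "D": 540, "E": 127}

def suspicion(provider: str) -> float:
    # Compare each provider against its peers, excluding itself, so an
    # extreme outlier cannot inflate the norm it is measured against.
    peers = [v for p, v in claims.items() if p != provider]
    return (claims[provider] - mean(peers)) / stdev(peers)

for provider in sorted(claims, key=suspicion, reverse=True):
    score = suspicion(provider)
    flag = "  <- investigate" if score > 2 else ""
    print(f"Provider {provider}: z = {score:+.1f}{flag}")
```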

The nascent trend of five years ago is rapidly becoming the model of today. More and more pharma research is focused on joining forces with universities. The rationale is simple and brilliant: with the ever-escalating costs of R&D and the ‘patent cliff’ fast approaching, the merger of resources is a natural wellspring of mutual benefits.

Big pharma needs new drug discoveries and more cost-effective ways of discovering them. According to the 2010 Kaiser Foundation report on prescription drug trends, 80% of all FDA-approved drugs have generic counterparts. Add to this the fast-approaching edge of the “patent cliff” (2011-2015), when dozens of brand-name drugs go off patent, including six of the ten largest medicines in the U.S., and you get a good idea of the challenges facing the ‘business’ of big pharma. Viagra, Actos, Symbicort, Crestor, and Avandia are just a few major brands to face generic competition soon.

Partnering with universities will offer big pharma alternate approaches to its ongoing research: access to new and experimental technologies, the creative thinking indigenous to the university environment, more cost-efficient continued development of in-licensed drug candidates, fresh stimulus for stalled projects, and the potential of discovering multiple new applications for existing drugs, all in an arena that could offer new, more rapid research platforms for the discovery and release of better medicines.

In this win-win collaboration, universities will be able to analyze pharma’s extensive and diverse data, and data is what it is all about in the research and development of new drugs. An example of this trend is Sanofi’s recent announcement of a collaboration with Harvard in diabetes and cancer research. As pharma gleans new and improved information from institutional partners, so too do those institutions gain precious access to pharma’s previously locked treasure chest of health science research.

It’s a natural collaboration, taking place on a global scale. A marriage of necessity expected to bring forth a new generation of blockbuster progeny.

The most populous country in the world is fully engaging the issues of healthcare reform. The cornerstone of the newly emerging system is patient data. The organization, storage, and management of health records will ultimately maximize the benefits of patient care and cost efficiency.

The WSJ underscores the importance: “China’s health-care IT market will see remarkable growth in the next five years, triggered partly by China’s three-year health-care reform program,” said Janet Chiew, analyst for research firm IDC. IDC estimates the market will reach $2.4 billion in 2013 and grow at an average of 19.9% per year.

Data storage and analysis capabilities will be further driven as the influx of new medical devices and diagnostic equipment continues to surge. According to one report, China’s overall medical equipment market is expected to double between now and 2015, reaching over $53 billion. This steady increase of new medical equipment will generate massive amounts of new patient data and a concurrent need for real-time access. It is not inconceivable that patient data could more than double within a decade.

Before the information from new diagnostic equipment can be transformed into cost-saving health-care analytics, the existing systems of paper-based patient records must become electronic records. This is the first step toward eliminating expensive redundancies and delays in gathering patient data. Database technology and storage solutions from IBM and others are already being deployed throughout China, and have been for some time.

In Guangdong province, a group of high-volume hospitals is implementing a program called CHAS, or Clinical and Health Records Analytics and Sharing. One such hospital, focused on traditional Chinese medicine, has more than 10,000 patient visits per day. Deployment of the new health-care analytics technology in this hospital is expected by year-end.

China’s health-care reform offers data storage and analysis technology its largest opportunity yet for developing and deploying solutions, solutions that will have a global impact on reducing costs and improving patient care.

The day when a cost-efficient technology for reading and sequencing DNA is available may be close at hand. Intense research is focused on refining the most promising techniques. While IBM’s DNA Transistor technology is at the forefront, efforts at Oxford Nanopore, Sandia National Labs, and elsewhere are validating the trend.

The goal is to reduce costs to the point where an individual’s complete genome can be sequenced for between $100 and $1,000. Once this becomes a reality, the impact could be significant enough to create a brand-new generation of healthcare capabilities. While there is no way to predict exactly how fast this technology will become available to the average researcher, Manfred Baier, head of Roche Applied Science, maintains an optimistic position:

“We are confident that this powerful technology…will make low-cost whole genome sequencing available to the marketplace faster than previously thought possible.”

The technology involves creating nanometer-sized holes in silicon-based chips and then drawing strands of DNA through them. The key rests with forcing the strand to move through slowly enough for accurate reading and sequencing. In this case researchers have developed a device that uses the interaction of discrete charges along the backbone of a DNA molecule with a modulated electric field to trap the DNA in the nanopore. By turning these so-called ‘gate voltages’ on and off, scientists expect to be able to slow and step the DNA through the nanopore at a readable rate. The effort combines the work of experts in nanofabrication, biology, physics, and microelectronics.
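
The on/off gating can be pictured with a deliberately simple toy model: each gate cycle releases the strand for one base, then traps it again long enough to sense that base. This is conceptual code for intuition only, not a model of the actual device physics:

```python
# Toy model of gate-voltage ratcheting through a nanopore:
# one base advances per on/off gate cycle.
dna = "GATTACAGCTA"

read_out = []
position = 0
while position < len(dna):
    # Gate off: the trapping field relaxes and the strand advances one base.
    position += 1
    # Gate on: the strand is held still long enough to read the base.
    read_out.append(dna[position - 1])

print("".join(read_out) == dna)  # True: sequence recovered base by base
```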

A clarifying cross-section of this transistor technology has been simulated on the Blue Gene supercomputer. It shows a single-stranded DNA moving in the midst of (invisible) water molecules through the nanopore.

No matter how long it takes for the technology to become a cost-effective reality, it will be a true game-changer when achieved. While researchers express both optimism and caution on the timing, there is one inevitable result for which keen observers in related fields are preparing. When people’s individual genetic codes can be economically deciphered and stored, the amount of data generated will be massive. The consequent demands on data storage, mining, and analytics will in turn generate their own new challenges.
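
Some rough arithmetic shows the scale. The figures below are ballpark assumptions (a genome of about 3 billion bases; raw sequencer output of roughly 100 GB per person at typical coverage), but the conclusion holds across a wide range of estimates:

```python
bases_per_genome = 3e9                    # ~3 billion base pairs per genome
packed_bytes = bases_per_genome * 2 / 8   # 2 bits per base, ideally packed
raw_bytes = 100e9                         # assumed raw output per person

print(f"Packed genome:  ~{packed_bytes / 1e9:.2f} GB per person")
print(f"Raw read data:  ~{raw_bytes / 1e9:.0f} GB per person")

population = 100e6                        # sequence 100 million people...
print(f"Raw data total: ~{population * raw_bytes / 1e18:.0f} exabytes")
```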

Using the huge influx of new data to make more informed life science decisions is a key, long-range benefit of the current research efforts in sequencing technology. In health science alone revolutionary new approaches are expected to allow:

  • Early detection of genetic predisposition to diseases
  • Customized medicines and treatments
  • New tools to assess the application of gene therapy
  • The emergence of DNA-based personal healthcare

An equally critical benefit is the potential cost savings expected when sequencing technology, data storage, and advanced predictive analytics combine, allowing truly preventive medicine to take its place as the new foundation of healthcare.
