2010 was a lean year for new drug approvals. The FDA granted permission for only 21 new medications, below both the 2009 and 2008 levels, and some of the most eagerly awaited approvals have been delayed. Many see the low numbers as a trend
reflecting an increased focus on safety concerns, but the FDA states there has been “no systemic change…” in the process of granting approval. Nonetheless, change is most certainly underway when it comes to evaluating safety and effectiveness for proposed and existing medications. 2010 also saw a significant number of drug recalls by the FDA, and the agency used its enhanced authority to address issues of labeling and advertising throughout the year as well.
Armed with more legal power through the passage of the FDA Amendments Act (FDAAA) and moving toward improved data management technologies, the agency is targeting changes aimed at improving the drug approval process, responses to adverse events, and other key safety areas with its Sentinel Initiative.
The FDA’s Deputy Commissioner Dr. Joshua Sharfstein (a strong advocate of more stringent safety regulations, who recently announced his departure from the agency) gave a significant update to the House Subcommittee on Health in spring 2010.
“…FDAAA requires the HHS Secretary to develop methods to obtain access to disparate data sources and to establish a postmarket risk identification and analysis system to link and analyze health care data from multiple sources. On May 22, 2008, FDA launched the Sentinel Initiative with the ultimate goal of creating and implementing the Sentinel System—a national, integrated, electronic system for monitoring medical product safety. The Sentinel System…will enable FDA to actively gather information about the postmarket safety and performance of its regulated products—a significant step forward from our current, primarily passive safety surveillance systems. The law sets a goal of access to data from 25 million patients by July 1, 2010, and 100 million patients by July 1, 2012…”
Lofty but achievable goals, when the right data management technologies are brought to the task. The challenges are significant, but with almost half of the population taking at least one prescription drug, the value of transforming the FDA from an agency that reacts to adverse events into one that proactively prevents them cannot be overstated.
In the Introduction of Tom Peters’ new business book The Little Big Things, he mentions, perhaps unwittingly, a trend in what management consultants were focused on by decade. He writes about being passionate on a number of things, including “…scintillating customer service (I pretty much had that “space” all to myself in the mid-1980s—believe it or not—“everybody” was doing quality, I was doing service)…”. This tiny mention speaks to a macro trend supporting an argument that our businesses will focus heavily on customer analytics in the coming decade. Where businesses have focused on product analytics in the past, tomorrow’s analytics will predominantly focus on the customer.
Consider the following assumption: what if management consultants were a leading indicator of, or even a causal factor in, how managers decided to analyze their businesses? (N.B. One can even ignore causality and say, ‘It doesn’t matter whether widely published and widely heard consultants and visionaries talking about a concept drives use of the concept, or whether the concept is right and its wide usage drives consultants to talk about it.’) The fact is that Peters marked a trend: the topic of the ’80s was, for the most part, quality. Perhaps the mass of consultants talking quality, a product attribute, drove companies to develop analytics around the product, but that implementation took ten years. Thus the 1990s was the decade of quality analytics. Fast forward: the 2000s were the decade of management consultants talking customers. Which means the 2010s will be the decade of customer analytics.
As an aside, if you want to know what analytics most companies will pursue in the 2020s, they’ll be about interactions. Today, innovation has only just begun in collecting huge amounts of interaction and social networking data. The bleeding-edge ‘interactions companies’ (mostly Internet companies, since they have the easiest access to platforms that drive the most interactions per second and produce recordable data on those interactions) are just starting to think about how to analyze interactions as a marker of the health of their business and offerings. It follows that once the ‘interactions companies’ work out best practices by the middle of the 2010s and those practices start to seep into leading-edge companies, by the 2020s you’ll see early- and late-majority companies investing heavily in recording and analyzing interactions. These interactions will be employee-to-employee, management-to-employee, customer-to-employee, customer-to-customer, influencer-to-customer, and on and on. More ought to be considered on this topic. And one should not forget that if you treat Tom Peters’ passion for service as a proxy for interaction, you can see he’s about 20 years ahead of his peers…
Back to customer analytics: an interesting piece of anecdotal evidence for the rising focus on customer analytics follows. A search on Amazon.com for books on the subject of ‘analytics’ shows steady growth by decade: 2,400 published in the 1980s, 4,000 in the 1990s, and 5,500 in the 2000s. However, change the search term to ‘customer analytics’ and it returns 0 results for the 1980s, 1 result for the 1990s, and 109 results for the 2000s. The curve for books mentioning or about customer analytics is accelerating exponentially.
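The decade-over-decade growth implied by those counts can be checked directly. A minimal sketch, using only the approximate search-result figures quoted in this post (not fresh searches):

```python
# Growth in Amazon search-result counts by decade, per the figures above.
analytics = {"1980s": 2400, "1990s": 4000, "2000s": 5500}
customer_analytics = {"1980s": 0, "1990s": 1, "2000s": 109}

# 'analytics' overall grows steadily, at a roughly linear pace...
growth_90s = analytics["1990s"] / analytics["1980s"]   # ~1.67x
growth_00s = analytics["2000s"] / analytics["1990s"]   # ~1.38x

# ...while 'customer analytics' jumps two orders of magnitude in one decade.
customer_jump = customer_analytics["2000s"] / max(customer_analytics["1990s"], 1)

print(round(growth_90s, 2), round(growth_00s, 2), customer_jump)
```

The contrast is the point: overall ‘analytics’ publishing grows by half-steps, while the customer-specific slice goes from effectively zero to triple digits.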
Assuming the focus on customers of the 2000s will lead to customer analytics in the 2010s, what does this mean for the workers, managers and executives focused on using, investing in, or building these analytics? It means
a) There’s a good chance you’ve had some exposure to customer analytics, although it has likely failed, given that the majority of business intelligence projects fail (unless they make use of a purpose-built data warehouse appliance),
b) There’s a good chance the dashboards and leading indicators you’ve been building increasingly include the customer as part of a key performance indicator, measure, or attribute,
c) The momentum toward predictive analytics, mixed with this emerging focus on customers, should be driving a rise in interest in behavioral economics…which it is, meaning you’ll be exposed to more service offerings and more interest in predicting customer behavior,
d) Your analytic technology will need to grow and perform with much more data. It’s a rare company that has fewer customers than products and stores. The business intelligence technologies of the past were built to analyze combinations of products and stores; the number of customers typically far surpasses that combination.
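Point (d) is easy to see with invented numbers. A hypothetical sketch, where every figure is made up purely for scale and not drawn from any real retailer:

```python
# Hypothetical scale comparison: classic product-by-store analytics
# versus customer-level analytics. All figures are illustrative.
products = 10_000
stores = 500
product_store_cells = products * stores   # 5,000,000 analyzable cells

customers = 20_000_000                    # a large retailer's customer file

# The customer dimension alone dwarfs the old product-x-store grid,
# even before tracking each customer's behavior per product.
ratio = customers / product_store_cells
print(product_store_cells, customers, ratio)
```

And that ratio only grows once you start keeping history per customer rather than a single row per product/store cell.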
Many other events, trends, and factors will emerge around customer analytics. The chief driver of success will be to embrace it and be excellent at it, which means one’s methods of collection, strategy, and, most importantly, delivery will be paramount.
Neural marketing, or neuromarketing, is the emerging practice of scanning the brains of focus group members as they watch advertisements, images, and slogans in order to understand their propensity to buy. Without going further, what you picture in your mind’s eye (no pun intended) is what happens: a bunch of folks hooked up to electrodes watching movies on a big screen, with white-lab-coated scientists scurrying around (and likely some marketers watching from behind a two-way mirror). While the concept gained ground in the early 2000s, neural marketing is hitting the mainstream with the publication of the first handbook on the topic, Buyology by Martin Lindstrom. The book has received some poor reviews, but the first mainstream book of a new movement or technique is rarely especially good; others like The One to One Future by Peppers and Rogers or Clicking by Faith Popcorn weren’t accused of being perfectly written either. We can’t all be Gladwell or Vollmann.

Whitepapers and blogs abound, but the real story here is what happens in the future. Today, neural marketing is expensive, and the data is very hard to collect, manage, and analyze. A full brain scan of a few seconds might represent 10 gigabytes of data. One could imagine collecting a terabyte of data from one subject for a pair of television commercials. Getting an n of 20 for both the control and experimental groups would then represent 40 terabytes of data, for a single experiment. Fortunately, innovation and price decreases continue for systems that can handle the loading throughput, storage, and mining of data at this scale, which was previously cost prohibitive with typical database and hardware configurations. Still, it will be some time before this is available at any kind of reasonable cost. One problem is that medicine keeps driving toward greater specificity and speed, which creates ever more data.
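The storage arithmetic above can be made explicit. A back-of-the-envelope sketch using the figures quoted in this post; the 1 TB per subject is an assumption for a pair of commercials, not a measured rate:

```python
# Back-of-the-envelope data volume for a single neuromarketing
# experiment, using the post's figures (the per-subject volume is
# an assumption, not a measured scanner rate).
GB = 1
TB = 1024 * GB

volume_per_subject = 1 * TB   # full-session scan for one subject
n_per_group = 20              # n of 20 per group
groups = 2                    # control + experimental

total = volume_per_subject * n_per_group * groups
print(total // TB, "TB for a single experiment")
```

Forty terabytes per experiment, before any replication or longer stimulus sessions, which is why the database and hardware cost curve matters so much here.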
Techniques like echo-planar imaging sequences have driven recording times down to microseconds across micron-scale regions of the brain. Medicine works with one subject at a time (the patient) and wants to take a single picture and review it; any computer in that workflow wants to print the image and immediately discard the enormous data file. Marketers, on the other hand, want something more akin to taking a video (think sleep lab): less raw data, more aggregation, and, most importantly, an objective way to score the scan numerically rather than the visual comparison that comes out of the medical tradition.
So is this going to be real? Nestlé was reported in 2008 to be working one step away from the brain, on its way to replicating the nose and brain of its taste testers. And Microsoft thinks so: the company has patented an approach to measuring focus group members’ reactions to graphical user interfaces.