Posted on ONITOR Health LinkedIn
This week, more than 100k techies are gathering in Las Vegas for the largest consumer electronics show in the world – CES, where they are showcasing every potential angle on how to advance humanity with the use of technology.
This is not as pompous as it may sound. In the last 20 years CES has been the birthplace of most of the things that we use in our lives today – from smartphones to social media, from gaming to autonomous driving. These were first presented in Vegas and then commercialized, improved, failed, reinvented and finally made to fit our lives and needs so well that it is almost impossible to imagine life without them.
In the last 3 years a lot of the talk and focus at CES has gone in the direction of wearables – computers that you can wear and that offer personalized experiences based on bodily sensors. The meteoric rise of these devices was made possible by a combination of trends – mostly technological, but also social – and by some very clever companies that were able to capitalise on them.
Firstly, and not surprisingly, the abundance of computing power in everyone’s hands, in the form of smartphones, removed the need to build expensive hardware into the wearables themselves. It was a classic hub-and-peripherals design: all the device needed to do was send raw data to the phone, where a slick app handled the entire user interface. No wonder most early wearables did not have a screen and were little more than a piece of plastic with an accelerometer and a Bluetooth chip.
Then there was the availability of cloud services, and cheap connectivity to them, which removed the need for heavy computation on the phone – helping with power and memory constraints but also, more importantly, moving the collected data to the cloud: literally from the hands of the user into the hands of the companies that developed the devices. This allowed algorithms to become more accurate, but also made the products stickier (if you switched to a different brand of device you would lose all the data you had collected). It also allowed companies to introduce premium services at extra charge.
I mentioned that there was also a social aspect playing a part here – in the past, when people discussed wearables and bodily sensors, the immediate concern was data privacy. Never before had so much personal data been collected by electronics companies, let alone stored on their cloud servers, and many experts considered that a tough line for consumers to cross. However, as it turns out, and notwithstanding ongoing privacy concerns, people were not too worried about sharing their daily activities, training scores, heart rate and even sleep patterns with others. In fact, sharing data with the broader community became the business model for companies like Strava. I’m not arguing that data privacy is no longer a serious challenge, or that the generational attitude towards (the lack of) it is a good thing. I’m just pointing out that without that shift in people’s sensitivities, wearables would not be where they are today.
The obvious question about wearables nowadays is: what next? We’ve seen the hype around activity trackers, followed by an Apple-led hype around smartwatches, but both now seem to have hit the disillusionment and consolidation phase, where companies either exit the market (Nike, Microsoft) or get hit hard in the stock market (e.g. Fitbit). One thing is clear to me – this is by no means the demise of the wearables industry. We have only scratched the surface of what these technologies can do, and so far we haven’t found a killer app for them (sorry – keeping track of your steps, sleep or activity levels is a nice-to-have, not a must-have, which is why customers stop using these devices after a few weeks).
I believe that the next frontier, and perhaps the biggest commercial opportunity for wearables, is in the healthcare space – a multi-billion-dollar industry that hasn’t really changed its business model in the last 50 years. In fact, even the name of the industry is misleading – in most societies, when we say healthcare we really mean sick-care: the system is designed so that as long as you’re healthy you barely interact with it. If and when you’re sick or injured, you are treated by a clinician whose job it is to ‘fix’ you. And once you’re ‘fixed’ you go back to square one, where you don’t interact with the ‘healthcare’ system at all.
The other part of this archaic business model is the commercial, compensation side. Clearly, doctors and institutions need to get paid for their effort and expenses, right? So let’s just quantify every action they take in terms of effort and cost, mark it up by a margin and identify who will pick up the bill. Well, there are two problems with that. First, no matter how hard you try to identify payers for medical costs, it always comes back to one pocket – the public’s – whether through government taxes that fund a single payer (e.g. the NHS in the UK) or through employer-sponsored private insurance (US), which in turn translates into stagnating net salaries. Second, this financial model makes very little mention of health value – i.e. how much health you get for your buck. In fact, there are many cases where financial value and health value are misaligned in the current system.
Another problem with the ‘fixing’ model: what happens when the patient can’t be ‘fixed’? Chronic diseases are by definition not curable, leaving the healthcare system with two options – continuous treatment or prevention. Unfortunately, the financial model was not built to cater for either. Continuous treatment is extremely expensive and increases the burden on the public purse, while prevention methodologies were never quantified in terms of health value, so no compensation model was ever established for them.
To top it all off, we come back to the social element – western society today faces major health risks that are lifestyle related and that lead to longer lives, but lives burdened with chronic sickness for many years. In fact, it is said that of every year we add to our lifespan, two thirds will be lived in sickness. The stats are truly mind-boggling: about half of people today will develop some form of cancer in their lifetime, two in five Americans are expected to develop type-2 diabetes, cardiovascular diseases are by far the biggest killer today, and in western countries more people die of obesity than of malnutrition.
To me, the future of healthcare needs to be built on 4 pillars:
1. Health value rather than financial value. We have to find a way to quantify health benefits and fit the financial model to that. That is how you turn a sick-care system into a healthcare system.
2. Prevention has to become the number one priority for governments as well as individuals. It is cheaper, easier, less painful and generally happier to avoid chronic conditions than to treat them.
3. Responsibility for one’s health has to be shared between the individual and the healthcare system. Lifestyle decisions are taken every minute of the day, and they have the biggest impact on our health prospects. The healthcare system and the government can nudge us in the right direction, but ultimately it is a personal responsibility.
4. It’s time for the healthcare system to become truly data-driven and personalised. And I don’t mean the clinical-data methodologies the system was formed upon. Clinical trials are effective for developing new drugs, but health data exists well beyond the clinical environment, and unless it can be incorporated into the clinical setting we will miss a huge opportunity to use available technologies to help society. Just think: cloud, big data, IoT, AI… all of these technologies are making a huge impact outside the medical setting but are bringing very limited value into the hospital.
Looking at all of the pillars above, I see wearables as a key enabler of this transformative process. Once you can continuously, unobtrusively and cheaply monitor the physical condition of every person, wherever they are, you are:
- effectively allowing healthcare providers to get continuous data from outside the clinical environment and make better decisions based on it;
- placing the power, and the responsibility, for one’s long-term wellbeing in that person’s hands;
- giving the healthcare system an intervention tool that can be prescribed, integrated into patients’ lives and used to help prevent chronic diseases;
- enabling an analysis and understanding of where the most health value can be achieved, which in turn allows resources to be allocated accordingly.
Before I finish, I still owe an explanation of the connection between Donald Trump, President-elect, and all this (apart from being a useful teaser headline). Having watched the recent political events in the US and, in fact, the UK, I kept feeling that whichever way things were going to go, this was the clearest demonstration of democracy in modern society. For better or worse, people who felt disenfranchised by the existing political and economic system were put in a situation where their vote (the power of 1) counted exactly the same as that of the most powerful person. There is something very liberating and empowering in that situation, and that, I believe, is a good thing for society in the long run.
I believe that the future of healthcare is about democratizing it and focusing again on the power of 1. Empowering people to live healthier is much more like giving them a fishing rod than giving them fish. Whether they use that power for a good cause is really up to them.