Medicines Aren’t Perfect. Why Expect Digital Tools to Be?

Mon 16 February 2026
Digital Health
Interview

Liz Ashall-Payne, founder and chief executive of Orcha, says digital health has plenty of new tools and ideas but still lacks trust and real integration into care. For her, the future depends less on more apps or AI hype and more on systems that are personal, transparent, and embedded in healthcare.

Digital health tools, such as health apps, still feel more complex to use than everyday consumer technology. Digital health coaches give answers that are repetitive, not tailored to an individual’s needs, or simply boring. How can we change that?

We change it by making digital health feel as if it were built for me, not for everyone. That starts with being clear about who a product is actually for. An 80-year-old and a confident 15-year-old, both with anxiety, need completely different experiences. When apps try to serve everyone, they become generic.

Personalisation has to be real. Advice should respond to someone’s goals, confidence, and day-to-day reality rather than repeating the same scripted answers. AI can help, but only if it’s grounded in a clear understanding of the person using it. When we design around a specific human being, digital health stops feeling clinical and starts feeling genuinely helpful.

When digital health started in the 80s and 90s, there were no standards or regulations. The result was data chaos and a lack of interoperability. We now have more standards and regulations, but the digital transformation in healthcare remains incomplete. Why?

Because standards alone aren’t enough. A standard without assurance and distribution is largely meaningless. Today, a huge effort goes into debating which standards are “right”. But once they exist, responsibility often falls through the cracks: who tests products, reassesses them as they change, and ensures trusted solutions reach the right people?

Research consistently shows that patients and clinicians don’t want to interpret technical standards. They want something simple they can trust, a clear badge or label. Delivering that requires a system, not just a standard: one that applies it, assures it over time, and connects trusted products to the market.

We need to move from standards to systems.

You have repeatedly stated that there is no digital health without trust. But no digital system is perfect, and there will be issues such as data leaks or errors. How do we design for trust and safety in a way that accelerates innovation rather than slowing it?

This is such an important question because fear of risk is often what slows adoption. Health systems sometimes seem to expect digital health to be perfect, but nothing ever is. We don’t expect perfection from medicines. Every drug comes with clear information about side effects, risks, and who it’s suitable for. Digital health needs the same mindset.

The real question isn’t “is this perfect for everyone?” It’s “who is this evidenced for and who isn’t it for?” If a product doesn’t work well for someone who is visually impaired or deaf, that doesn’t make it bad. It means it’s not designed for that user. One size doesn’t fit all.

Being open about limitations and risks builds trust. It provides people with the information they need to make informed decisions. That kind of transparency doesn’t slow innovation; it enables it to move forward with confidence.

From your global experience, where do health systems most often get digital health wrong?

I can only repeat what I said earlier: they focus on the standard, not the system. There’s often far too much effort put into the stick and not enough into the carrot. A technology might pass a standard or meet regulatory requirements, but then what? Very often, nothing happens. It doesn’t get distributed, recommended, reimbursed, or embedded into care pathways.

If we want digital health to succeed, we need systems that don’t just approve technology but actively help it reach the people who need it. That’s where health systems most often fall down – and where we still have the biggest opportunity to get this right.

What will ultimately decide whether digital health becomes mainstream or stays marginal?

Before we ask whether digital health will go mainstream, we need to define what “mainstream” means. The original vision was full integration into clinical care: prescribed like medicines, with clear reimbursement and structured pathways. Progress towards that has been slower than many expected.

In response, companies have adapted. Many now operate hybrid models: direct-to-consumer, because people will pay for tools that help them, alongside B2B or B2B2C routes, which are slower but more sustainable. The market is evolving out of necessity.

Ultimately, mainstream adoption comes down to two things: usage and sustainability. If people use digital health tools repeatedly and meaningfully, they become normal. If they sit unopened, they remain marginal.

But even high engagement isn’t enough without a viable revenue or reimbursement model. Real user value, combined with a sustainable commercial pathway, is what will decide whether digital health truly becomes mainstream.

We know that health queries constitute over 5% of all daily ChatGPT messages. Generative AI is becoming a health coach for many – and it's not even certified for medical purposes. How do you see the role of genAI in healthcare?

I’m not surprised that so many ChatGPT conversations are about health. When people are worried or confused, they look for answers – and now the answer talks back. Generative AI is filling a gap. It’s instant, accessible, and non-judgmental, which makes it feel like a health coach to many. It can be brilliant at helping people understand information, translate medical language, or prepare for appointments.

But we must be clear. There’s a difference between support and clinical care. Healthcare isn’t just about helpful answers; it’s about safe, evidence-based decisions. When AI moves into personalised medical advice without regulation or assurance, that’s where we need caution.

The future isn’t about replacing clinicians. It’s about reducing friction, helping people navigate the system, and engaging more confidently in their care. If we’re transparent about what these tools can and can’t do, and put the right guardrails in place, generative AI can strengthen healthcare rather than undermine it.

Will all digital health apps have to become the intuitive, AI-driven coaches that patients now expect, with usability and UX comparable to ChatGPT, Gemini, or Copilot?

No, I don’t think every digital health app needs to become an AI-driven coach. What people value in tools like ChatGPT is the experience. It’s intuitive, conversational, and easy. Some digital health tools are built to do one thing really well. Adding AI in those cases could create complexity rather than value.

That said, expectations have changed. Digital health needs to feel simple, responsive, and human. If it feels clunky, people won’t use it. The real question is whether it genuinely helps someone in a way that feels natural and effortless. If it does that, it will succeed, with or without a chatbot.

It's 2030, and you are asked to analyse what went wrong in digital health from 2020 to 2025. What would your answer be?

Looking back from 2030, the problem wasn’t a lack of innovation. Between 2020 and 2025, innovation was everywhere. What went wrong was our handling of it.

First, we confused emergency adoption with long-term transformation. During COVID, digital tools were rolled out at speed. But when the pressure eased, many systems slipped back into old models. We piloted constantly, but didn’t embed digital into pathways, reimbursement, or workforce design.

Second, we focused more on shiny technology than on infrastructure. AI, apps, and platforms surged ahead, but interoperability, workflow integration, and procurement models lagged behind. Technology can’t scale if it sits outside the system.

Third, we debated standards and regulations but didn’t build the assurance and distribution systems needed for real adoption. Approval didn’t equal uptake.

Finally, we overlooked sustainable business models and behaviour change. Great products without reimbursement didn’t survive. And digital health is as much about cultural change as about technical change. The issue wasn’t that digital health didn’t work; it was that we didn’t build the ecosystem around it quickly enough.

Watch Liz Ashall-Payne’s keynote at the ICT&health World Conference 2026 here.