Joanna Fox MSc
NAPC Council & Faculty Member | Digital Health & Human Factors Lead
Prof Andy Brooks
NAPC Clinical Chair
As explored in a recent conversation between Prof Andy Brooks and digital transformation expert Joanna Fox, digital innovation is rapidly reshaping primary care. From virtual triage to AI‑supported decision‑making, the opportunities are significant. But Jo stresses a crucial point: digital tools don’t guarantee safety. Their value depends on how well we understand them, deploy them, and wrap the right human systems around them.
Jo’s Background and Research Focus
Jo’s work over recent years has centred on patient safety through a human‑factors lens, examining how people, processes, and technology interact across the health ecosystem. Her research highlighted important gaps: many colleagues want to deliver safe, high‑quality care, but awareness, training, and understanding of digital safety frameworks remain inconsistent across primary care settings.
Digital Technology and Human Interaction
Jo emphasises that healthcare is built on human‑to‑human interaction. Digital tools are now deeply embedded in that interaction, but they must support, not disrupt, the relationship. She notes that the ambition to scale digital approaches is clear in national plans, but frontline staff still face big questions about how safe, familiar, and well‑understood these tools really are in practice.
Understanding Frameworks and Standards (e.g., DCB0160)
Digital clinical safety standards, such as DCB0129 for suppliers and DCB0160 for care providers, are mandatory under the Health and Social Care Act 2012. Yet Jo’s research found that only 41% of primary care organisations had assigned the required clinical safety role. Training was also limited: just 58% felt they’d received any formal education on digital safety. These gaps raise real questions about capability, governance, and confidence across the system.
Jo also highlights the overwhelm many feel when faced with DPIAs, NICE guidance, and consultation updates. The resources exist, but they’re not always easy to navigate. Her recommendation: streamline them so they feel meaningful rather than like a tick‑box exercise.
Simplifying Digital Design and Workflow
Jo acknowledges a familiar frustration: “Why can’t I just switch it on like my phone?” But healthcare is one of the most complex systems in the world. When technology doesn’t fit real workflows, staff quickly revert to workarounds – which introduces risk.
She argues strongly for:
- Simpler, co‑designed tools
- Better supplier engagement
- Testing in real work environments
- Ensuring training matches the diversity of the workforce
The goal is to ensure technology adapts to the way people work, not the other way around.
Preparing for Future Technology and Neighbourhood Care
Neighbourhood care is changing where and how professionals deliver care. With staff working across multiple environments, technology must be flexible enough to match these diverse settings. Jo stresses that digital safety shouldn’t sit with a select few: every role, from social prescribers to community health workers, must feel confident using and questioning the tools they rely on. Safety becomes part of the collective culture, not a separate task.
Challenges with Technology Change
Many people assume that tech is harder to change than human processes, and therefore tolerate poor design or outdated tools. Jo challenges this mindset. She believes practices must feel empowered to question suppliers, push for improvements, and avoid slipping into an “us and them” dynamic. Good technology should evolve with the reality of care delivery, not stay frozen at the point of purchase.
Top Safety Risks in Digital Primary Care
Jo highlights three major digital safety risks:
- Weak or absent risk‑management processes. Tools like hazard identification and functional failure analysis exist but aren’t widely used.
- Insufficient real‑world testing. Walking the pathway and simulating scenarios helps uncover issues before go‑live.
- Excluding the patient voice. If patients interact with the technology, they must be part of co‑design and testing.
These risks will intensify as AI and large language models become part of routine care. Jo raises important questions about over‑reliance on AI and the balance between upskilling and de‑skilling the workforce.
Learning from Digital Incidents
Digital incidents are still often funnelled into IT service desks rather than clinical safety reporting. Jo sees this as a missed opportunity for learning. She suggests establishing clearer taxonomies, aligning digital issues with clinical risk processes, and sharing learning beyond local teams to build a national picture. Reporting should lead to improvement, not sit in isolation.
Closing Reflections
Jo concludes with a reminder that digital transformation is ultimately about improving patient care. Confidence, literacy, good governance, and shared learning are just as important as the tools themselves. As AI grows more complex and more embedded in clinical practice, the need to strengthen digital clinical safety becomes even more urgent.
Digital technology is a powerful enabler, but only when people, processes, and systems around it are equipped to use it safely and meaningfully.