Shaping healthcare organisations with AI, science, and emotional intelligence

07 August 2024 Consultancy.uk

With extensive experience in large-scale public sector transformation, Mark Jennings, chief solutions and services officer at analytics and innovation agency, Strasys, is an expert in the interplay between human intuition, scientific data, and AI in decision-making. He explains how healthcare organisations can tap into the technology to supercharge their services.

When it comes to AI, tools, accessibility, understanding and media commentary around the subject have surged in the past few years, despite the principles of AI being established over 60 years ago. Hyperbole aside, AI is now a key part of the strategy consultant’s toolkit, and we recognise it as an enhancement to the profession, not an existential threat.

At Strasys, we use AI alongside human experience and scientific rigour to help health boards make well-informed strategic decisions, challenging them to think differently. We focus on creating clear narratives from the patterns in their data, and our approach hinges on three areas: Understand, Structure and Challenge.


Understanding to build the fullest picture

Consultancy requires getting beneath the skin of an organisation, and we employ AI to deliver qualitative and quantitative research as part of our ‘Understanding’ pillar of work. Using a custom GPT, for example, we can synthesise large volumes of unstructured data. This might mean collating board reports or meeting minutes from across multiple organisations, then auditing how often a subject has been discussed and what proportion of time has been spent on it. Whilst this trend data is valuable, it has its limits, such as being unable to report on tone or context.
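To give a flavour of the auditing idea, the sketch below counts how often a handful of topics appear across a folder of plain-text minutes. It is a deliberately simple keyword count standing in for the custom-GPT synthesis described above, and the folder name, file format and topic list are all illustrative assumptions.

```python
# Illustrative only: a keyword-frequency audit over meeting minutes, standing in
# for the custom-GPT synthesis described in the article. File names and topics
# are hypothetical examples, not Strasys tooling.
from collections import Counter
from pathlib import Path
import re

TOPICS = {
    "workforce": ["workforce", "staffing", "vacancy", "recruitment"],
    "finance": ["budget", "deficit", "cash flow", "savings"],
    "quality": ["safety", "incident", "patient experience"],
}

def audit_minutes(folder: str) -> dict[str, Counter]:
    """Count how often each topic's keywords appear in each minutes file."""
    results: dict[str, Counter] = {}
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8").lower()
        counts = Counter()
        for topic, keywords in TOPICS.items():
            counts[topic] = sum(len(re.findall(re.escape(kw), text)) for kw in keywords)
        results[path.name] = counts
    return results

if __name__ == "__main__":
    for name, counts in audit_minutes("board_minutes").items():
        total = sum(counts.values()) or 1
        shares = {topic: f"{count / total:.0%}" for topic, count in counts.items()}
        print(name, shares)
```

A real pipeline would, of course, need to handle tone and context, which is exactly the limitation noted above.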

Quantitatively, segmentation underpins much of our work, particularly in a health context, as it allows us to explore groups in ways that aid understanding. We use Machine Learning (ML) techniques including K-Means clustering and correlation analysis, which allocate large volumes of entities into a set number of groups according to what they have in common. In a healthcare setting, this could mean examining new starters regardless of their role, or reviewing the home environments of patients.
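For readers unfamiliar with the technique, here is a minimal K-Means sketch. The synthetic features, cluster count and library choice (scikit-learn) are assumptions for illustration, not the segmentation models Strasys builds.

```python
# A minimal sketch of K-Means segmentation on a toy staff dataset.
# Features, distributions and the number of clusters are invented for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical features per staff member: tenure (years), weekly hours, commute (miles)
X = np.column_stack([
    rng.gamma(2.0, 3.0, 500),    # tenure
    rng.normal(37.5, 6.0, 500),  # contracted hours
    rng.gamma(2.0, 5.0, 500),    # commute distance
])

# Standardise so no single feature dominates the distance metric
X_scaled = StandardScaler().fit_transform(X)

# Allocate everyone to a fixed number of groups according to similarity
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_scaled)

for label in range(4):
    segment = X[kmeans.labels_ == label]
    print(f"Segment {label}: n={len(segment)}, mean tenure={segment[:, 0].mean():.1f} yrs")
```

The value comes less from the algorithm itself than from naming and interpreting the resulting groups with the people who know the organisation.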

This proved particularly effective in our work with Alder Hey Children’s Hospital Trust, where we used pioneering segmentation approaches. Alongside the Trust’s board, we analysed the needs, motivations and behaviours of patients and staff. The results of this were crucial to developing the Trust’s Vision 2030 strategy.

Structure: Modelling the impact of decisions

Critically, to understand the running of a hospital, a Trust or an Integrated Care System, we need to look at finance, workforce and activity data within the organisations. After isolating the levers that have particular influence, we can model the impact of future decisions, such as the effect of different funding or staffing changes on cash flow, productivity or quality.
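To make the idea of a lever concrete, the sketch below translates one staffing change into cost and activity consequences. The relationships and coefficients are invented for illustration only and are far simpler than a real operating model.

```python
# A hedged sketch of lever-based scenario modelling: all figures and
# relationships below are hypothetical, not a Strasys or NHS model.
from dataclasses import dataclass

@dataclass
class Scenario:
    extra_staff: int          # change in whole-time equivalents (WTE)
    cost_per_wte: float       # annual cost per WTE, GBP
    activity_per_wte: float   # treated cases per WTE per year

def model_impact(baseline_budget: float, baseline_activity: float, s: Scenario) -> dict:
    """Roughly translate a staffing lever into cost and activity consequences."""
    extra_cost = s.extra_staff * s.cost_per_wte
    extra_activity = s.extra_staff * s.activity_per_wte
    return {
        "new_budget": baseline_budget + extra_cost,
        "new_activity": baseline_activity + extra_activity,
        "cost_per_case": (baseline_budget + extra_cost) / (baseline_activity + extra_activity),
    }

print(model_impact(
    baseline_budget=120_000_000, baseline_activity=400_000,
    s=Scenario(extra_staff=50, cost_per_wte=55_000, activity_per_wte=600),
))
```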

Post-pandemic, a lot of experienced people left the NHS and many new people were hired. We all know experience counts for (proportionately) more than sheer headcount in a workforce. But it is hard to understand exactly what impact these recent changes have had on productivity until it has been modelled.

AI is normally used for prediction, but only in very specific and narrow constructs with lots of data and very short timescales. AI predictions decay very rapidly in complex systems. This is why AI-driven quant trading, like RenTech’s Medallion Fund, only holds positions for a couple of days at a time, at most.

We’ve tried some predictive AI in interventions, exploring which employees are at risk of leaving and where interventions could be personalised. However, our conclusion is that it is not worth it: at an individual level, the correlations between data and decisions are too weak to be helpful.

We can segment the workforce into logical, explainable groups with recognisable attitudes, needs and behaviours; from that, we help organisations design effective interventions. This has far more value than trying to ‘overfit’ predictive AI.

The same applies to trying to predict clinical negligence costs at the level of a single activity. Even where it is possible to pinpoint, more sophisticated AI generally means poorer explainability, which is no good for an organisation that needs to drive interventions. You cannot positively affect something whose root causes you do not understand.

This is why, at Strasys, we believe it is essential to have health experts under our roof, who provide a level of understanding and experience and can contextualise why something is happening.

Challenging clients to think differently

For prediction over meaningful periods of time we use System Dynamics (SD). Famously applied in ‘The Limits to Growth’, the pioneering 1972 report co-authored by Donella Meadows, SD uses data and contextual understanding to create a model that can approximately, though not perfectly, replay a period of history for the modelled organisation or system. The model can be built on one set of data and tested on another to identify meaningful causation and consequences. It allows us to create an ‘adaptive strategy’ that covers a multi-variant future.
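A toy stock-and-flow model gives a flavour of the approach: a single workforce ‘stock’ with a hiring inflow and an attrition outflow, stepped forward month by month. The rates and starting values below are invented, and a real SD model would be calibrated on one period of history and tested against another, as described above.

```python
# A minimal stock-and-flow sketch in the spirit of System Dynamics.
# All rates and initial values are hypothetical illustrations.
def simulate_workforce(months: int = 36, staff: float = 3000.0,
                       hires_per_month: float = 40.0,
                       attrition_rate: float = 0.015) -> list[float]:
    """Euler-style simulation: stock(t+1) = stock(t) + inflow - outflow."""
    trajectory = [staff]
    for _ in range(months):
        leavers = staff * attrition_rate      # outflow proportional to the stock
        staff = staff + hires_per_month - leavers
        trajectory.append(staff)
    return trajectory

# Compare a 'do nothing' future with a retention-focused one (lower attrition)
baseline = simulate_workforce()
retention_scenario = simulate_workforce(attrition_rate=0.010)
print(f"Baseline after 3 years: {baseline[-1]:.0f} staff")
print(f"Retention scenario after 3 years: {retention_scenario[-1]:.0f} staff")
```

Even a sketch this small shows the point of an adaptive strategy: the same structure can be replayed under many futures, rather than betting on a single forecast.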

It also means we can find ways for our clients to think differently and look further ahead towards a positive legacy, even while they are dealing with urgent challenges in the present, such as budget cuts and staff shortages.

Our client Alder Hey found that almost one in three employees were on the move over a typical year, adversely affecting productivity. By combining qualitative and quantitative data with demographics, the workforce could be examined in a different way: we could understand what mattered to team members, rather than focusing on the specifics of job roles. Making decisions based on these insights led to a 5% reduction in staff turnover at Alder Hey, and the organisation now tops acute hospital Trusts in the North West, achieving a 72% approval rating from staff (versus the national average of 61%).

We know a typical Trust spends at least 60p in the £1 on people costs but less than 2p in the £1 on training and development. It’s not about shaving off a few per cent from a quarterly budget. The key is thinking bigger, creating something better than we have now, something sustainable.

It’s not possible to perfectly replicate a very complex system through modelling alone, but modelling can be used to identify meaningful causation and consequences. What we do is fundamentally human and, above all, our output must be explainable. That means we use function-specific AI in the process, but never in the outputs.

As Einstein famously said, it should be “as simple as possible, but not simpler”.