This blog post was published under the 2015-2024 Conservative Administration

https://healthtech.blog.gov.uk/2019/03/21/all-the-things-we-like-about-for-the-cdei-strategy/

All the things we like about the Centre for Data Ethics and Innovation’s new strategy

Categories: Collaboration, Data, Ethics, Healthtech innovation
[Image: a networked brain. Image credit: @KSSAHSN]

The Centre for Data Ethics and Innovation (CDEI) is an independent, expert-led body set up by the UK Government to advise on how we “maximise the benefits of data and AI for our society and economy.” It has just launched its first 2-year strategy.

We at NHSX think it’s really good.

What is the strategy?

The aim of the strategy is “to help the UK become a global leader in responsible innovation in data-driven technology that benefits society as a whole”. This is not an easy ask, but that’s what makes it appealing. Being bold and ambitious makes us more likely to be open to challenge, new ideas and new ways of thinking.

Governance as an opportunity, not a barrier

The strategy states that the Centre for Data Ethics and Innovation will “work to ensure governance of data-driven technology can safely support rapid developments in the technologies and their applications.”

This emphasis on supporting the deployment of new tech is really important. Done well, good governance supports rather than stifles innovation. Over time it might be necessary to add new layers of governance, but this needs to be done in a way that’s proportionate to risk. CDEI’s approach means that what ‘right’ looks like will be designed in a way that is rigorous, agile and resilient to the changing technology market.

Doing good rather than just avoiding harm

Most of the time when the governance of technology is discussed, it’s purely about avoiding harm. This is fine, but we can do better. The commitment to doing good rather than just avoiding harm rings through clearly in the Centre’s strategy, where it says that it will “seek to build a policy and governance environment that enables data-driven technology to improve people’s lives.”

Just like it says in our Tech Vision, it’s not just about getting the current systems to work better: our ambition is to make the best technology available for the NHS and social care sector. As the Secretary of State, Matt Hancock, said at the beginning of this week: “We care about the best technology in healthcare, because we believe in the power of technology to make human lives better. To care about technology is to care about people.”

Keeping society involved, not just informed

Most importantly, CDEI states that it will “ensure the public’s views inform the governance of data-driven technology,” and that it will work with members of the public to understand their priorities, hopes and fears around the use and development of data-driven technology, to give everyone a say.

This frames people and their preferences as part of the solution: it gives them ownership over the blueprint and keeps them actively involved from the beginning, rather than just telling them what is going on. In other words, it gives them a real chance to feed into and change the solution.

It all demonstrates respect for people and for their autonomy, and we at NHSX couldn’t support it more. It’s why the X in NHSX stands for user experience, why we have user need and inclusion as two of our guiding principles in the Tech Vision, and why we’re working hard to ensure that our Code of conduct for data-driven health and care technology is a vehicle for cutting through the AI hype and focusing on delivering collectively agreed outcomes, in a way that clearly takes into account the potential implications (good and bad) before a technology solution is launched.

But wait, the strategy doesn’t mention health, social care or the NHS?

It is true that the CDEI strategy will initially focus on identifying algorithmic bias in crime and justice, financial services, recruitment and local government, rather than health, social care or the NHS. However, there are two reasons why this work still matters to health:

  1. Health is not confined to the realm of medicine. Anybody who has ever been treated unfairly at work, been denied access to the services they need, or been rejected for a job they thought they really deserved will know that these things can make you feel just as bad as getting physically sick. Protecting people from algorithmic harms in crime and justice, financial services and so on will protect their health too.
  2. We are firm believers in the importance and power of open standards. There are currently no agreed standard definitions of fairness, accountability or transparency. This makes it hard to agree what good looks like for ‘ethical machine learning’ in healthcare or in any other sector. The Centre for Data Ethics and Innovation is led by world leaders in this field, and we are sure that the standards and guidance they produce through their study of algorithmic bias in their chosen fields will unearth best practice that we will want to learn from, not compete with.

That’s why the final point in the strategy we want to applaud is its commitment to collaboration and to fostering effective partnerships between civil society, government, academia and industry. If we want to reach a mutually agreed conclusion, we all need to be speaking the same language, and (shocking to say in the context of tech, I know) we probably all need to get together and talk it through.

1 comment

  1. Comment by Dr Ben Rusholme posted on

    Completely agree that algorithmic bias in crime and justice, financial services, recruitment and local government impact upon health.
