Health Data Working Group

KONP supports the use of personal health data in research, service planning and the development of new technologies when these are in the public interest and when ethics and human rights are put first. It also believes that the exploitation of our health data for profiteering should be resisted as actively as the privatisation of our NHS.

The Health Data Working Group

KONP’s Data Working Group was set up in 2022. It initially aimed to provide analysis and a range of materials to support campaigning against the privatisation of personal health data. However, over time our focus has broadened in the face of new concerns.

These include:

  • the increasingly intrusive access to patients’ personal data, whether from medical or social care records or from wearable devices, and a growing disregard for transparency, privacy and consent;
  • the use of patients’ data to develop algorithms that inform ‘clinical’ decisions, potentially undermining clinical judgement and patient safety, deskilling the workforce, and introducing bias and widening inequalities;
  • the establishment of new partnerships with the private sector, allowing companies access to patients’ data by default in return for so-called rewards (see here for more details);
  • the unregulated use of AI, which may subordinate the rights and interests of patients to the commercial interests of technology companies or to the interests of governments in surveillance and social control; and
  • the growing and irreversible dependence of the NHS on Big Tech companies for cloud and compute services, with financial and environmental implications, both national and global.

We therefore call for:

1. Public investment for the development of publicly owned national digital infrastructure, aimed at storing and managing NHS data that is currently hosted by platforms (cloud computing services) owned by large technology companies.

2. An independent audit of ‘partnerships’ already established between the NHS and tech companies, assessing their risks, impacts and extraction of wealth, as well as the possibilities for discontinuing them.

3. A ban on the secondary use of pseudonymised data by other government departments.

4. A ban on the secondary use of NHS patient data by the private sector.

5. No representation of the private sector in bodies responsible for developing policies that impact ‘digital health’ (i.e. digital care programmes and technologies concerned with health care delivery and personalised medicine), given the incompatibility of interests.

6. An independent regulator for data and artificial intelligence (AI) that is primarily focused on the rights of citizens and on the development of a regulatory framework for AI. This framework should recognise that AI applied to health is high risk and should ensure human review of automated decisions. It should also include requirements for independent audit to ensure transparency and accountability, plus measures to prevent racial, gender and institutional biases from shaping the development of algorithms and the training of AI software.

7. A halt to the use of algorithms to make clinical decisions about ‘value’, and thus about where, or on whom, limited state money should be spent.

8. A guarantee of representative participation by NHS and social care staff, service users and social movements in policies related to digital health. This will require appropriate funding for social participation to reverse the democratic deficit in current debates and to foster open forums, free conferences and the like, in addition to critical training in digital health for social movements, councils and health professionals.

9. Public information campaigns about the use, protection and privacy of people’s health data.


Election Update

What plans does Labour have for the NHS and new technologies? A Labour win at the next election raises the chance of a restored NHS. However, much of Labour’s stance on the NHS is discouraging. Here we focus on Labour’s plans for our data and for new technologies.

Labour has put a revolution in technology and artificial intelligence (AI) at the heart of its NHS mission, but much of its plan has already been enacted by the current government, and some of the new ideas may be problematic. Read our full update here.


The Working Group analysis so far


How our data is being used

This webinar, recorded during the passage of the Health and Care Bill, covers how the NHS and its Integrated Care Systems have become dependent on patient data (not least for cost control and rationing); how the government intends to make our health data accessible to the private sector; and an alternative vision for how data might be used for the public good.

Given the use of our data by Integrated Care Systems (ICSs), the Data Working Group developed a series of questions to put to ICSs as Freedom of Information requests, in order to understand how our data is being treated and to challenge this if necessary. See these short reports on the results of using the questions:
Example 1: A case study of South East London ICB
Example 2: Opaque and questionable use of data by North East London ICB

If you want to use or adapt our questions for your own ICS, you can see them here, and there are suggestions on how to use them here.


Useful resources

  • Want to see the doctor: Prepare to cough up your data first. This article describes how GPs now use third-party software for appointments, triaging and the like, with the result that many patients can only be seen if they hand over their personal data to private companies, often owned by Big Tech.
  • For our guide explaining the most commonly used terms in the field of digital and data, along with information about the various organisations dealing with data protection, standards, ethics and so on, see Digital and data: Terminology and relevant bodies.
  • AI for Good: Platforms, ethics and public value. These notes are from a webinar discussion, hosted by the Institute for Innovation and Public Purpose, on AI development as it relates to the UN’s goals, including those for health. The notes cover both the benefits and the downsides of these AI technologies, and how the downsides might be avoided.
  • Amberhawk: a monthly blog on data protection issues and training.
  • This unsettling video by the Financial Times shows how Covid-19 exposed the tension between the need for data to ‘track and trace’ and the right to privacy and justice.

Other campaign groups