Health Data Working Group

KONP supports the use of personal health data in research, service planning and the development of new technologies when these are in the public interest and when ethics and human rights are put first. It also believes that the exploitation of our health data for profiteering should be resisted as actively as the privatisation of our NHS.


NEW
Three new articles on the NHS, ‘Clouds’ and AI

From the Keep Our NHS Public Data group – September 2024

It is now clear that the Labour Party intends to proceed with a ‘Digital NHS’, with all that means in terms of privately provided databases, software, digital wearables, Artificial Intelligence (AI) and Big Tech ‘public clouds’. The Data Working Group has produced three important new papers:

◼︎Part 1: ‘Public clouds’ – what they are, how they work and what they mean for the NHS

Big Tech has cornered the market in ‘public clouds’ and makes huge profits almost invisibly by: renting out use of the ‘cloud’; investing in or buying promising start-up companies; taking sole ownership of, and renting out, jointly produced essential new knowledge; controlling their ‘innovation systems’ in a similar way to franchises; using smaller companies to market their services and work directly with customers; and, all the while, making it difficult for users to leave the ‘public cloud’…

◼︎Part 2: Global implications of the use of Big Tech and their ‘clouds’
Big Tech’s existence depends on a series of ongoing ‘global harms’ that maintain its power and influence: ‘data colonialism’; control at the WTO; the exploitation of miners of rare-earth minerals; the use of heavily polluting manufacturing facilities, currently offshored from the US; excessive use of electrical power and water supplies; and the political support Big Tech receives as a result of i) the economic ‘productivity’ it generates, and ii) the high-tech surveillance and weapons systems it helps to develop to maintain the military authority of the global North.

◼︎Part 3: Digitalisation, Artificial Intelligence (AI) and the NHS workforce
Digitalisation and AI are disrupting the relationship between the NHS and its workforce. For example, AI is enabling increased surveillance of workers and facilitating ‘flexible working’ in the name of productivity, but at the expense of staff privacy and collective bargaining. To protect workers’ rights, we need to support the new legislation proposed by the TUC.


The KONP Health Data Working Group

KONP’s Data Working Group was set up in 2022. It initially aimed to provide analysis and a range of materials to support campaigning against the privatisation of personal health data. However, over time our focus has broadened in the face of new concerns.

These include:

  • the increasingly intrusive access to patients’ personal data, whether from medical or social care records or from wearable devices, and a growing disregard for transparency, privacy and consent.
  • the use of patients’ data to develop algorithms that inform ‘clinical’ decisions, potentially undermining clinical judgement and patient safety, dumbing down the workforce, and introducing bias that increases inequalities.
  • the establishment of new partnerships with the private sector, allowing companies access to patients’ data by default in return for so-called rewards (see here for more details).
  • the unregulated use of AI that may subordinate the rights and interests of patients to the commercial interests of technology companies or the interests of governments in surveillance and social control, and
  • the growing and irreversible dependence of the NHS on Big Tech companies for Cloud and compute services, with financial and environmental implications, both national and global.

We therefore call for:

1. Public investment for the development of publicly owned national digital infrastructure, aimed at storing and managing NHS data that is currently hosted by platforms (cloud computing services) owned by large technology companies.

2. The independent audit of ‘partnerships’ already established between the NHS and tech companies, with assessment of their risks, impact and extraction of wealth, as well as possibilities for discontinuation.

3. A ban on the secondary use of pseudonymised data by other government departments.

4. A ban on the secondary use of NHS patient data by the private sector.

5. No representation of the private sector in bodies responsible for developing policies that impact ‘digital health’ (i.e. digital care programmes and technologies concerned with health care delivery and personalised medicine), given the incompatibility of interests.

6. An independent regulator for data and AI that is primarily focused on the rights of citizens and on the development of a regulatory framework for AI. This should recognise that AI solutions applied to health are high risk, and should ensure human review of automated decisions. The framework should include requirements for independent audit to ensure transparency and accountability, plus measures to prevent racial, gender and institutional biases from shaping the development of algorithms and the training of AI software.

7. A halt to the use of algorithms to make clinical decisions about ‘value’, and so about where, or on whom, limited state money should be spent.

8. A guarantee of representative participation of NHS and social care staff, service users and social movements in policies related to digital health. This will require appropriate funding for social participation to reverse the democratic deficit in current debates and to foster open forums, free conferences and similar events, in addition to critical training in digital health for social movements, councils and health professionals.

9. Public information campaigns about the use, protection and privacy of people’s health data.


The Working Group analysis so far


How our data is being used

This webinar, recorded during the passage of the Health and Care Bill, covers how the NHS and its Integrated Care Systems have become dependent on patient data (not least for cost control and rationing); how the government intends to make our health data accessible to the private sector; and gives an alternative vision for how data might be used for the public good.

Given the use of our data by Integrated Care Systems (ICSs), the Data Working Group developed a series of questions to put to ICSs as Freedom of Information requests, in order to understand how our data is being treated and to challenge this if necessary. See these short reports on the results of using the questions:
Example 1:  A case study of South East London ICB
Example 2:  Opaque and questionable use of data by North East London ICB

If you want to use or adapt our questions for your own ICS, you can see them here, and there are suggestions on how to use them here.


Useful resources

  • Want to see the doctor? Prepare to cough up your data first. This article describes how GPs are now using third-party software for appointments, triage and so on, and that, consequently, many patients can only be seen if they hand over their personal data to private companies, often owned by Big Tech.
  • For our guide explaining the most commonly used terms in the field of digital and data, along with information about the various organisations dealing with data protection, standards, ethics and so on, see Digital and data: Terminology and relevant bodies.
  • AI for Good: Platforms, ethics and public value. These notes are from a webinar discussion, hosted by the Institute for Innovation and Public Purpose, concerned with AI development relevant to the UN’s goals, including those for health. The notes cover the benefits of these AI technologies, but also their downsides and how the latter might be avoided.
  • Amberhawk: a monthly blog on data protection issues and training.
  • This unsettling video from the Financial Times shows how Covid-19 exposed the tension between the need for data to ‘track and trace’ and the right to privacy and justice.

Other campaign groups