The KONP Health Data Working Group
KONP’s Data Working Group, set up in 2022, aims to provide analysis and a range of materials to support campaigning against the privatisation of personal health data.
Over time our focus has broadened in the face of new concerns.
These include:
- the increasingly intrusive access to patients’ personal data, whether from medical or social care records or from wearable devices, and a growing disregard for transparency, privacy and consent.
- the use of patients’ data to develop algorithms that inform ‘clinical’ decisions, potentially undermining clinical judgement and patient safety, dumbing down the workforce, and introducing bias that increases inequalities.
- the establishment of new partnerships with the private sector, allowing companies access to patients’ data by default in return for so-called rewards.
- the unregulated use of AI that may subordinate the rights and interests of patients to the commercial interests of technology companies or to the interests of governments in surveillance and social control, and
- the growing and irreversible dependence of the NHS on Big Tech companies for cloud and compute services, with financial and environmental implications, both national and global.
We therefore call for:
1. Public investment in the development of publicly owned national digital infrastructure to store and manage NHS data currently hosted on platforms (cloud computing services) owned by large technology companies.
2. An independent audit of ‘partnerships’ already established between the NHS and tech companies, assessing their risks, impact and extraction of wealth, as well as the possibilities for discontinuing them.
3. A ban on the secondary use of pseudonymised data by other government departments.
4. A ban on the secondary use of NHS patient data by the private sector.
5. No representation of the private sector in bodies responsible for developing policies that affect ‘digital health’ (i.e. digital care programmes and technologies concerned with health care delivery and personalised medicine), given the incompatibility of interests.
6. An independent regulator for data and artificial intelligence (AI), primarily focused on the rights of citizens and on the development of a regulatory framework for AI. This framework should recognise that AI solutions applied to health are high risk, and should therefore ensure human review of automated decisions. It should also include requirements for independent audit to ensure transparency and accountability, plus measures to prevent racial, gender and institutional biases from shaping the development of algorithms and the training of AI software.
7. A halt to the use of algorithms to make clinical decisions about ‘value’, and hence about where, or on whom, limited state money should be spent.
8. A guarantee of representative participation by NHS and social care staff, service users and social movements in policies related to digital health. This will require appropriate funding for social participation to reverse the democratic deficit in current debates and to foster open forums, free conferences and similar events, as well as critical training in digital health for social movements, councils and health professionals.
9. Public information campaigns about the use of people’s health data and rights to protection and privacy.