Human Rights in Digital Health: AI for Good?

[Sara (Meg) Davis is Professor of Digital Health and Rights at the University of Warwick and an Associated Researcher at the Geneva Graduate Institute.]

In drafting her report to the UN Human Rights Council, the Special Rapporteur drew in part on research conducted by a consultant, Patty Skuster, whom the author supervised at the Geneva Graduate Institute.

At a critical moment in the global digital transformation, with media and world leaders debating whether artificial intelligence (AI) might pose a future existential threat to humans, are we talking enough about the harms already being done in the present? A new thematic report to the UN Human Rights Council from the UN Special Rapporteur on the Right of Everyone to the Highest Attainable Standard of Physical and Mental Health, Dr. Tlaleng Mofokeng, highlights the real-world threats we face in the ongoing spread of digital technologies in global health, some real benefits, and the urgent need for a broader public conversation about human rights and health in the digital age.  

Amid the urgency created by spikes in deaths and illness, the pandemic drove a rapid spread of technological solutions for health services: AI-driven remote diagnosis of COVID-19; remote medical consultations in place of shuttered clinics; and the widespread deployment of rapidly improvised mobile phone apps that screened for vaccination status and monitored the movements of those under quarantine. Many states jettisoned rights protections, using wartime metaphors to justify expansive surveillance.

The digital turn is also actively promoted through the World Health Organization’s Global Strategy on Digital Health 2020-2025 and by many bilateral and philanthropic donors to global health programs. However, government officials have struggled to keep up with and comprehend, let alone effectively govern, these proliferating new tools, the data they consume and produce, or the algorithms that drive them. Gaping inequities are appearing in the digital transformation. In framing these issues in the context of the right to health, the Special Rapporteur’s new report takes an important step towards establishing a normative framework for respecting, protecting and fulfilling the right to health in the digital age.

There have been numerous UN human rights reports and resolutions on technology, including several emphasizing the right to privacy. The new report from Dr. Tlaleng does not ignore privacy, but considers it as one of several equally important issues. In keeping with her mandate, the Special Rapporteur’s report considers the factors that, according to the Committee on Economic, Social and Cultural Rights (CESCR), comprise the right to the highest attainable standard of health: availability, accessibility, acceptability and quality (AAAQ) of facilities, goods and services. The report also considers the right to sexual and reproductive health, and the risk of perpetuating racism, sexism, ableism or discrimination based on sexual orientation or gender identity; and it clarifies legal obligations under the right to health for both states and private actors.

Quality

New technologies offer the potential to dramatically reshape the quality of health services for good. For example, through the ability to absorb and analyze large quantities of data, AI offers transformative possibilities for tasks ranging from the diagnosis of malignant tumors, to improved precision in surgery, to predicting population health, among many others. However, AI can only learn from the data that already exists, and it has tended to be hampered by significant gaps in data on women and minoritized populations, as well as by faulty assumptions on the part of designers. For example, cost-effectiveness software used in the HIV response to aid decision-making and priority-setting has already been troubled by gaps in data on stigmatized and criminalized key populations, who are left uncounted in official health data.

New advances in quality control are also needed to ensure that new technologies have sufficient scientific evidence of positive impact before they are rolled out. The COVID-19-era fever for digital contact tracing apps is a case in point: early in the pandemic, states embraced the hope that digital contact tracing would control or even end the pandemic, whatever the potential harms to the right to privacy, data protection and public trust.

COVID-19 contact tracing apps were thus developed and launched in many countries at unprecedented speed, through new public-private partnerships, often without adequate governance, coordination or support. Sean McDonald warned that this attempt to substitute slapdash tech solutionism for long-term investment in health systems could turn out to be technology theatre. In the end, manual contact tracing by healthcare workers may have been more effective than the digital variety, while the contact tracing apps often languished with low rates of public uptake. As a physician herself, Dr. Tlaleng rightly underscores in her report the importance of human contact in primary care.

We may never know, at either national or global scales, exactly how much public funding was spent on the development and marketing of digital contact tracing apps, let alone the related costs of training and infrastructure. How much of the data gathered by these apps actually went to inform health systems, and how much went to private third parties? Studies have found that even when users opt out of data-sharing by some health apps, the apps may not respect that choice.

Without more robust approaches to ensuring the quality of digital technologies and effective protection of health data, there is a real risk that the rapid digital transformation will both lead to an epic waste of public funds and further fracture the public trust that is critically needed in public health emergencies.

Inequalities and discrimination in the digital transformation could also contribute to this fracturing.

Availability, Accessibility, Acceptability and Intersectionality

Of the diverse issues addressed by Dr. Tlaleng in the report, one of the most important and urgent relates to non-discrimination and availability, accessibility and acceptability of digital technologies. 

As she notes, every society faces diverse digital divides, with some individuals unable to access health information or services through digital platforms and tools due to lack of education, lower economic status, lack of internet access, and numerous other barriers. In many cases, these gaps build on existing widespread forms of discrimination and exclusion, including racism, ableism, and discrimination against ethnic or linguistic minorities, among others. The report notes that a “digital gender divide” in many countries results in “less access to, use of, and ability to influence, contribute to and benefit from information and communication technologies” for women and girls. It describes numerous physical, economic and other barriers to inclusion in the digital transformation.

In some cases, these diverse forms of inequality intersect to deepen exclusion for the most marginalized, who may also be those most in need of health information and services. In Ghana, as part of a multi-country study by the Digital Health and Rights Project, I spoke with a group of female sex workers who described sharing a smartphone with a friend, as they could not afford phones individually, and living in constant fear of losing that shared phone, essential for reaching clients, to police raids. They were also unable to access essential sexual health information and services online through their phones, because the information was in English, and they knew only Ga, a local Ghanaian language.

The millions of dollars spent on digital health for HIV prevention, treatment and care by international and national agencies in Ghana are unlikely to reach these women and many of their peers unless their specific needs and intersecting barriers are taken into account in planning. For all these challenges, young people most vulnerable to HIV now live on their phones and on social media, and public health needs to go where they are.

Meaningful Participation and a Rights-based Approach

The General Comment on the Right to Health calls on states to analyze the needs of diverse groups and address them in national health plans. However, while WHO has encouraged national governments to develop much-needed digital health strategies, these have largely been developed by governments working in collaboration with UN agencies, donor agencies, and the private sector, without civil society or community input. Thus, they miss specific needs and concerns like those of the sex workers I met in Ghana, and are flimsier tools for planning and coordination as a result.

Dr. Tlaleng’s report calls for a rights-based approach to digital health governance that ensures the meaningful participation of civil society and communities. If this were to happen, it would be a major paradigm shift, as the private sector currently dominates digital health governance almost entirely.

In future reports on digital health and human rights, the role of the private sector deserves specific and more in-depth attention. As Dr. Tlaleng’s report rightly notes, digital platforms have created new spaces for information-sharing and connection for marginalized groups, and have enabled access to sexual and reproductive health information that could otherwise be prohibited or inaccessible in person. However, this is a de facto privatization of public space, and a de facto privatization of health information: Google now holds a near-total monopoly on searches for self-diagnosis and self-care.

Philip Alston, the former Special Rapporteur on extreme poverty and human rights, warned that digitization might become a Trojan Horse for the privatization of many public services. In the digital transformation of health, this creates new challenges for human rights. It is essential for states to foster local voices in the innovation, ownership and governance of digital health (Silicon Valley-based companies are unlikely to spend time and money developing sexual and reproductive health information in the Ga language for sex workers, for example).

Amid the current heightened public anxiety over the future role of AI, we may miss the chance, and the urgent need, to reshape how the digital transformation is already affecting our individual relationships with our own bodies, as well as with the body politic that represents and governs nations. This new report is a first step in that direction.

The report by the UN Special Rapporteur on the Right to Health will be the subject of a hybrid UN Human Rights Council side event, “Digital innovation, new technologies and the right to health,” hosted by the Geneva Graduate Institute on 23 June 2023 from 13:00 – 14:30 CET.
