
by Georgia van Toorn, Joanna Redden, Lina Dencik and Jess Brand
26th June 2024

#standagainstpoverty manifesto audit

This article is part of a blog series published in partnership with Academics Stand Against Poverty UK, as they develop their third manifesto audit in the build-up to the 2024 election. They will analyse the policies in the manifestos in relation to poverty to assess how confident they are that these policies will enable British society to flourish.

 

With Labour holding a solid lead in the polls, Keir Starmer’s party looks poised to win government following the 4th July election.

While Labour has been cautious about committing to any radical plans for transformation, one area where it sees potential for change is in the continued advancement of artificial intelligence (AI) to address important challenges, including the rise of poverty in the UK. Labour says it will tackle poverty in part by leveraging the power of AI to ‘break down the barriers to opportunity’ and ‘radically transform our public services’, ensuring they deliver for those in need. But confronting issues like child poverty and entrenched social insecurity demands more from the government than blind faith in AI. There is now an overwhelming body of research demonstrating how the growing reliance on AI and automated systems has increased poverty, inequity, discrimination and marginalisation. Reducing poverty and increasing security for all requires fundamentally rethinking how we organise society and the role of technology within it.

The Labour Party manifesto shows little sign of such visionary thinking. On the one hand, it recycles familiar rhetoric about tackling poverty, promising that under Labour ‘there will be no return to austerity’. On the other hand, it introduces a focus on harnessing new technologies for the public good. For example, Labour intends to establish a National Data Library to help deliver its vision for ‘data-driven public services’. Its strategy for reducing barriers to opportunity relies on increased data sharing across services, including a proposal to assign children a ‘single unique identifier’ to ensure they remain visible within child and family services, preventing them from slipping through the gaps. Elsewhere, Labour has publicised plans to continue AI’s advancement in the public sector by expanding ‘smart procurement’, making services ‘fit for the digital age’. While these plans indicate substantial transformations in public services, there is no detailed discussion of how these changes will specifically address poverty.

Instead, Labour’s platform suggests a continued commitment to AI hype and technological solutionism, rhetoric that has been prominent across successive governments, starting under Blair and intensifying in the wake of Conservative and Liberal Democrat austerity measures as a way to rationalise cuts and the retrenchment of public services. The COVID-19 pandemic highlighted how prevalent the turn to technology as a response to social issues has become – from health to education to work and beyond, with Big Tech playing a role arguably on par with the state. Of course, under Sunak, the UK government’s love affair with Silicon Valley has only increased, well captured in his continued efforts to get major tech corporations to open UK-based offices.

While Labour should be reversing this trend, Starmer seems intent on continuing down the same path, with plans for ‘diffusing’ AI across the economy and the public sector. However, this potentially bypasses significant concerns about the longer-term consequences of a public sector that relies on typically privately owned and unaccountable computational infrastructures over which the government has limited control. It also ignores the now extensive evidence of the multiple ways in which AI and automation are already entrenching inequalities and harming the very people Labour sets out to help.

For instance, data from the Department for Work and Pensions (DWP) shows that for families, transitioning to a single-parent household drastically increases the likelihood of entering poverty. In recent years, these very families have borne the brunt of changes to the universal credit system involving the automation of processes for assessing entitlement and calculating payments. Research shows that these changes have often resulted in delayed or incorrect payments, exacerbating financial instability for single-parent households already at risk of poverty.

Living with a disability also places individuals and families at risk of poverty. Work by the UK Social Metrics Commission shows that of the 14.9 million people living in poverty in 2021–22, 8.6 million of them, or 58 per cent, were living in families that included a person with a disability. And yet, the DWP allocates significant resources to surveilling recipients of disability benefits, using algorithmic data-matching tools to identify potential fraud. Disability groups say these tools inflict extreme psychological harm and financial hardship on people who rely on disability benefits for basic needs. There has even been legal action against the DWP over its use of AI to target disabled people for fraud investigation.

Often those most at risk of being harmed by public sector uses of AI and automated systems are also the ones most exposed to surveillance and automation in other spheres, whether by being managed by algorithms in precarious and low-quality gig work, facing unfair dismissals, or having their capacity for collective organisation to protect rights and conditions curtailed. Automation also exerts a disproportionate impact on migrants or ‘migratised’ people, who live with the continuous fear of how data sharing between different entities – landlords, schools, GPs and the police – might suddenly affect their immigration status. Advancements in AI therefore tend to compound harms across different spheres, entrenching inequalities and exclusion.

Labour cannot continue to advance the hype of Silicon Valley and Big Tech in the face of mounting evidence of real-world harms caused by applications of AI and automated systems. Instead, it needs to actively engage with the voices calling for a different deal on technology than the one we are currently offered. There are a range of ideas about what this might involve. For example, the think tank Common Wealth has a range of proposals under the rubric of ‘Democratic Digital Infrastructure’, including the establishment of a British Digital Co-operative that would create a surveillance-free and publicly owned platform architecture, and the creation of an online platform called WeDecide.gov.uk to enable accountability and democratic control of digital infrastructures. These kinds of proposals invert and challenge the tech solutionist love affair between Silicon Valley and UK public service delivery. When it comes to democratisation, the Labour manifesto includes tentative pledges to give local communities more say in, for instance, local bus routes and football governance, and workers more input into how workplaces are run (for example Royal Mail). However, this is a missed opportunity to give communities a genuine say in how local social services use their data and how digital welfare systems are governed.

A wide range of researchers and civil society organisations stress the need to increase transparency, accountability and public participation in decision-making about AI and automation. This could involve:

  • investing in government and business AI registers to ensure a centralised and verified space to hold publicly available system information;
  • privacy commissioners resourced to proactively investigate algorithmic systems;
  • mandated public equity impact assessments of AI and automated decision support systems, ensuring that assessments attend to the history of similar systems and how previous failures are being addressed.

Those proposing new systems that will inevitably impact individuals’ life chances or the wellbeing of communities and society at large must bear the responsibility to demonstrate their accuracy, effectiveness and legality. Ensuring the responsible use of technology and preventing harm will require recognising that communities have the right to refuse the AI applications that will affect them, and that these rights can only be exercised through meaningful and informed public consultation and debate.

There needs to be a recognition that AI and automation do not provide immediate solutions to poverty. The problem of poverty is a structural one related to the distribution of wealth, income, opportunities and power across society, and technology tends to reflect and reproduce these patterns. There is no easy tech fix. Tackling poverty will require the new government to prioritise structural changes. As a start, it could listen to and heed the voices of frontline professionals and anti-poverty advocates who have concrete suggestions for change, none of which are about AI. If Labour is seriously committed to alleviating poverty, it needs to prioritise structural reforms that emphasise equity and social justice rather than relying on technological solutions alone.

Georgia van Toorn is a Lecturer in Public Policy and Politics at the University of New South Wales (Sydney) and an Associate Investigator at the ARC Centre for Automated Decision-Making and Society (Australia). Joanna Redden is an Associate Professor at Western University in Canada. She co-directs the Data Justice Lab (UK) and the Starling Centre (Canada). Lina Dencik is Professor and University Research Leader in AI Justice at Goldsmiths, University of London, UK, and Co-Director of the Data Justice Lab. Jess Brand is a PhD student at the University of Bristol’s Centre for Sociodigital Futures and a research assistant at Goldsmiths, University of London.

 

Read all the articles in the Academics Stand Against Poverty blog series here.

Follow Transforming Society so we can let you know when new articles publish.

The views and opinions expressed on this blog site are solely those of the original blog post authors and other contributors. These views and opinions do not necessarily represent those of the Policy Press and/or any/all contributors to this site.

Image credit: ev via Unsplash