Too much information?!

9/27/2022
By Judge Abigail Holt (Tribunal Judge in the Immigration and Asylum and Employment Tribunals in Manchester, UK and Barrister at Garden Court Chambers, London)


As judges and lawyers, I suspect we often feel as though we are drowning in information. I am sure colleagues constantly feel tyrannised by several email accounts, each with groaning in-boxes; revisions to court administration; continuous case-law and legal-development updates; text and WhatsApp messages; news media updates; skim-read academic articles filed under “that-might-come-in-handy”; and all this before we consider the “sacred” information in our cases, that which has the status of evidence.


Against this background, the UN Day for Universal Access to Information was created to recall and celebrate the importance of information as the life-blood of democracy. Information is also, of course, the currency of science and the bedrock of justice. The current tragedy of the Taliban effectively banning teenage girls from school in Afghanistan, thus blocking their access to information and to skills in managing information[1], is a stark reminder of the fundamental importance of information to personal and career development, to wellbeing and to the health of the body politic. “The denial of education violates the human rights of women and girls”, the UN High Commissioner for Human Rights has said. “Beyond their equal right to education, it leaves [Afghan girls] more exposed to violence, poverty and exploitation.”[2]


It is said that information is power, and that disinformation is an abuse of power. This is particularly so in relation to access to governmental information, where oversight and accountability are crucial to respect for the rule of law, and in relation to which the rise of e-governance and artificial intelligence generates massive challenges for judges, who are already relentlessly fire-hosed by information as it is.

The information revolution that we are living through is particularly unsettling because it has a “wild west” quality: regulators are struggling to keep up with developments. The exponential rise of data-based computational systems challenges the values and ethics underpinning our legal systems, particularly because many such electronic systems are built on analysing patterns in data and information in order to predict outcomes: various forms of modelling, analogous to epidemiology. Regulators are rising to the challenge, for example the European Union’s proposed Artificial Intelligence Act[3] and the UK Law Commission consultation on digital assets[4], but computational technology is generating huge problems, as well as opportunities, for democracy and the rule of law.


Law and justice are based on precedent and so look back at what has gone before. The law “thus accumulates, but it never passes; at any instant, it represents a totality. It is by definition complete, yet its completeness does not preclude change”[5]. In contrast, not only is the law grappling with the myriad changed behaviours facilitated by technology, but computer-science techniques often look forwards, analysing vast datasets at speed and spotting patterns, resulting in data-based predictions. These data-based “decisions” are thus increasingly embedded in our world, with traditional legal techniques and accountability struggling to access what is going on “behind the scenes” or “under the bonnet”. Whilst lawtech is a new and mushrooming profession, few senior lawyers or judges know much about coding or computer science. Perhaps more worryingly, it can only be imagined that computer scientists’ knowledge of law is as hazy as that of the population at large, yet such IT professionals are increasingly contracted to build the systems through which we navigate the world, including the political systems underpinning democracy.


Myriad issues flow from this, including, firstly, that artificial intelligence (AI) is not artificial, and it is often not intelligent[6]. Computer systems rely on massive inputs of energy and materials to run. For example, bitcoin “mining” consumes huge environmental resources[7]; the rare metals used in computer hardware are extracted by industries often operating under regimes not famous for their respect for workers’ or human rights; and many data systems in fact rely on humans carrying out mindless, repetitive tasks as data-inputters, or to correct or to “help” the “artificial” part of the systems to function.


Further, whilst prima facie data-recognition systems appear to be neutral, their algorithms can be set, consciously or subconsciously, in ways that result in discrimination and thus embed biases. At a basic level, because AI focusses on classification, “data points” such as gender, race and sexuality are at risk of being treated as natural, fixed and detectable biological categories, and a machine can then be programmed to assign, say, gender identities or race characteristics. Consequently, artificial intelligence is not “an objective, universal or neutral computational technique that makes determinations without human direction”. As Kate Crawford has written of AI, “Its systems are embedded in social, political, cultural and economic worlds, shaped by humans, institutions, and imperatives that determine what they do and how they do it”, and so there is a huge risk that AI systems are designed to discriminate, amplify hierarchies and encode narrow classifications. Thus AI and the “fruits” of computational science are tools of power, and therefore liable to be “weaponised” against populations by those unconstrained by, or ignorant of, rule-of-law considerations, at a time when many would be sceptical about the AI industry’s ability to regulate itself. Against this background, the role of AI in justice, and as a tool to replace judicial decision-making (where thought-leaders such as Professor Richard Susskind[8] have said that the use of such technologies is inevitable), is particularly worrying unless anxious scrutiny is given to such systems, particularly in the context of austerity policies and programmes aimed at reducing financial investment in state institutions.


Nonetheless, evolving AI and computer-science technologies also bring opportunities which could be harnessed by the justice system: not just for enhanced commerce, smoother international trade or the setting of global dispute-resolution standards in the most sophisticated legal systems, which must be crucial for effective regulation of global warming, but also for individual court-user citizens, particularly the most vulnerable, who struggle to get access to justice. For example, during the first lockdown I followed with interest a fantastic volunteer project at Suffolk University, Boston, Massachusetts, Closing the COVID-19 Justice Gap (suffolk.edu), where lawtech students and academics teamed up and collaborated with the local judiciary, social workers and others in the justice sector to create simple, self-populating questionnaires and similar tools allowing access to the courts and judges when the court doors were literally locked. The focus on domestic violence and housing eviction cases benefitted women in particular: protecting them from domestic violence; keeping tabs on partners who had access to their children and who might not have returned them had the courts not continued to supervise them; and protecting individuals from being rendered homeless by ruthless landlords when the pandemic adversely affected their ability to keep up with their rent.


As “The Times” journalist Caitlin Moran has written, humans are a “can do” species. In my experience, members of the IAWJ are blessed with copious “can do” genes. The consequences of information technology and artificial intelligence, and their effects on human behaviour, are issues that we are already facing in our cases; but judicial expertise needs to be part of the moves to regulate artificial intelligence and to embed legal and ethical norms within such technologies, particularly when computational techniques become part of the toolkit of the administration of justice. “As science begins to change the social world, great transformations of factual inquiry lie ahead for all justice systems.”[9]


[1] “Taliban ban girls from secondary education in Afghanistan”, The Guardian

[2] “‘Is it a crime to study?’: outcry as Taliban bar girls from secondary schools”, The Guardian

[3] The Artificial Intelligence Act (European Union, proposed)

[4] Law Commission consultation on digital assets

[5] Carol Greenhouse, “Just in Time: Temporality and the Cultural Legitimation of Law”, The Yale Law Journal, Vol. 98, p. 1631, at p. 1640

[6] Kate Crawford, Atlas of AI: Power, Politics and the Planetary Costs of Artificial Intelligence (Yale University Press, 2021)

[7] Ethan Lou, “Bitcoin as big oil: the next big environmental fight?”, The Guardian

[8] Richard Susskind, Online Courts and the Future of Justice (Oxford University Press)

[9] Prof Mirjan Damaška, quoted by Caroline Foster in Science and the Precautionary Principle in International Courts and Tribunals (Cambridge University Press)