
UK CDEI publishes review of bias in algorithmic decision-making

November 27, 2020, Gowling WLG

The UK’s Centre for Data Ethics and Innovation (CDEI) has published its Review into bias in algorithmic decision-making, calling for “active steps to anticipate risks and measure outcomes” and highlighting “the urgent need for the world to do better in using algorithms in the right way: to promote fairness, not undermine it”.

The CDEI notes that an “explosion in the volumes of available data” and “the increasing sophistication and accessibility of machine learning algorithms” may enable unprecedented opportunities to detect and measure bias. But it also recognises that bias may worsen, noting that “new forms of decision-making have surfaced numerous examples where algorithms have entrenched or amplified historic biases; or even created new forms of bias or unfairness.”

The key overarching recommendations are that:

  • “Organisational leaders need to be clear that they retain accountability for decisions made by their organisations, regardless of whether an algorithm or a team of humans is making those decisions on a day-to-day basis.”
  • “Senior decision-makers in organisations need to engage with understanding the trade-offs inherent in introducing an algorithm. They should expect and demand sufficient explainability of how an algorithm works so that they can make informed decisions on how to balance risks and opportunities as they deploy it into a decision-making process.”
  • “Regulators and industry bodies need to work together with wider society to agree best practice within their industry and establish … clear standards for anticipating and monitoring bias, for auditing algorithms and for addressing problems.”
  • “Fundamental decisions about what is fair cannot be left to data scientists alone. They are decisions that can only be truly legitimate if society agrees and accepts them.”

In short, tackling bias in AI requires action from companies’ leaders and senior decision-makers, from regulators and from Parliament.

For a discussion of the Review’s findings on the thorny issue of AI, bias and recruitment, see this post by my colleague and employment law expert, Jonathan Chamberlain.

About the author(s)

Gowling WLG

Gowling WLG is an international law firm operating across an array of different sectors and services. Our LoupedIn blog aims to give readers industry insight, technical knowledge and thoughtful observations on the legal landscape and beyond.

Filed Under: AI, Diversity, Public Law & Regulation Tagged With: Bias

Views expressed in this blog do not necessarily reflect those of Gowling WLG.
