LoupedIn

A conversation on the future regulation of AI

July 29, 2020, Gowling WLG


On 22 June 2020, the British Institute of International and Comparative Law (BIICL) hosted a webinar on the future regulation of artificial intelligence. The event was led by Lord Clement-Jones CBE – former chair of the House of Lords Select Committee on AI and Co-Chairman of the All-Party Parliamentary Group on AI – and included Prof Christopher Hodges of Oxford University; Paul Nemitz, a principal adviser at the European Commission; Jacob Turner of Fountain Court Chambers; and Claudia Pagliari of Edinburgh University.

The need for regulation is being recognised globally, and with growing urgency. Germany’s Data Ethics Commission released a report with its recommendations for the EU in October 2019. This year, the European Commission published its White Paper on Artificial Intelligence and the UK’s Centre for Data Ethics and Innovation published its AI Barometer, all of which share common concerns at their core. Against this background, the webinar covered a number of issues around why AI should be regulated, how and by whom.

In this blog, we pick out three of the key threads.

Trust-based regulation vs deterrence-based regulation?

Prof Hodges suggested that what he described as the ‘traditional model of regulation’ – whereby compliance is secured using a legalistic model based on deterrence – will not work for AI. This is because of what he sees as the inability of a state-based model to effectively police the global companies that will be at the forefront of AI.

As an alternative to a deterrence-based model, he posited a regulatory system founded on fairness – one in which the players comply because they see it as the right thing to do, rather than acting solely out of fear of enforcement.

Prof Hodges pointed to changing ethical and cultural approaches in relation to the traditional debate over shareholder vs stakeholder value. He highlighted sectors such as civil aviation and the nuclear industry that have developed an open culture built on trust. This involves sharing information, including owning up to mistakes, which allows the industry to learn.

Prof Hodges argued that whilst large fines capture the attention of a company’s board, they end up being accepted as part of the cost of doing business, an expense like any other.

In addition, stringent enforcement measures can lead to regulators being seen as the enemy, reducing opportunities for collaboration and development. Such measures can also exacerbate the problem of information asymmetry between the company and the regulator, as the company seeks to hide or deny any breach of the rules.

By contrast, Mr Nemitz argued that democratically made rules backed by high fines are important because they provide the necessary deterrent effect. However, he agreed that a positive dialogue is needed, alongside deterrents, in order to enable good practices.

In order to develop a fair system built on trust, Prof Hodges set out a number of questions to form the framework in which to create clear and universal rules:

  • Who will act as the equivalent of a global parliament to make coherent, consistent rules?
  • How will the objectives, principles and rules be determined?
  • What evidence will be required to show adherence?
  • Who will monitor adherence?
  • Who will investigate and resolve any non-adherence?

Jurisdictions and sectors

In response to Prof Hodges’ trust-based approach, Lord Clement-Jones highlighted the degree to which trust depends on cultural factors that can vary from country to country, citing the greater trust in robots in Japan than in European countries as an example. Prof Hodges suggested that people will have a cross-cultural understanding of right and wrong that will underpin their decisions on who can be trusted, allowing for a global regulatory matrix into which companies will be able to integrate their different cultural approaches.

Mr Nemitz, on the other hand, argued that regulation based on global consensus is not a realistic solution given the length of time that it can take for such consensus to emerge. States cannot be told to wait for such consensus before taking their own decisions to regulate the actions of, and risks brought by, AI companies in their markets. For him, the example of the internet is a lesson in the dangers of letting technologies develop unregulated.

He suggested that AI has for a long time been governed by ethical principles and neoliberal philosophies that allow global companies to go about their business relatively unimpeded by hard regulation. Perhaps unsurprisingly for someone from the European Commission, Mr Nemitz stated that regulations need to be introduced to ensure a level playing field within which all players are held to account. He alluded to the globally recognised principle that if a company sells goods and services into a market then it is subject to the rules of that market, even if the company is based outside the relevant jurisdiction.

Mr Nemitz also underlined the importance of sectoral regulation of AI. He pointed to the European Commission’s White Paper, which proposes that there should be a handful of basic rules that apply in all situations but that the specifics should be defined on a sectoral basis:

  1. Risk stratification – those who produce AI or use it in the market must undertake an impact assessment of the risks they are creating through the AI’s intended and unintended uses.
  2. Robustness – the AI must do what it is intended to do and perform correctly.
  3. Human oversight – the ability of a human to take over from an algorithm must be secured.
  4. Specificity – there will be specific requirements for certain applications, and AI has to comply with all the rules of the relevant sector. People must know when they are dealing with AI and not a human being.
  5. Limitations – AI must not do what a human is forbidden from doing.

Whilst broadly agreeing with these rules, barrister Jacob Turner challenged the last point. He argued that where certain legal limitations are imposed on humans based on their capacities, if AI is not subject to the same human frailty then there is no justification for imposing the same limitations. He gave the example of the 70mph speed limit imposed for safety reasons on the UK’s roads and suggested that if self-driving cars can be developed to drive safely at 150mph, they should be allowed to do so.

Hard regulations and moral obligations

Mr Turner grouped the existing framework for AI regulation into three categories:

  1. existing rules of general application which impact AI,
  2. rules which are designed specifically for AI, and
  3. self-imposed restraints by companies.

He cited the General Data Protection Regulation as the main piece of existing regulation affecting AI, in particular Articles 13 to 15, which relate to automated decision-making. He saw the ‘right to explainability’ as the most important of those rules, under which the process behind significant decisions made by algorithms must be capable of understandable explanation. However, AI decisions can sometimes be hard to explain in a traditional manner.

Although the White Paper from the European Commission is to be welcomed, elements of it are controversial and there is still a large gap in terms of AI-specific regulation. In order to fill this vacuum, companies are attempting to self-regulate. Mr Turner pointed to the recent moves by Microsoft, Amazon and IBM to stop their facial recognition software being used by police in the US. Mr Turner welcomed such moves by companies and recommended the introduction of internal AI ethics committees.

Claudia Pagliari agreed that hard regulation is required, but suggested that good behaviour should also be encouraged in other ways. For example, she pointed to the idea of integrity as a new form of capital, citing Apple as an example of a company that seeks to build trust and project integrity as part of its brand. As such, self-regulation should be both encouraged and rewarded, by governments and customers alike.

Conclusion

All of the participants agreed that the regulation of AI is very much in its infancy.

However, it is important to note the point made by Mr Nemitz – companies often welcome clear regulation because of the certainty it brings, and the ability to direct innovation within clear tramlines.

Mr Turner agreed that it is a false dichotomy that regulation and innovation are opposed. Regulation can benefit business as it allows for planning and provides certainty.

Regulation is therefore important both from a commercial perspective and from a public policy perspective. It will also be key to developing the trust that many of the webinar’s participants considered important as the public will only trust companies so far in favouring ethics over profit. And that trust will be essential, particularly if governments do begin asking people to travel in self-driving cars at speeds of 150mph or to place their health in the hands of automated diagnostic software. The public will want to know that the state stands behind such technology, that it oversees its standards, and that it has the power and the willingness to step in and enforce should anything go wrong.

About the author(s)

Gowling WLG

Gowling WLG is an international law firm operating across an array of different sectors and services. Our LoupedIn blog aims to give readers industry insight, technical knowledge and thoughtful observations on the legal landscape and beyond.



Views expressed in this blog do not necessarily reflect those of Gowling WLG.

NOT LEGAL ADVICE. Information made available on this website in any form is for information purposes only. It is not, and should not be taken as, legal advice. You should not rely on, or take or fail to take any action based upon this information. Never disregard professional legal advice or delay in seeking legal advice because of something you have read on this website. Gowling WLG professionals will be pleased to discuss resolutions to specific legal concerns you may have.

