LoupedIn

The CDEI reports on the use of data and AI in financial services

December 15, 2020, Sushil Kuner


The Centre for Data Ethics and Innovation (“CDEI”) published its report, Review into Bias in Algorithmic Decision-making, on 27 November 2020 (“the Report”). The Report examined the use of data and AI in recruitment, financial services, policing and local government. In this blog, I examine the CDEI’s key findings in respect of financial services.

Use of AI in Financial Services – not a new phenomenon

The Report highlights that financial services firms have long used data to support their decision-making, with firms ranging from highly innovative to more risk-averse in their use of new algorithmic approaches. Given this historical use of algorithms, the CDEI concluded that the financial services sector is ‘well-placed’ to adopt ‘the most advanced data-driven technology to make better decisions about which products to offer to which customers’.

However, the CDEI also recognised a serious risk that, against a backdrop of societal inequality, historic biases could become further entrenched through algorithmic systems. The Report discusses the credit sector as a prime example: there is documented evidence of historical inequalities experienced by women and ethnic minorities in accessing credit. The Report warns of the risks of firms relying on algorithmic solutions based on traditional data to make predictions about people’s behaviour, for example how likely they are to repay debts, where specific individuals or groups are historically underrepresented in the financial system.

In the Report, the CDEI explores the current landscape and discusses how machine learning is being used in financial services, drawing on the findings of the 2019 joint survey by the Bank of England (“BoE”) and the Financial Conduct Authority (“FCA”). I have already summarised key use cases in a previous blog but, as the Report confirms, financial services firms are increasingly using complex machine learning algorithms in back-office functions such as risk management and compliance.

Data limitations?

The Report identifies that the use of more data from non-traditional sources could, in future, improve access for population groups who have historically found it difficult to access financial services such as credit because there is less data about them from traditional sources. Still focusing on the credit sector, the Report discusses the example of ‘credit-worthiness by association’: the move from credit scoring algorithms that use data from an individual’s credit history to algorithms that draw on additional data about the individual, for example their rent repayment history or their wider social network. However, many firms are not using social media data and are, according to the CDEI, sceptical of its value.

There are also concerns that while more data could improve the inclusiveness and representativeness of datasets, more data and more complex algorithms could increase the potential for indirect bias to be introduced via proxy variables, and limit the ability to detect and mitigate it. For example, opaque algorithms may unintentionally replicate systemically discriminatory results: a data point such as salary may act as a substitute for a protected characteristic such as sex or ethnicity, given, for instance, the gender pay gap. The Report emphasises the tension between the need to create algorithms which are blind to protected characteristics and the need to check for bias against those same characteristics, a tension which creates a challenge for organisations seeking to use data responsibly. As such, firms adopting more complex data feeds and algorithms need to test their models’ accuracy through validation techniques and ensure that there is sufficient human oversight of the process as a way to manage bias in the development of algorithmic models.
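To illustrate the kind of bias check the Report has in mind, the sketch below compares a model’s approval rates across groups defined by a protected characteristic (a simple ‘demographic parity’ comparison). This is a hypothetical illustration only; the data, group labels and function are invented for this example and are not drawn from the Report.

```python
# Hypothetical sketch: compare a model's approval rates across groups
# defined by a protected characteristic. A large gap between groups
# would prompt further investigation into possible proxy variables.

def approval_rates(decisions, groups):
    """Return the approval rate for each group.

    decisions: list of 0/1 model outcomes (1 = approved)
    groups: list of group labels, aligned with decisions
    """
    totals, approvals = {}, {}
    for decision, group in zip(decisions, groups):
        totals[group] = totals.get(group, 0) + 1
        approvals[group] = approvals.get(group, 0) + decision
    return {g: approvals[g] / totals[g] for g in totals}

# Invented example data: eight credit decisions across two groups.
decisions = [1, 0, 1, 1, 0, 1, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
rates = approval_rates(decisions, groups)
# rates == {"A": 0.75, "B": 0.25} — a 50-point gap worth investigating.
```

A check like this only flags unequal outcomes; it does not by itself establish why the gap exists, which is where the human oversight and validation the Report calls for come in.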

The CDEI reported that most financial organisations it interviewed agreed that the key obstacles to further innovation in the sector were as follows:

  • data availability, quality and how to source data ethically;
  • the availability of techniques with sufficient explainability;
  • a risk-averse culture in some parts of the sector, given the impact of the financial crisis; and
  • difficulties in gauging consumer and wider public acceptance.

The importance of explainability in financial services

The CDEI refers to ‘explainability’ as the ability to understand and summarise the inner workings of a model, including the factors that have gone into it. As noted above, explainability is key to understanding the factors causing variation in the outcomes of decision-making systems between different groups, and to assessing whether that variation is fair.

Under the FCA’s Principles for Businesses, the FCA expects firms to treat customers fairly (Principle 6) and to take reasonable care to organise and control their affairs responsibly and effectively, with adequate risk management systems (Principle 3). The FCA is very much an outcomes-focused regulator and expects firms to ensure that they are achieving the right outcomes for customers and mitigating any risks of harm that may be posed by their businesses, including the way in which those businesses are operated.

As such, the FCA expects all firms to understand how algorithmically-assisted decisions are reached, so as to ensure that they are treating customers fairly, achieving the right outcomes for customers, and meeting their legal and regulatory obligations. Explainability is therefore crucial in financial services. Until firms can gain comfort that more advanced, complex algorithms are sufficiently transparent for them to understand how decisions were reached, there may be a natural limit on the extent to which such algorithms can be developed.

The final word

For financial services firms and regulators to identify and mitigate discriminatory outcomes and build consumer confidence in the use of algorithms, it is crucial for firms to have transparent and explainable algorithmic models, particularly for customer-facing decisions.

The FCA and BoE are undertaking work to assess the impact and opportunities of innovative data use and AI in the financial services sector as they recognise the benefits that AI can bring, including:

  • improved customer choice and services, and more accurate pricing;
  • increased access to financial services;
  • lower cross-border transactions costs; and
  • improved diversity and resilience of the system.

The CDEI will be an observer on the FCA and BoE’s AI Public Private Forum which will explore means to support the safe adoption of machine learning and AI within financial services.

About the author(s)

Sushil Kuner

Sushil Kuner is a London-based principal associate who advises on all aspects of financial services regulatory law, having spent eight years working within the Supervision and Enforcement Divisions of the Financial Conduct Authority (FCA).

Filed Under: AI, Analysis, Opinion Tagged With: Artificial Intelligence (AI), cdei, Financial Institutions & Services

Views expressed in this blog do not necessarily reflect those of Gowling WLG.

