Through the ‘eyes’ of the ICO: facial recognition technology in the public sector
The use of Facial Recognition Technology (FRT) has recently come under scrutiny by the Information Commissioner’s Office (ICO) following the use of FRT in a school. Faizah Patel and Zena Stephenson look at the key issues.
FRT is a means of identifying or verifying an individual’s identity from an image of their face. FRT systems can identify people from photos, videos or in real time. The facial data they process is biometric data, which constitutes ‘special category personal data’ under data protection law and must therefore be treated with particular care. FRT is largely used for security and law enforcement; however, there is growing interest in its use in other areas by public authorities, including education, transport and public spaces.
ICO review
The ICO recently issued a statement on FRT usage by a local authority. North Ayrshire Council (NAC) had deployed FRT in their schools to manage cashless catering in their canteens. The ICO assessed the use of FRT and the processing of biometric data and whether FRT had been lawfully deployed. The ICO’s conclusion was that whilst it may be possible to use FRT in schools lawfully, the technology set up in this instance was likely to have infringed data protection laws. It is notable that as the biometric data processed concerned children, given ‘children merit specific protection with regards to their personal data’ under the GDPR (recital 38), this investigation resulted in heightened scrutiny.
Special category personal data needs to be treated with greater care because its processing is more likely to interfere with fundamental rights and/or give rise to discrimination. A lawful basis under GDPR Article 6 must be identified, and in addition one of the Article 9 conditions must be satisfied. The ICO has highlighted why the processing of biometric data is a cause for concern, given that it is ‘more permanent and less alterable than other personal data; it cannot be changed easily’. Other characteristics, such as age, sex, gender or ethnicity, can also be inferred from it.
The ICO advised that NAC should clearly identify its lawful basis for processing personal data to ensure that it meets the requirements of the UK GDPR. The ICO also found that NAC had not taken appropriate measures to provide transparent information, failing to communicate their rights to data subjects. Finally, the data protection impact assessment (DPIA) was a point of contention, as it failed to fully mitigate the risks to individuals’ rights and freedoms.
Other uses of FRT in the public sector
FRT has to date proved a difficult technology to implement in the public sector. Whilst FRT can be used lawfully, it may not always be the right solution to achieve an authority’s aims.
In Bridges v Chief Constable of South Wales [2020], the use of FRT by the police force was challenged and the Court found that the force’s use of automated facial recognition was not “in accordance with the law”. The Court held that: (1) there was no clear guidance on where the technology could be used and who would be subject to it; (2) the DPIA was inadequate and non-compliant; and (3) reasonable steps had not been taken to investigate whether the technology had a racial or gender bias.
Another example of use in schools arose in 2019, when a school in Sweden implemented FRT as a way of monitoring pupil attendance. The Swedish data protection authority took a strict approach, which resulted in the school being fined for this use of FRT. The authority found that data protection rules had not been followed in respect of data minimisation, protection of special category personal data and DPIA requirements. Overall, it was found that school attendance could be recorded by less intrusive means.
Key takeaways
The ICO’s investigation highlights the caution that ought to be exercised when utilising FRT in the public sector. The use of FRT must be justified and proportionate; if there are other means by which to achieve the objective, then those should be considered. We have summarised below some of the key takeaways from this case.
Consent
Consent is often the lawful basis most relied upon. Consent needs to be informed, specific and freely given, and the requirements under Article 13 of the GDPR need to be met for organisations to rely on this basis. The ICO found that simply signing a consent form was, in this instance, unsatisfactory. Organisations using FRT must be aware of the right to withdraw consent and the right to object, and must ensure that they respect individuals’ objections to the processing of their personal data.
Transparency
Data subjects must be made aware of how their personal data is being processed and what their rights are under data protection law. It should be clear to individuals what personal data is being processed, why it is being processed and how long it will be retained. Where children are the data subjects, their age ought to be considered and privacy notices should be tailored appropriately.
Accountability
The use of FRT can be intrusive, and the benefits of the technology need to be weighed against the impact it could have on individuals. When completing DPIAs, thorough and appropriate risk assessments ought to be conducted, and it is essential that suitable and adequate DPIAs are in place before the technology is deployed.
Faizah Patel and Zena Stephenson are Solicitors at Sharpe Pritchard LLP.
This article is for general awareness only and does not constitute legal or professional advice. The law may have changed since this page was first published. If you would like further advice and assistance in relation to any issue raised in this article, please contact us by telephone or email.