School reprimanded for using facial recognition technology without carrying out data protection impact assessment
A school in Essex has been issued with a reprimand by the Information Commissioner's Office (ICO), after it introduced facial recognition technology to take cashless canteen payments from students without first carrying out a data protection impact assessment.
Chelmer Valley High School, which has around 1,200 pupils aged 11-18, first started using the facial recognition technology (FRT) in March 2023.
FRT processes biometric data to uniquely identify people and is likely to result in high data protection risks, the ICO said.
The watchdog said: “To use it legally and responsibly, organisations must have a data protection impact assessment (DPIA) in place. This is to identify and manage the higher risks that may arise from processing sensitive data.”
The ICO found that the school failed to carry out a DPIA before starting to use the technology - meaning no prior assessment was made of the risks to the children's information.
“The school had not properly obtained clear permission to process the students’ biometric information and the students were not given the opportunity to decide whether they did or didn’t want it used in this way”, said the ICO.
The investigation also uncovered that the school failed to seek opinions from its data protection officer or consult with parents and students before implementing the technology.
In March 2023, a letter was sent to parents with a slip for them to return if they did not want their child to participate in the FRT. Affirmative 'opt-in' consent was not sought at this time, meaning that until November 2023 the school was “wrongly relying on assumed consent”, said the watchdog.
The ICO’s reprimand also noted that most students were old enough to provide their own consent. Therefore, parental opt-out “deprived students of the ability to exercise their rights and freedoms”.
The Commissioner made the following recommendations for the future:
- prior to new processing operations, or upon changes to the nature, scope, context or purposes of processing for activities that pose a high risk to the rights and freedoms of data subjects, complete a DPIA and integrate outcomes back into the project plans
- amend the DPIA to give thorough consideration to the necessity and proportionality of cashless catering, and to mitigating specific, additional risks such as bias and discrimination
- review and follow all ICO guidance for schools considering whether to use facial recognition for cashless catering
- amend privacy information given to students so that it provides for their information rights under the UK GDPR in an appropriate way
- engage more closely and in a timely fashion with their Data Protection Officer (DPO) when considering new projects or operations processing personal data, and document their advice and any changes to the processing that are made as a result.
Lynne Currie, ICO Head of Privacy Innovation, said: “Handling people’s information correctly in a school canteen environment is as important as the handling of the food itself. We expect all organisations to carry out the necessary assessments when deploying a new technology to mitigate any data protection risks and ensure their compliance with data protection laws.”
She added: “We’ve taken action against this school to show introducing measures such as FRT should not be taken lightly, particularly when it involves children.
“We don’t want this to deter other schools from embracing new technologies. But this must be done correctly with data protection at the forefront, championing trust, protecting children’s privacy and safeguarding their rights.”
Chelmer Valley High School has been approached for comment.
Lottie Winson