UK to use AI to age-assess migrants, sparking child safety fears
The UK government is planning to use artificial intelligence (AI) facial age estimation technology to help assess the age of asylum seekers who claim to be under 18, with the technology set for integration in 2026.
Inaccurate assessments place children in adult hotels
Entrenched AI biases could lead to more wrong decisions
After seeing fighters ravage his home, Jean thought he had found safety when he arrived in Britain, but he was told he was too tall to be 16 and sent to live with hundreds of adult asylum seekers without further support.
Alone and exhausted, Jean, who uses a pseudonym and did not want to reveal his central African home country for privacy reasons, said border officials told him he was 26 - a decade older than he actually was when he arrived in 2012.
"I look 10 years older because I am taller, that was the reason they gave," Jean, who had his age officially corrected years later after an appeal, told the Thomson Reuters Foundation.
"They don't believe you when you come and tell your story. I was so desperate. I really needed support. Because of one officer who made the decision, that changed my whole life."
Now, that critical decision - an initial age assessment made by border guards - is set to be outsourced to artificial intelligence, and charities warn the tech could entrench biases and repeat mistakes like the one Jean endured.
In July, Britain said it would integrate facial age estimation tech in 2026 to help assess the ages of migrants claiming to be under 18, especially those arriving on small boats from France.
Prime Minister Keir Starmer is under pressure to control migration as populist Nigel Farage's anti-immigrant Reform UK party surges ahead in opinion polls.
More than 35,000 people have crossed the English Channel in small boats this year, a 33% rise on the same period in 2024.
Rights groups argue that facial recognition tech is dehumanising and does not produce accurate age estimates - a sensitive assessment they say should be carried out by trained experts.
They fear the rollout of AI will lead to more children - those lacking official documents or carrying forged papers - being wrongly placed in adult asylum hotels without safeguards or adequate support.
"Assessing the ages of migrants is a complex process which should not be open to shortcuts," said Luke Geoghegan, head of policy and research at the British Association of Social Workers.
"This should never be compromised for perceived quicker results through artificial intelligence," he said in emailed comments.
Unaccompanied child migrants can access social workers, legal aid, education and other support under the care of local authorities, charities say.
The Home Office interior ministry says facial age estimation tech is a cost-effective way to prevent adults from posing as children to exploit the asylum system.
"Robust age assessments for migrants are vital to maintaining border security," a spokesperson said.
"This technology will not be used alone, but as part of a broad set of methods used by trained assessors."
DIGITAL FIXES?
As the number of people fleeing war, poverty, climate disaster and other tumult reaches record levels worldwide, states are increasingly turning to digital fixes to manage migration.
Britain in April said it would use AI to speed asylum decisions, arming caseworkers with country-specific advice and summaries of key interviews.
In July, Britain signed a partnership with ChatGPT maker OpenAI to explore how to deploy AI in areas such as education technology, justice, defence and security.
"The asylum system must not be the testing ground for what are currently deeply flawed AI tools operating with minimal transparency and safeguards," said Sile Reynolds, head of asylum advocacy at charity Freedom from Torture.
Anna Bacciarelli, senior AI researcher at Human Rights Watch, said the use of such tech could have serious consequences.
"In the case of facial age estimation, in addition to subjecting vulnerable children and young people to a dehumanising process that could undermine their privacy and other human rights, we don't actually know if it works."
BIASES
Digital rights groups have criticised facial recognition tech - used by London's police at protests and festivals like Notting Hill Carnival - for extracting sensitive biometric data and for targeting specific racial groups.
"There are always going to be worries about sensitive data, biometric data in particular, being taken from vulnerable people and then sought by the government and used against them," said Tim Squirrell, head of strategy at Foxglove, a British tech rights group.
"It's also completely unaccountable. The machine tells you that you're 19. What now? How can you question that? Because the way in which that's been trained is basically inscrutable."
Automated tools can reinforce biases against certain communities because AI is trained on historical data that can embed old prejudices, experts say.
Child asylum seekers have been told they were too tall or too hairy to be under 18, according to the Greater Manchester Immigration Aid Unit (GMIAU), which supports migrants.
"Children are not being treated as children. They're being treated as subjects of immigration control, which I think is linked to racism and adultification," said GMIAU's policy officer Rivka Shaw.
WRONGLY ASSESSED
For Jean, now 30, the wrong age assessment led to isolation and suicidal thoughts.
"I was frightened. My head was just all over the place. I just wanted to end my life," said Jean, who was granted asylum in 2018.
Around half of all migrants who had their ages reassessed in 2024 - some 680 people - were found to be children who had been wrongly sent to adult hotels, according to the Helen Bamber Foundation, a charity that obtained the data through Freedom of Information requests.
"A child going into an adult accommodation is basically put in a shared room with a load of strangers where there are no additional safeguarding checks," said Kamena Dorling, the charity's director of policy.
A July report by the Independent Chief Inspector of Borders and Immigration, which scrutinises Home Office policies, urged the ministry to involve trained child experts in age assessments.
"Decisions on age should be made by child protection professionals," said Dorling.
"Now, all of the concerns that we have on human decision-making would also apply to AI decision-making."