An immigration lawyer could face a disciplinary investigation after a judge ruled he used AI tools such as ChatGPT to prepare his legal research.
The tribunal heard that a judge was left puzzled when Chowdhury Rahman filed submissions citing cases that were "completely fake" or "completely irrelevant".
The judge found that Mr Rahman also tried to "hide" this when questioned, and had "wasted" the court's time.
The incident occurred while Mr Rahman was representing two Honduran sisters who were seeking asylum in the UK on the grounds that they had been targeted by a violent criminal gang, Mara Salvatrucha (MS-13).
After arriving at Heathrow Airport in June 2022, they claimed asylum and said during screening interviews that the gang wanted them to be “their women”.
They also claimed that gang members had threatened to kill their families and had been searching for them since they left the country.

One of the authorities cited in support of the case had previously been wrongly generated by ChatGPT (AP)
In November 2023, the Home Office rejected their asylum application, stating that their accounts were “inconsistent and not supported by documentary evidence.”
They appealed to the First-tier Tribunal, but the appeal was dismissed by a judge who "did not accept that the appellants would be the target of adverse attention" from MS-13.
The case was then appealed to the Upper Tribunal, where Mr Rahman acted as counsel. During the hearing, he argued that the judge had failed to properly assess credibility, had made an error of law in evaluating the documentary evidence, and had failed to consider the question of internal relocation.
However, these claims were similarly rejected by Judge Mark Blundell, who dismissed the appeal and ruled that “nothing said by Mr Rahman orally or in writing established error of law on the part of the judge”.
In an annex to the judgment, however, Judge Blundell noted the "significant problems" that arose from the appeal, relating to the legal research undertaken by Mr Rahman.
Of the 12 authorities cited in the appeal, the judge discovered upon reading them that some did not even exist, and that others "did not support the propositions of law for which they were cited in the grounds".
Investigating the matter further, he found that Mr Rahman appeared "unfamiliar" with legal search engines and was unable to say where, in the cases he cited, the judge should be directed.
Mr Rahman said he had used "various websites" to conduct his research, with the judge noting that one of the cases in question had recently been wrongly attributed by ChatGPT in another legal case.
Judge Blundell noted that because Mr Rahman "appeared to know nothing" about any of the authorities he cited, some of which did not exist, all of his submissions were therefore "misleading".
"It is overwhelmingly likely, in my judgment, that Mr Rahman used generative artificial intelligence to formulate the grounds of appeal in this case, and that he attempted to hide that fact from me during the hearing," Judge Blundell said.
"He has been called to the Bar of England and Wales, and it is simply not possible that he misunderstood all of the authorities cited in his grounds of appeal to the extent that I have described above."
He concluded that he was now considering reporting Mr Rahman to the Bar Standards Board.