UK visa applicants at the mercy of computer algorithms
The UK immigration services receive overwhelming numbers of applications and rely heavily on outsourcing and computer algorithms to process them. However, replacing human judgement with these algorithms means decisions can carry a degree of bias, with the system rejecting applicants on the basis of their personal information.
Moreover, the Home Office has come under rigorous scrutiny and stands accused of using a 'racist' algorithm, with reports indicating high rejection rates for applicants from certain countries. Applicants from African countries in particular face high rejection rates. The Home Office's current system is based on a 'traffic light' model: it assesses an applicant's personal information and assigns a colour code reflecting the perceived risk of that person overstaying in the UK.
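The Home Office has not published its actual criteria, but a risk-scoring system of this kind can be sketched in broad strokes. The factor names, weights, and thresholds below are entirely invented for illustration; the point is only to show how scoring on generic markers such as nationality or age can push whole groups of applicants into the 'red' band.

```python
# Illustrative sketch only: the real Home Office criteria and weights
# are not public. All factors and thresholds here are hypothetical.

def score_applicant(applicant: dict) -> int:
    """Sum hypothetical weights for generic markers.

    Scoring on attributes like nationality or age is exactly the kind
    of rule critics warn can disadvantage entire groups of people,
    regardless of the merits of an individual application.
    """
    score = 0
    if applicant.get("nationality") in {"CountryA", "CountryB"}:  # invented 'high-risk' list
        score += 50
    if applicant.get("age", 0) < 25:                              # invented age rule
        score += 25
    if not applicant.get("previous_uk_visa"):                     # invented history rule
        score += 20
    return score

def traffic_light_rating(risk_score: int) -> str:
    """Map a numeric score to a colour band (thresholds are invented)."""
    if risk_score < 30:
        return "green"   # low perceived risk: streamlined processing
    if risk_score < 70:
        return "amber"   # medium risk: additional checks
    return "red"         # high risk: intensive scrutiny, likely refusal

applicant = {"nationality": "CountryA", "age": 22, "previous_uk_visa": False}
print(traffic_light_rating(score_applicant(applicant)))  # prints "red"
```

Note that the hypothetical applicant above lands in 'red' purely on generic markers, before any individual evidence is weighed, which is the core of the criticism levelled at the system.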
Unfortunately, thousands of legitimate applications are rejected outright, yet the Home Office maintains that the system is fair and does not breach any human rights or other laws. However, the Chief Inspector of Borders and Immigration has stated that the use of an automatic decision-making tool may 'disadvantage groups of people based on generic markers such as age or origin'.