
Computer says no fly

Opaque surveillance tools are being sold to governments with the promise that they can ‘export borders’ to everywhere we board trains, planes and ships

Since the 9/11 attacks, governments and airlines have been collecting ever larger amounts of data on travellers. Passenger Name Records and Advance Passenger Information include addresses, phone numbers, payment information, itineraries, travel dates and the names of accompanying passengers. But the data is not always accurate. So what happens when companies use it to train AI models that predict people’s behaviour from their past activities?
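As an illustration only (the field names below are invented, not drawn from any airline’s or government’s actual schema), a combined record of the kind described above might be sketched roughly like this:

```python
# Hypothetical sketch of a traveller record; real Passenger Name Record and
# Advance Passenger Information formats vary by airline and government.
from dataclasses import dataclass, field

@dataclass
class TravellerRecord:
    name: str
    address: str
    phone: str
    payment_card: str                  # payment information
    itinerary: list[str]               # airports in order of travel
    travel_dates: list[str]
    co_passengers: list[str] = field(default_factory=list)

record = TravellerRecord(
    name="A. Example",
    address="Example Street 1, Amsterdam",
    phone="+31 6 0000 0000",
    payment_card="**** **** **** 1234",
    itinerary=["AMS", "JFK"],
    travel_dates=["2024-05-01"],
    co_passengers=["B. Example"],
)
```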

We examined four companies offering software that uses algorithmic techniques to build profiles of passengers, assess their risk, and flag different categories of people, from terrorists and human traffickers to people migrating without papers.

Executives from Swiss start-up Travizory told us how their system flags people with unusual behaviour patterns or with attributes that look similar to known offenders, using a vast array of variables to determine “who looks unusual”. The algorithms which make these determinations are “black boxes”, the company’s chief data scientist said. The software “will tell you that this person is potentially risky and this person looks different, but how it makes this decision is kind of a mystery.”
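Neither Travizory nor its clients have published how these risk scores are computed. Purely as a generic illustration of the kind of unsupervised anomaly scoring the company describes, the sketch below flags the passenger who “looks unusual” relative to the rest, using scikit-learn’s IsolationForest on invented features:

```python
# Illustrative only: this is not Travizory's system, just a generic example
# of unsupervised anomaly scoring over hypothetical passenger features.
import numpy as np
from sklearn.ensemble import IsolationForest

# Invented numeric features per passenger:
# [days booked in advance, itinerary changes, one-way flag, cash-payment flag]
passengers = np.array([
    [45, 0, 0, 0],
    [30, 1, 0, 0],
    [60, 0, 0, 0],
    [1,  5, 1, 1],   # last-minute booking, many changes, one-way, paid in cash
])

model = IsolationForest(contamination=0.1, random_state=0)
model.fit(passengers)

# Lower decision_function scores mean "looks more unusual" than the rest.
for row, score in zip(passengers, model.decision_function(passengers)):
    print(row, round(float(score), 3))
```

A model like this can report that one record looks different from the others without offering any human-readable reason why, which is the opacity the chief data scientist described.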

Another company, SITA, aspires to use traveller data to show “what people are doing, not just who they are” when they cross a border, as well as to help governments “export” their borders “to every single point on the globe where passengers can board flights, ships or trains bound for their territory”.

Interviewees expressed concerns over the accuracy of the data that these new models are based on. We spoke to a Dutch activist who discovered, after a long campaign of information requests, that his Passenger Name Records included some flights he had never taken and excluded some flights he had taken. Other experts cautioned against relying on machines to predict who could be a terrorist and warned that profiling could undermine the legal right to seek asylum.