
APRIL 10, 2018

If the federal government and technology companies get their way, it won’t be long before your face becomes your boarding pass at every major airport in the United States. 

Biometrics programs using facial-recognition technology are already in the testing phase at terminals in several cities, including Los Angeles, Boston, Atlanta, and elsewhere. By the end of the year, the Department of Homeland Security (DHS) plans to expand its “biometric exit” system—under which passengers’ faces are scanned as they leave the country—to all U.S. airports with international flights. Implementation is expected to cost upwards of $1 billion and has the full support of President Trump, who signed an executive order in January 2017 calling for a speedy rollout of the technology at airports.

The companies that make biometric systems market them as more secure and convenient than tickets and passports. After all, your face can’t be forged (or so it’s thought), and looking at a camera is a lot faster than fumbling with documents and waiting for them to be reviewed by customs agents.

But some privacy and civil liberties advocates think we should be worried.

They say that in the rush to implement face scans at airports, authorities are glossing over pressing concerns about legal hurdles, the control and use of personal data, and even technological shortcomings. They consider the program an expansion of government surveillance and worry that varying error rates in the recognition of some faces—particularly those belonging to people of color—could lead to unequal treatment at airports.

Harrison Rudolph of the Center on Privacy & Technology at Georgetown University coauthored a December 2017 report that raises all of those issues—and he doesn’t think enough has been done to resolve them.

“If the federal government is going to collect biometrics from American citizens,” Rudolph says, “then as a baseline [the program] should be necessary, it should be clearly authorized by Congress, and the technology should be accurate. And DHS comes up short on all three.”

How It Works

Customs and Border Protection (CBP) is testing several biometrics systems, but in general they work the same. Passengers who are leaving the country walk up to a kiosk, where a camera takes a picture of each face. The image is instantly compared with a gallery maintained by CBP (which also takes the photos of foreign visitors upon entry) or a State Department database (for U.S. citizens). If the images match, the passenger can board the flight without once reaching for a passport or boarding pass.
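In broad strokes, such systems convert each photo into a numeric feature vector (a "template") and score it against the enrolled templates, accepting the best match above some threshold. Here is a minimal sketch of that matching step, with made-up three-dimensional vectors and an arbitrary threshold; it is illustrative only, not CBP's or any vendor's actual algorithm:

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face templates; 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_passenger(live_template, gallery, threshold=0.8):
    # Compare the camera capture against every enrolled template and
    # return the best-scoring identity, if it clears the threshold.
    best_id, best_score = None, -1.0
    for passenger_id, enrolled in gallery.items():
        score = cosine_similarity(live_template, enrolled)
        if score > best_score:
            best_id, best_score = passenger_id, score
    if best_score >= threshold:
        return best_id, best_score
    return None, best_score

# Toy gallery of three enrolled travelers, plus one live capture.
gallery = {
    "A123": [0.9, 0.1, 0.3],
    "B456": [0.2, 0.8, 0.5],
    "C789": [0.4, 0.4, 0.9],
}
live = [0.88, 0.12, 0.33]
print(match_passenger(live, gallery))  # best match is "A123", score near 1.0
```

Real systems derive templates from deep neural networks with hundreds of dimensions, and the threshold is tuned to balance false accepts against false rejects, which is where the error rates debated below come in.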


That's how things work at the new "eGates" being tested at Los Angeles International Airport. They were developed by Vision-Box, a Lisbon-based company that’s a worldwide leader in creating face-recognition systems. The marketing materials of Vision-Box and other manufacturers present the technology as a long-sought means of making air travel quick and painless. Airlines, of course, would like the same result if it means happier customers (not to mention that more automation could mean fewer employees to pay). 

And so far, biometrics have the support of the public, according to the most recent Global Passenger Survey from the International Air Transport Association. Of the nearly 11,000 passengers surveyed, 64% said they favor biometric identification.     

Vision-Box is already coming up with ways to extend the technology to other parts of the airport—and beyond. The company’s Happy Flow system (a name Orwell would have grimly appreciated) is designed to use face recognition for everything from check-in to bag drop-off to security to boarding. An early version has already been deployed in Aruba.



Privacy Settings

According to airlines, manufacturers of biometric equipment, and the government, the facial images of passengers taken before boarding are not permanently stored or distributed in any way, but used solely for identity verification (CBP publishes details of its biometrics program, including privacy assessments, on its website). What’s more, Americans can opt out of the face scan and board the traditional way—at least for now.

"The government’s got your picture already," said Jim Peters, chief technology officer at SITA, a company that developed facial-recognition boarding for JetBlue, when we talked to him about emerging travel trends last fall. "The idea that in an airport nobody is taking your picture is probably a little naïve. There are a lot of cameras for security, not just at airports. Your picture is getting taken when you run a stoplight."

He has a point, but as far as Rudolph of Georgetown Law is concerned, the spread of surveillance in other areas of public life doesn't justify the program. As he sees it, expanding airport face scans to U.S. citizens is an overreach of the congressional mandate. While several bills passed by Congress in the last two decades call for the development of an exit-tracking system to fight terrorism and catch those traveling with fraudulent or expired visas, Rudolph asserts that "not once has Congress authorized the collection of biometrics from American citizens."


It’s important for Congress to debate the matter, Rudolph contends, because so far, there aren’t enough regulations governing this type of surveillance. "Homeland Security has made a handful of privacy and civil liberties promises concerning what will happen to the data they collect under this program," he says, "[but] it’s just too easy for DHS to break its promises when it hasn’t issued any enforceable rules to codify them."

Race and Gender Bias?

Perhaps most troubling are claims that architects of biometrics systems are overlooking the possibility of bias in their machines. 

Civil liberties groups are concerned about the accuracy of the technology with regard to its ability to distinguish race and gender. Georgetown Law’s report cites a 2017 study from the National Institute of Standards and Technology (NIST) that found variations in the performance of biometrics systems along those lines, while groups like the American Civil Liberties Union (ACLU) have raised the alarm about biased technology at airports creating "yet another racial injustice in our society."  

At the moment, the evidence seems inconclusive. Some early studies of face-recognition systems found that the technology returned higher error rates with non-white users and women because the algorithms had been trained primarily on white males. The algorithms tested more recently by NIST, however, were more likely to mistakenly reject white men, according to the Georgetown report.

So which is it? The contradictory results alone suggest that the technology’s performance is still inconsistent. That seems like justification enough for further research.

It's a crucial matter because if airport face-recognition scans have a harder time recognizing the faces of certain travelers, that could mean, at best, an inconvenience for them and, at worst, unfair treatment. And certainly the federal government has an obligation to ensure that a program it’s overseeing deals with everyone equally. 

Customs and Border Protection officials, for their part, told the New York Times in December that face scans at airports correctly identify passengers more than 90% of the time and, in the agency’s view, have not exhibited problems relating to a traveler’s race or gender.
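A "more than 90%" match rate sounds high, but at airport scale even a small miss rate adds up. A back-of-the-envelope check, using a hypothetical volume of 100,000 international departures per day (an illustrative number, not an official statistic):

```python
def daily_mismatches(daily_passengers, match_rate):
    # Travelers the system would fail to match, each needing
    # a manual document check instead.
    return round(daily_passengers * (1 - match_rate))

print(daily_mismatches(100_000, 0.90))  # -> 10000 travelers per day
```

If those misses were spread unevenly across demographic groups, as some studies suggest they could be, the burden of extra screening would fall unevenly too.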

The ACLU and other rights advocates are pressing for more testing and more transparency, preferably involving publicly available studies conducted by independent analysts.  

Cleared for Liftoff

That might slow down a program that only seems to be picking up speed. With biometric face-recognition systems being tested at nearly a dozen U.S. airports (including some of the biggest and busiest), the new way of doing things might not have taken off entirely yet, but we’re definitely zooming down the runway. 

And after the rollout at airports, travelers could soon be using their faces to unlock rental cars and hotel rooms; in fact, Vision-Box's Smart City plan envisions that very thing. Participation would be voluntary for customers, but the risks of enhanced corporate and government surveillance would increase even as travel could become faster and more convenient for many—provided everything works properly.

It seems inevitable that face scans will come to seem increasingly normal, especially now that you can even unlock your iPhone with a look. But as the stuff of science fiction becomes reality, it’s up to public servants, watchdogs, and the rest of us to make sure that the unfair and unfree possibilities stay where they belong: in novels, not in our lives.