'It’s not a surveillance program'... US govt isn't going all Beijing on us with border face-recog, official tells Congress
Lawmakers told: 'We don’t run the scans against any other databases'
A Homeland Security official on Wednesday stressed the US government department would not use facial-recognition technology to monitor American citizens.
John Wagner, the deputy executive assistant commissioner at the Office of Field Operations for Customs and Border Protection (CBP), told a House committee hearing not to expect China-style face-recognition surveillance on the nation's home soil.
Though Wagner did say Homeland Security uses facial recognition at America's borders for traveler identification, it is not, apparently, employed for surveillance purposes. US border cops only have access to "a small gallery of photos from passports and visas," the committee was told, and any faces scanned at the border aren't checked against any other datasets.
“We don’t run the scans against any other databases. It’s not a surveillance program,” Wagner added.
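What Wagner describes is a one-to-N identity check: a face captured at the border is compared only against a small gallery built from passport and visa photos, and no match is claimed unless a comparison clears a threshold. As a purely illustrative sketch (the embedding format, threshold, and gallery layout here are our assumptions, not details of CBP's actual system), such a check typically looks like this:

```python
# Illustrative sketch only: 1:N face matching against a small gallery.
# The embeddings, threshold, and gallery structure are assumptions for
# illustration, not details of CBP's Traveler Verification Service.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_gallery(probe: np.ndarray,
                          gallery: dict,
                          threshold: float = 0.6):
    """Return the gallery ID whose embedding best matches the probe,
    or None if no comparison clears the threshold (no match claimed)."""
    best_id, best_score = None, threshold
    for traveler_id, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = traveler_id, score
    return best_id
```

The key property Wagner is leaning on is the scope of the `gallery` argument: a closed set of travel-document photos, rather than an open-ended watchlist or external database.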
Those words of some reassurance aside, the lack of transparency in how the Department of Homeland Security (DHS) uses facial recognition and other forms of biometric technology, and its apparently lackluster approach to the privacy and security of the data, has spooked politicians.
Representative Bennie Thompson (D-MS), chairman of the House Homeland Security committee in session, said face-matching machine-learning technology's “proliferation across DHS raises serious questions about privacy, data security, transparency and accuracy. The American people deserve answers to those questions before the federal government rushes to deploy biometrics further.”
Secret Service leads the way
The Secret Service is currently running a pilot study to see if AI-backed cameras installed around the White House can accurately identify people and match the faces to an established database, the committee heard.
The study was launched in December 2018, and is expected to continue until August 2019. Joseph DiPietro, chief technology officer of the US Secret Service, said that the trial was only being tested on the faces of Secret Service employees who had volunteered to be monitored for the project.
But what about the other people passing in and out of the White House who haven’t agreed to be imaged for the trial? DiPietro said their images would be purged, but didn’t go into too much detail due to the sensitive nature of the Secret Service’s work. Chairman Thompson hinted that there may be more information on what the pilot program is collecting, but that it would have to be discussed in a future classified hearing.
There is reason to be concerned when you don’t know when or how Uncle Sam and its subcontractors are collecting such data. The Register recently exclusively revealed that Perceptics, a provider of license-plate readers and vehicle recognition systems for America's land border crossings, had its systems pillaged by a hacker.
Hundreds of gigabytes of data, ranging from technical blueprints and documents to internal emails and sensitive records as well as some shots of drivers and their vehicles, were stolen from Perceptics' network, and leaked onto the dark web.
Wagner said Perceptics had taken and stored images of drivers and their vehicles to see if its technology could match their faces to a database. He said the subcontractor didn’t have the authority to retain those photos on its own network. “The contractor was hacked, CBP wasn’t hacked,” he added.
The border force has since terminated its contract with Perceptics. The Washington Post also reported today that officials only found out about the hack three weeks after Perceptics was ransacked.
Technology of the future?
Homeland Security is always on the lookout for new methods that will allow it to identify people with higher accuracy. Austin Gould, the assistant administrator for requirements and capabilities analysis at the TSA, pointed to its 2018 Biometrics Roadmap [PDF] report on where the organization is heading.
Although the report doesn’t discuss any specific technology, Charles Romine, the director of Information Technology Laboratory at NIST, a research lab working under the Department of Commerce, did mention convolutional neural networks.
NIST is involved in the Traveler Verification Service, a program to assess the performance of prototype and commercial algorithms to assist the DHS with automatic facial and biometric scans at the border. Romine said there was still a wide range in the performance of the algorithms tested. Although NIST doesn’t have direct knowledge of the face-matching algorithms used since they’re submitted for review as black boxes, the best ones use machine learning, apparently.
It’s well known that convolutional neural networks struggle most when trying to identify women and people with darker skin from their faces. Romine warned that it was “unlikely that every demographic will be identical in performance across the board, whether that’s age, race, or sex.” But he did say that “those demographic effects are diminishing as technology improves." Wagner also said DHS wasn’t seeing “noticeable discrepancies against certain demographics.”
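The demographic differentials NIST tests for can be made concrete: evaluators typically compute an error rate, such as the false non-match rate (FNMR, the share of genuine comparisons an algorithm fails to match), separately for each demographic group and compare them. A minimal sketch of that bookkeeping, with invented sample data rather than NIST's actual results:

```python
# Illustrative sketch: per-group false non-match rate (FNMR), the kind of
# demographic-differential measurement NIST performs on submitted
# algorithms. The trial records are invented, not NIST data.
from collections import defaultdict

def fnmr_by_group(trials):
    """Each trial is (group, matched): a genuine comparison and whether
    the algorithm correctly matched it. FNMR per group is the fraction
    of genuine pairs the algorithm failed to match."""
    failures, totals = defaultdict(int), defaultdict(int)
    for group, matched in trials:
        totals[group] += 1
        if not matched:
            failures[group] += 1
    return {group: failures[group] / totals[group] for group in totals}
```

A large gap between groups in a table like this is exactly the "noticeable discrepancy" Wagner says DHS isn't seeing; since vendors submit their algorithms as black boxes, this outcome measurement is all NIST has to go on.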
That’s in stark contrast with what experts said during a previous congressional hearing held by the House committee on oversight and reform in May. A panel of technical and legal experts urged lawmakers to consider a moratorium, preventing law enforcement and immigration enforcement from using the technology, over fear that the potential for inaccuracies and misuse was too great.
The DHS has been employing all sorts of biometric technology, such as DNA matching and fingerprint and iris scanning, for decades. So it’s unlikely that the government will scrap facial recognition, despite experts calling for a moratorium or cities like San Francisco banning it.
You can rewatch the hearing here. ®