Monument Advocacy memo on face recognition hearing, pt III
17 January 2020 16:52 GMT

On Wednesday, January 15th, 2020, the House Committee on Oversight and Reform held a hearing entitled “Facial Recognition Technology (Part III): Ensuring Commercial Transparency & Accuracy.” Chairwoman Carolyn Maloney (D-NY-12) and Ranking Member Jim Jordan (R-OH-04) emphasized that they were pleased to see bipartisan concern about the use of facial recognition technology by private entities and government agencies alike.  Specifically, they both expressed a keen interest in creating bipartisan legislation that would regulate this new technology to protect the privacy and civil liberties of individuals. 


During the question and answer period, both Democratic and Republican Committee members voiced their concerns over the lack of transparency in how facial recognition technology is being used by government and which agencies are using it. An example of misuse of facial recognition technology referenced several times by both Representatives and witnesses was the Baltimore Police Department employing the technology to match protestors with outstanding warrants. There was some disagreement among members on how accurately the media portrays the use of the technology, with Rep. Fred Keller (R-PA-12) asking if actual capabilities match those that are portrayed, to which witness Mr. Parker attested they do not.


Rep. Ayanna Pressley (D-MA-07) and Rep. Rashida Tlaib (D-MI-13) discussed their intention to propose legislation that would prohibit the use of facial recognition technology in federally subsidized housing units. The hearing closed with Ranking Member Jim Jordan (R-OH-04) and Chairwoman Carolyn Maloney (D-NY-12) calling for a bipartisan effort to create accountability and transparency around government agencies using facial recognition technology. 



Brenda Leong- Senior Counsel and Director of AI and Ethics, Future of Privacy Forum
Dr. Charles Romine- Director, Information Technology Laboratory, National Institute of Standards and Technology
Meredith Whittaker- Co-Founder and Co-Director, AI Now Institute, New York University
Daniel Castro- Vice President and Director of Center for Data Innovation, Information Technology and Innovation Foundation
Jake Parker- Senior Director of Government Relations, Security Industry Association (SIA)



Chairwoman Carolyn Maloney (D-NY-12):

Facial recognition technology is not ready for widespread use, despite its use in the private sector. 
In previous hearings on the subject, we first learned how the use of facial recognition technology can impact civil rights and liberties.  We also learned that government entities use this technology on a wide scale, but release little public information on how they use it or what privacy protections are in place.
This technology is completely unregulated at the federal level, which is dangerous.
Commercial facial recognition technologies have been found to misidentify racial minorities, women, children, and the elderly at significantly higher rates.
Congress has a responsibility to regulate this technology and protect the rights and privacy of the public.
The committee hopes to markup bipartisan legislation on this matter in the near future.


Ranking Member Jim Jordan (R-OH-04):

Since 2016, there has been a 20% annual growth in the use of facial recognition technology. 
Currently, there is little to no accountability for government agencies using this technology.  This technology can be incredibly helpful and is important to many areas, but there needs to be accountability.  We need to understand the implications of this technology for our Constitutional liberties.
There is a threat of a patchwork of laws being created that will lead to ambiguity.  We need a national standard for facial recognition technology. 
At the bare minimum, we need to understand how and when government is using this technology. 


Rep. Jimmy Gomez (D-CA-34):

The ACLU conducted a study in which the images of a number of Members of Congress, including myself, were falsely matched with those of people who had committed crimes.  This sparked my interest in the technology.  I then learned how extensive the use of this technology is despite the fact that it is underdeveloped. 
This issue does not rank highly on the priorities of Americans currently, but as more people are misidentified, it will gain importance. 
I appreciate both the Chair’s and the Ranking Member’s commitment to legislation.


Rep. Mark Meadows (R-NC-11):

This is an area where Conservatives and Progressives come together.  We are all committed to protecting our civil liberties and rights. 
We cannot only focus on the false positives.  Technology is moving so quickly that those false positives will be eliminated within months.  My concern is not that people will be improperly identified, but rather that companies will properly identify people and use that for malicious purposes. 
We must come together in a bipartisan manner to address this issue quickly.     




Brenda Leong- Senior Counsel and Director of AI and Ethics, Future of Privacy Forum

The power of information is a net benefit to society.  Facial recognition technology has the power to benefit society as well.
Understanding the basics of the technology is critical to regulating it.  Facial recognition technology involves matching two images, but does not impute other characteristics to them. 
The level of certainty for image verification depends on what the verification is used for.  For example, unlocking your phone will be a lower standard than identifying someone on a terrorist watch list.
The harms should not be understated.  People use this technology to unlock sensitive information, so it could be misused or used in unethical ways. 
There are many regulatory challenges given the scope of industries involved. 
Facial recognition technology offers benefits or risks depending on the context.  Some people object to certain uses while others are amenable to them.  Thus, affirmative consent is key.  Exceptions to these consent requirements should be firm and narrow.    

Dr. Charles Romine- Director, Information Technology Laboratory, National Institute of Standards and Technology

NIST standards help provide benchmark guidance to both industry and government agencies and ensure that the US has an international presence in the standards setting conversation.
False positives are more common in females, Asians, African Americans, Native Americans, Pacific Islanders, the elderly, and the young.  The data used to train the algorithms is key.  
False positives in one-to-many searches are especially important.  Even the most accurate algorithms have trouble identifying African Americans and women.  We can see from this experience that the algorithm used makes a big difference in the accuracy of the results.  
NIST is working to ensure a set of standards that can be used across industry.


Meredith Whittaker- Co-Founder and Co-Director, AI Now Institute, New York University

Facial recognition poses serious threats to civil liberties.  It does not work the way that private industry tells us.  The technology puts those who are already systemically targeted even more at risk.
Facial recognition technology is reliant upon the persistent collection of our data. 
Facial recognition technology is also being used to identify emotion.  It is very concerning that we are using this technology to infer internal feelings from external cues.  This capability is based on pseudoscience. 
Facial recognition is deployed by those who already hold power (police, employers, landlords, etc.) to impose on the lives of those who do not.
Congress is abdicating its responsibility if it does not act to regulate this technology.  It must stop the use of this technology in both government and the private sector.


Daniel Castro- Vice President and Director of Center for Data Innovation, Information Technology and Innovation Foundation

There are many positive uses of facial recognition technology, such as in hospitals, airports, and banks. 
A majority of Americans oppose strict limits on the use of facial recognition technology if it means that airports cannot use it.
Facial recognition is accurate.  The systems continue to get better every year.  Many groups have voluntarily adopted strict guidelines for the use of this technology.
Congress should pass comprehensive legislation on this area.  Opt-in consent should be used for sensitive matters, but it is not feasible in all areas, so it should not be required. 
Congress should also not establish a private right of action, because it would raise costs for businesses that would be passed on to consumers.
Congress should also direct NIST to expand its evaluation of facial recognition systems to reflect real-world uses, such as cloud-based systems. NIST should also develop a database for training purposes. 
Congress should fund deployment for facial recognition tech in government, such as improved security in federal buildings.
Congress should fund the development of facial recognition tech as part of its commitment to expanding AI.  This is important as China makes gains in this area.
Congress should establish a warrant requirement for law enforcement to track an individual’s movements through geolocation data derived from facial recognition.
Congress should continue to provide oversight to law enforcement.  This includes ensuring facial recognition tech used at protests is done with safeguards and scrutinizing the disparity of the use of tech in communities of color. It should require the DOJ to develop best practices for state and local authorities.
There are many unambiguously positive uses of this technology, so Congress should limit the opportunities for misuse without limiting the technology’s benefits. 


Jake Parker- Senior Director of Government Relations, Security Industry Association (SIA)

SIA members are some of the biggest developers of facial recognition technologies.  We believe that this technology should only be used in a lawful manner. 
There are many ways that this technology can be used, particularly in commercial settings.
Government agencies have made use of this technology for over a decade.  For example, it has been used to successfully identify trafficking victims.
Transparency is key.  We support sensible safeguards to ensure responsible use of the technology without limiting its positive uses, particularly those related to public safety.  We do not support moratoriums or bans on use of the technology.
We encourage bringing private sector developers into the conversation to provide a real-world perspective on the use of the technology.
Congress should provide NIST with the resources needed to continue to study this technology.
We do support a national data privacy policy for commercial use of this technology. 
Biometric tech companies have been working closely with NIST for years by turning over their data for study and public evaluation. 
The technology is becoming much more accurate.




Chairwoman Carolyn Maloney (D-NY-12):

Is it correct that you found false positives to be more likely among people of color?

Dr. Romine: Yes.

Is it correct that you found that women, children, and the elderly are more likely to be misidentified?
Dr. Romine: Yes.

When doing this study, were you using men’s faces?
Dr. Romine: We had a large dataset, so we were able to test across demographics.

Did these disparities occur broadly across the algorithms you tested?
Dr. Romine: They occurred broadly in some of the algorithms tested.

How much higher was the error rate when the algorithm was used to ID persons of color compared to white individuals?
Dr. Romine: The error rates for some of the algorithms can be significantly higher, from 10 to 100 times the error rates for the identification of Caucasians.

What was the difference in the identification rate for women?
Dr. Romine: It was a similar rate.  I’ll have to get back to you on the exact number, but it is substantial on some algorithms.

What about for African American women?
Dr. Romine: Black women have a higher error rate than either black faces broadly or women broadly.

What were they?
Dr. Romine: They were substantially higher, on the order of 10-100 times.
Being falsely identified can have significant implications for the lives of individuals.  I’m concerned that this technology has shown racial, gender, and age bias.  We should not rush to deploy this technology until it is ready. 


Rep. Virginia Foxx (R-NC-05):

How competitive is the facial recognition technology market?

Mr. Parker:  Extremely competitive.  Advances in imaging technology have made it more affordable and thus more appealing to consumers.

What about competition on accuracy rates? How can a consumer know more about the accuracy rates of different technologies?
Mr. Parker: Companies do compete in terms of accuracy.  They are competing to get the best scores on the NIST evaluations.  They also perform internal accuracy tests. 
What private sector best practices exist in securing data?
Mr. Parker: SIA is developing a set of best use practices.  Many of our members have developed such practices.  Many companies already have to comply with GDPR, which has provided some regulation as well.

Could you summarize the practices that exist for protecting personally identifiable information?
Mr. Parker: I’d be happy to provide more details later, but encryption of the data is key.  The face match system is also dependent on the proprietary software used to read the matches.

How have commercial entities conformed to new regulations from states and Europe?
Mr. Parker:  We’re already building these safeguards into products.  It’s good practice, and we anticipate a similar framework in the US.


Rep. Eleanor Holmes Norton (D-DC-At-large):

We’re playing catch up on this issue.  Private industry is significantly ahead of Congress.  Consumers are already embracing facial recognition in their own devices.  The public has become desensitized to cameras being everywhere.
Is Congress too far behind in addressing this issue since the public is not protesting the use of this technology in cell phones?

Ms. Leong:  Phones are a good example of the variation of this technology being used.  Apple’s facial recognition has a pretty robust standard. Other technology that identifies someone off a video feed is an entirely different risk and process and should be regulated in a different way.

Does the average consumer have any way to confirm that cell phone manufacturers are not storing biometric data on their servers?
Ms. Whittaker:  Many lawmakers and consumers do not have a way to confirm these things, because they are often hidden behind trade secrets, leaving no way to verify or regulate them.  For example, Amazon did not submit its algorithm to NIST for testing.  It is up to the company to release the information.  We as consumers don’t have many opportunities to hold companies accountable in this sense.


Rep. Clay Higgins (R-LA-03):

Congress should seek a means by which to ensure that Big Brother is not coming in tandem with the technology.  I’ll submit my questions to the witnesses after the hearing.  I support bipartisan legislation on this issue. 


Rep. Stephen Lynch (D-MA-08):

I’m concerned about TikTok since it is owned by the Chinese government and so many individuals have downloaded it.  Under Chinese law, they must cooperate with the Chinese government.  They are already censoring the app.  The coordination between the company and the Chinese government is a national security concern for us. 
The other concern is Apple’s lack of cooperation with Congress.  We want Apple to work with Congress the way TikTok works with the Chinese government.
How do we resolve this conflict and use the data to the benefit of society?

Dr. Romine:  We must balance the risks with the benefits of the policy decisions made.  The policy decisions are outside of NIST’s purview.
Ms. Whittaker: We don’t have the answer to this question because we haven’t done the research.  We don’t know if we can protect people’s privacy and liberty when these technologies are deployed in complex geopolitical settings.
Mr. Castro: We need to support encryption.  This gives the consumer the power over their data.


Rep. Michael Cloud (R-TX-27):

Is the government primarily using technologies that are developed by government or commercial entities?

Mr. Parker:  It is a mixture of both.  Federal agencies have developed their own systems over time, but it is increasingly moving to a combination.

What has been the industry response to the NIST report?
Dr. Romine:  Industry has been involved from the outset.  Industry feels challenged to do better. 
Mr. Castro:  Those who have participated are eager to improve, but those that haven’t are a different matter.  For example, Amazon and Apple need to be included as well.

How far away are we from getting this technology to an acceptable level?
Mr. Castro:  The best performing algorithms are already there. 
Mr. Parker:  We’re reaching that point now.  The industry is really focused on further reducing those false positive error rates, but it has already come very far since 2016.  We must consider what the consequences of error rates could be.  They matter more in some contexts than in others.

How do we get this right from our perspective? We don’t want to create an environment that prohibits natural market solutions or innovations.
Mr. Parker: The issues we have do not have to do with the technologies but rather with how they are used.  Solutions should deal more with the use of the technologies, such as the proposal being floated in the Senate.


Rep. Robin Kelly (D-IL-02):

Can you clarify the statement that “the public should not think of facial recognition technology as always accurate or always error prone”?  It should always be accurate.  How long will we have to wait until it is always accurate for all demographics?

Dr. Romine: I don’t know how long it will take to get there.  That statement refers to the fact that you have to look at the specific algorithm.  Some that we tested did show significant bias, but others did not.  We also need to know the context in which the algorithm is used.  Some cases have higher risks than others.  We also have to know the overall system.  NIST does not test systems that are deployed in the field and those matter as well.

Can you discuss the benefits of auditing facial recognition systems?
Dr. Romine: The most important to-do is to have accurate, unbiased data on these systems so that appropriate decisions can be made regarding regulations.  We need to know the performance of these systems.  We do not view the tests we do as an audit but rather as providing accurate and actionable information.
Ms. Whittaker: Auditing is important, but we need to understand how we’re measuring these systems.  The standards we measure them by matter greatly.  Those standards need to ask questions about how the systems are used and the data upon which they are based.