As emergency legislative responses to COVID-19 abate, lawmakers across the country—particularly in Congress—have started to turn their attention back to data privacy and cybersecurity, which were key areas of focus earlier in the year. A number of recent proposals specifically focus on regulating the use of facial recognition, with several providing for an outright ban on the use of such technology.
Federal
Several recent proposals in Congress for data privacy and security legislation would have significant implications for U.S. businesses, their online and internet-connected products and services, and relations with the federal government.
IoT Device Security
The Internet of Things (IoT) Cybersecurity Improvement Act of 2020 has passed the House (H.R. 1668) and remains pending in the Senate, where the companion bill (S. 734) cleared the Homeland Security and Governmental Affairs Committee in June 2019. If enacted, the Act would require the National Institute of Standards and Technology (NIST) to develop and publish (1) minimum security standards and guidelines on the use and management of IoT devices owned or controlled by a federal government agency, including requirements for managing cybersecurity risks; and (2) guidelines for disclosing security vulnerabilities of information systems, including IoT devices, by contractors (and subcontractors) who provide the technology to the agency.
Agency heads would not be able to procure, obtain, or use an IoT device that fails to meet the standards and guidelines, unless a waiver is determined to apply (currently, the standards and procedures for waiver differ under the Senate and House bills).
If enacted, the Act would complement California's IoT device security law (Cal. Civ. Code §§ 1798.91.04–1798.91.06), which went into effect on January 1, 2020. The California law, which among other things requires a manufacturer of IoT devices that are sold or offered for sale in California to equip the devices with a reasonable security feature or features that satisfy certain criteria, explicitly excludes from its scope any IoT device that is subject to security requirements under federal law, regulations, or regulatory agency guidance.
Individual Data Privacy and Security
Four Republican Senators introduced the Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act (S. 3663), which seeks to establish an overarching framework for consumer data privacy and security that would preempt state law. The Act would impose data transparency, integrity, and security obligations on entities subject to the FTC Act (15 U.S.C. § 41 et seq.), as well as on common carriers and non-profit organizations.
U.S. residents would be granted rights to be informed about, access, correct, delete, and port their personal data, and covered entities would be prohibited from denying products or services to an individual due to the exercise of any of those rights. The Federal Trade Commission would have rulemaking, supervisory, and enforcement authority. The bill is an updated version of the SAFE DATA Act staff draft (originally titled the U.S. Consumer Data Privacy Act) released in November 2019.
U.S. Senator Sherrod Brown (D-OH) released a discussion draft of the Data Accountability and Transparency (DATA) Act of 2020, which proposes a comprehensive framework for the collection, use, and protection of electronic personal data of individuals in both the public and private sectors, to be administered by a newly created independent agency that would have rulemaking, supervisory, and enforcement authority. The bill also would authorize state attorneys general to bring enforcement actions. It would not preempt more protective state laws.
The bill would prohibit any individual or entity that collects, uses, or shares a more than de minimis amount of personal data for purposes that are not solely personal (“data aggregators”) from processing electronic personal data except for certain enumerated permissible purposes, and would impose transparency, disclosure, accountability, and reasonable security obligations. Notably, the bill would provide civil and criminal penalties for the CEO and Board of Directors of data aggregators who knowingly and intentionally violate or attempt to violate the annual reporting and certification requirements.
Beyond granting individuals the right to be informed about, access, correct, delete, and port their electronic personal data, the bill—similar to the EU’s GDPR—also would allow individuals to challenge the basis for data collection and request a human review of automated decisions. The bill seeks to provide civil rights protections by prohibiting the use of electronic personal data to discriminate in the provision of housing, employment, credit, insurance, and public accommodations; prohibiting the use of electronic personal data to deprive individuals of the free and fair exercise of their right to vote; and enacting a general ban on the use of facial recognition technology.
State
New York
The New York legislature passed A06787-D/S05140-B, which would impose a moratorium on the purchase and use of facial recognition and other forms of biometric identification by all elementary and secondary schools until July 1, 2022. The legislation was prompted by the adoption of facial recognition technology by the Lockport City School District in all of its elementary and secondary school buildings in 2019, which was challenged in a lawsuit filed by the New York Civil Liberties Union on behalf of parents opposing its use.
The legislation also would require the New York State Department of Education to study the issue of biometric identification in schools and craft regulations. If signed by Governor Cuomo, the law would be the first in the nation to specifically remove the technology from schools.
Oregon
The City of Portland, Ore., made national history when its City Council unanimously adopted two ordinances banning the use of facial recognition technology by both city government and private entities. The Council’s rationale for passing the ordinances rested on general concerns about bias in the technology, privacy and civil liberties issues, and a lack of transparency and accountability regarding its operation and use.
Ordinance No. 190113 prohibits all city bureaus from acquiring or using the technology, subject to narrow exceptions for personal verification, and goes into effect immediately. It mirrors measures recently adopted in other cities (including Boston, San Francisco, and Oakland, Calif.) and states (California, Oregon, New Hampshire, and Washington) that similarly prohibit or narrowly limit government and law enforcement use of facial recognition technologies.
Ordinance No. 190114, which prohibits private entities from using the technology in public spaces subject to narrow exceptions for personal verification, is the first and only such general ban in the country. It goes into effect on January 1, 2021.
Washington
A draft bill for the Washington Privacy Act of 2021 has been released for public review and comment, although the bill cannot be formally introduced until January of next year. This is the third successive iteration of the Act, the most recent version of which failed to pass during the legislative session earlier this year due to disagreement among legislators about the inclusion of a private right of action. (We provided an in-depth review of the 2020 Act here.)
Many provisions of the draft 2021 Act are the same as or very similar to the 2020 bill. Notable differences include the omission of specific regulations for facial recognition (although biometric data falls within the category of “sensitive information” that requires consent for processing) and the addition of a section addressing data processing during a public health emergency in both the private and public sectors. Perhaps most important, the 2021 Act currently vests the state Attorney General with sole enforcement authority, although it also directs the Attorney General to submit a report evaluating the liability and enforcement provisions by July 1, 2022.
If passed and signed, the Act would go into effect 120 days after enactment.