Privacy and technological innovation are interests that sometimes conflict. These considerations often present a challenging balancing exercise for technology companies developing new products, as well as for the companies and individuals that elect to use those products. The market is seeing more political debate, legislation and litigation focused on these issues; the discussion is no longer confined to technology companies.
On May 7, 2019, a settlement was entered into between a group of residents in a New York City apartment building and their landlord, requiring the landlord to allow the use of traditional mechanical keys instead of solely providing a smart lock that requires the installation of a mobile application. The dispute arose when the landlord removed the mechanical locks securing the entry to the elevator lobby and installed a smart lock system. A group of residents initiated litigation alleging that, among other things, requiring the residents to use the smart lock and download a mobile application to operate it violated applicable law.1
The residents alleged that, compared to a mechanical key and lock, the use of the smart lock in conjunction with a mobile application raised privacy and safety concerns, including the collection of location data by the smart lock maker and the fact that users under the age of 13 may not use the smart lock under the related terms of service. While this case originated in a local housing court and the settlement is not binding for future cases, it highlights that issues related to the use of Internet of Things ("IoT") and smart home devices may need to be carefully considered by everyone, not just technology companies. In the past year, there have been several attempts by various legislative bodies to address emerging concerns arising from the use of technology, including IoT and smart home devices.
On September 28, 2018, California became the first state to approve a cybersecurity law that would regulate connected devices, including most IoT and smart home devices. Starting on January 1, 2020, Senate Bill No. 327 ("SB 327") requires that a manufacturer that sells or offers to sell a "connected device" in California must equip the device with reasonable security features designed to protect the device and its information from unauthorized access, modification or disclosure.2 Specifically, SB 327 provides that reasonable security features must be:
- Appropriate to the nature and function of the device;
- Appropriate to the information it may collect, contain or transmit; and
- Designed to protect the device and any information contained therein from unauthorized access, destruction, use, modification or disclosure.3
For a connected device that is equipped "with a means for authentication outside a local area network," the device is required to either (a) have a pre-programmed password that is unique to each device manufactured or (b) contain a security feature that requires a user to "generate a new means of authentication" before access is granted to the device for the first time.4 Practically speaking, this means the user must either be provided with a unique password for each device or be required to set up a unique password for each device upon connecting for the first time. These requirements appear to target security vulnerabilities that arise from fixed default or pre-programmed login credentials.
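As a rough illustration only (not legal guidance, and not an implementation prescribed by SB 327, which states the required outcome rather than any particular design), the two compliance paths described above might be sketched as follows; all class and method names here are hypothetical:

```python
import secrets

class DeviceAuth:
    """Hypothetical sketch of SB 327's two authentication paths:
    (a) a factory password unique to each unit, and
    (b) forced setup of a new credential before first access."""

    def __init__(self):
        # Path (a): a per-device factory password, unique to each unit
        # manufactured, rather than a fixed default shared product-wide.
        self.factory_password = secrets.token_urlsafe(12)
        self.user_password = None

    def first_login(self, factory_password, new_password):
        # Path (b): the user must "generate a new means of
        # authentication" before access is granted the first time.
        if factory_password != self.factory_password:
            raise PermissionError("invalid factory credential")
        if not new_password or new_password == self.factory_password:
            raise ValueError("a new, distinct password is required")
        self.user_password = new_password
        return True

    def login(self, password):
        # After setup, only the user-chosen credential grants access;
        # the factory credential no longer works.
        if self.user_password is None:
            raise PermissionError("setup required before access")
        return password == self.user_password
```

Either path removes the fixed default credential that the statute targets: no two units share a password, and remote access is impossible until a unique credential exists.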
SB 327 broadly defines a "connected device" as "any device, or other physical object that is capable of connecting to the internet, directly or indirectly, and that is assigned an Internet Protocol address or Bluetooth address."5 Accordingly, the law potentially covers more than consumer-connected devices and could include industrial and enterprise IoT devices.
The law contains certain limitations. Notably, SB 327 does not create a private cause of action for violations.6 Furthermore, the bill does not extend to unaffiliated third-party software or applications that a user chooses to add to the connected device, nor does it require app stores to review or enforce compliance.7 The law also does not apply to any connected device that is subject to security requirements under federal law, nor does it apply to activities subject to the federal Health Insurance Portability and Accountability Act of 1996 ("HIPAA") or the Confidentiality of Medical Information Act ("CMIA"), including certain activities of covered entities, providers of healthcare, business associates or healthcare service plans.8 The law also does not limit the authority of any law enforcement agency to obtain connected device information from a manufacturer.9
While the law is only applicable in California, device manufacturers obligated to adopt the new security requirements would likely include these security features in products for customers in other states as well. The bill is likely to be the first of many steps to increase security requirements for IoT and smart home devices. In fact, the City of San Francisco voted on May 14, 2019 to block the use of facial recognition tools by police to search for criminal suspects.10 San Francisco is not alone in trying to address these concerns as IoT devices become more ubiquitous. Massachusetts has a pending bill that would put a moratorium on facial recognition and other remote biometric surveillance systems,11 and the proposed federal Commercial Facial Recognition Privacy Act of 2019 would ban the collection and sharing of data by users of commercial facial recognition technology without consent.12 While these measures have not been passed into law, they highlight a broader trend toward greater regulation of IoT and smart home devices.
It is important not only for technology companies that develop IoT and smart home devices, but also for any party currently implementing, or considering implementing, these devices, to keep abreast of the latest legal developments. They should carefully design and adopt any safeguards or usage practices necessary to stay compliant. Both the technology itself and its use and application should be evaluated in each instance. Any resulting regulations should balance the safeguards they create against the need to allow further innovation to occur.
1 In re Ronald Sharpe, Marybeth Mckenzie, Tony Mysak, Charlotte Pfahl and Daniel Schneider v. 517-525 West 45 LLC, Offir Naim and Shai Bernstein and Department of Housing Preservation and Development of the City of New York and New York City Loft Board, No. HP 6211/18.
2 2018 Cal. Legis. Serv. Ch. 886 (S.B. 327) (to be codified at Cal. Civ. Code 1798.91.04).
3 S.B. 327 (to be codified at Cal. Civ. Code 1798.91.04(a)).
4 S.B. 327 (to be codified at Cal. Civ. Code 1798.91.04(b)).
5 S.B. 327 (to be codified at Cal. Civ. Code 1798.91.05(b)).
6 S.B. 327 (to be codified at Cal. Civ. Code 1798.91.06(e)).
7 S.B. 327 (to be codified at Cal. Civ. Code 1798.91.06(a)); S.B. 327 (to be codified at Cal. Civ. Code 1798.91.06(b)).
8 S.B. 327 (to be codified at Cal. Civ. Code 1798.91.06(d)); S.B. 327 (to be codified at Cal. Civ. Code 1798.91.06(h)).
9 S.B. 327 (to be codified at Cal. Civ. Code 1798.91.06(g)).
10 https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html
11 Bill S. 1385.
12 https://www.congress.gov/bill/116th-congress/senate-bill/847/text
This publication is provided for your convenience and does not constitute legal advice. This publication is protected by copyright.
© 2019 White & Case LLP