We have all clicked buttons on webpages and apps agreeing to or opting out of certain uses of our personal data. But what are the legal principles that are driving the companies running these platforms to create those buttons, update their privacy policies or otherwise engage users about the use of data?
Online users increasingly expect companies to be thoughtful about their use of data, and the Federal Trade Commission (FTC), among other regulators, has increased its scrutiny of smaller companies. Since 2014, the FTC’s data security program has also received more resources, leading to the imposition of more fines.
If my business is online, is it subject to all 50 states’ privacy laws and regulations?
Generally, no. Most laws fall into three different categories: breach response laws, which govern what you need to do if personal information has been compromised; data security laws, which govern what protections you need to have in place for personal information; and privacy laws, which govern how you use personal information and what disclosures or consents you have to provide. The first two types of laws typically relate to more sensitive types of personal information, but the third usually applies to any information that is capable of being linked to an individual. The applicability of these laws is often triggered based on where the individuals whose information you are collecting reside, but there may also be other applicability thresholds.
All states have breach response laws, and about half have data security laws. These laws do not have volume thresholds, so if a company collects any covered data, it will be subject to that law regardless of whether it has any physical presence in the state.
Thus far, only five states — California, Colorado, Connecticut, Utah and Virginia — have passed comprehensive privacy laws, which regulate how companies collect and use personal data. These laws, however, only apply to companies that meet certain thresholds relating to how much personal data is collected and/or gross annual revenue. In other words, most young startups will not be subject to these laws if they don’t meet those thresholds.
But dozens of bills proposing new privacy laws have been introduced throughout the country. So unless the federal government passes a law with preemptive effect, the patchwork is likely to expand. That means startups should assess their compliance requirements based on their specific operations and data processing, considering not only where they are today but also when their businesses might cross those thresholds, so they are ready when the time comes.
What are the next “hot” enforcement areas that young companies should start preparing for?
While there are many areas ripe for enforcement, the three that we’d highlight at this point are biometrics, web-scraping and regulation of crypto.
Everyone has a million passwords these days, so users increasingly prefer biometric recognition software as an easy way to log in to apps or websites. Biometrics include fingerprints, voiceprints, and facial and retinal measurements. But new biometric identifier laws put developers at risk. These laws require companies to obtain prior written consent before collecting biometric data that can be used to identify an individual, as well as to publicly post certain aspects of their retention policies.
In 2008, Illinois became the first state to enact a biometric data privacy law, the Biometric Information Privacy Act (BIPA). Although it has been on the books for a while, the law has gained traction only within the past few years. In 2019, the Illinois Supreme Court’s decision in Rosenbach v. Six Flags Entertainment Corp. broadened the law’s impact by making clear that a user does not need to suffer an actual injury to have the right to sue a company for violating BIPA. That means a user can bring a claim solely because a company collected the information without the user’s consent — even if the collection had no impact on the user.
The upshot has been a flood of lawsuits with big dollars at stake. For example, in 2020, Facebook agreed to pay $650 million to settle a BIPA class action, Patel v. Facebook, Inc., resolving claims that it collected user biometric data without consent.
Currently, only Illinois, Texas and Washington have enacted biometric laws, and only Illinois allows its citizens to sue noncompliant companies. But in 2022, seven states — California, Kentucky, Maine, Maryland, Massachusetts, Missouri and New York — introduced biometrics bills generally modeled on BIPA. There is even potential for a national biometric privacy law. Senators Jeff Merkley (D-Ore.) and Bernie Sanders (I-Vt.) introduced the National Biometric Information Privacy Act of 2020, but as of this post’s publication, it has not been enacted.
While most states have not enacted biometric privacy laws (or laws with serious repercussions for noncompliant companies), new companies need to keep an eye on this evolving area of the law, which could create exposure for young companies. Even if a young company is never sued under a biometric law, it may still need to consider these laws to make itself attractive to investors or potential buyers. Compliance with biometric laws may be an area that investors increasingly diligence, both to confirm compliance with the law and to ensure that their portfolio companies respect user privacy as a matter of reputation.
Startups and young companies often develop business models focused on optimizing consumer interactions with Amazon, Facebook and other major platforms. These startups use application programming interfaces (APIs) or web-scraping technologies that have been the source of significant litigation. Web-scraping involves the mass collection of data from publicly accessible sources. While the law on web-scraping is still somewhat murky, egregious cases of web-scraping will spark privacy and security concerns and could lead to litigation.
The crypto space is also ripe for increased litigation. Numerous crypto thefts have already spurred lawsuits, as well as the likelihood of federal regulation — including know your customer (KYC) rules for crypto exchanges. Crypto startups should monitor changing KYC requirements to maintain compliance, as regulators are going to clamp down on anonymous crypto transactions.
What language should companies watch out for in data processing agreements that vendors or enterprise customers send?
Startups should be wary of data security “traps” that companies often add to data processing agreements — requirements that go beyond what the law requires and increase legal risk. These additional requirements are often unreasonable in light of the contractual processing activities. For example, enterprise clients may impose unrealistic timelines for reporting security incidents to the client, often 24 or 36 hours. Unless there is a regulatory need for such a quick turnaround, and the clause is limited to breaches that are confirmed or reasonably suspected to involve client data, startups should push for more workable reporting times. Likewise, startups should be wary of onerous indemnification obligations, particularly in connection with data breaches, that go well beyond the value of the underlying contract. In this regard, it can be very helpful to have these agreements reviewed by a knowledgeable attorney who can help the startup avoid these “traps.”
If a startup is using a service provider, the startup may want to insist that the service provider provide a list of sub-processors, or obtain permission before engaging any sub-processor that will handle personal data on the startup’s behalf. Often, the service provider will not pass along the same contractual security requirements to its sub-processors without specific instruction to do so. Startups should also be on the lookout for language allowing the service provider to move personal data outside the United States, which can trigger foreign data privacy laws.
Kim’s Korner is a series of articles by Ballard Spahr’s emerging company and venture capital attorneys. The column is not legal advice. The substance of the column is derived from our experience working with founders and details many of the current critical issues facing startups.