In 2016, researchers on communications and internet policy presented a paper on “the biggest lie on the internet.” The falsehood in question? “I agree to these terms and conditions.”
The paper made a splash because it confirmed, in the starkest terms possible, what most of us already suspected about how carelessly users guard their personal privacy online. The study demonstrated that when presented with terms of service that would sign away one’s first-born child to an online service provider, 98% of users clicked “yes.” Needless to say, the results inspired little confidence in the state of tech privacy.
What is Data Privacy?
Though most people understand privacy intuitively, the term has no universal definition. Policymakers became particularly aware of this fact in the 1960s, when early computers raised questions about the ethics of centralizing personal information. A legal scholar named Alan Westin set forth the following definition, which remains relatively well accepted:
Privacy is “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.”
Tech Privacy as a Public Policy Issue
In today’s terms, policy questions central to tech privacy include:
- What information can/should internet companies collect about users?
- Who owns the personal information about an individual? (Read: Who is able to monetize it?)
- Do users need to consent to the collection and/or sharing of their personal data? If so, what are the criteria for meaningful consent?
- Can personal data collected in one context be used for an unrelated purpose?
- How transparent must tech companies be with users about their business models?
- How are misuses of data identified? How are laws enforced?
- What counts as personal data? Do social media posts, location data, and/or search engine history qualify?
Regulating Data Privacy
Scholars like Westin were only half successful in pushing Congress to protect the “claim” of people to their personal information. The U.S. Privacy Act of 1974, passed in the aftermath of Watergate, imposed limits on how the government could gather, store, and disseminate citizens’ data. For better or worse, no such constraints were applied to the private sector. Decades later, there is still no comprehensive privacy law that governs how personal data is collected or used in the United States. This puts the US in stark contrast with the European Union (EU), which notably enacted its General Data Protection Regulation, or GDPR, in 2018.
Federal-Level: Stagnant Sector-Specific Regulation
There are limited situations where privacy regulations do exist at the federal level. In these instances, regulations take the form of sector-specific laws which impose requirements on organizations that meet certain criteria. Some examples include:
- HIPAA for healthcare providers, health insurance companies, and healthcare clearinghouses
- GLBA for financial institutions, including businesses that provide financial products or services to consumers
- FERPA for all K-12 schools, colleges, and universities that receive federal funds
In today’s data economy, this piecemeal approach leaves many gaps in consumer protection. Entities that fall outside regulators’ definitions of “covered entity” can largely do what they want with consumer data. Health and wellness apps illustrate the discrepancy: although products like fertility monitors, weight loss trackers, and cognitive behavioral therapy journals all collect medical information from individual users, they are not subject to HIPAA.
As the shortcomings of sector-specific laws have grown clearer, the pressure on policymakers to act has intensified. Over the last five years, the U.S. Government Accountability Office and the Federal Trade Commission (FTC) have both made repeated calls for Congress to enact comprehensive privacy legislation. The FTC in particular has repeatedly cited the failure of its fines to curb the behavior of the largest tech companies, which can readily withstand billion-dollar penalties.
State-Level: Taking the Lead on Comprehensive Privacy Laws
In the last three years, local jurisdictions across the US have signaled that they will no longer wait patiently for federal leadership on privacy regulation. Many states have begun experimenting with different approaches to tech privacy, creating a patchwork of emergent laws for companies to navigate.
As of March 2024, fourteen states have enacted comprehensive privacy laws. Five are currently in effect, in California, Colorado, Connecticut, Utah, and Virginia. By early 2025, seven more will become enforceable, in Delaware, Iowa, Montana, New Hampshire, New Jersey, Oregon, and Texas.
The sudden onslaught of new privacy laws may seem daunting for businesses seeking to ensure their compliance. However, upon closer inspection, most jurisdictions are following a similar script.
Broadly, there are two mechanisms of action for any privacy regulation: guaranteeing certain rights for individual users and requiring particular actions from tech companies. So far, most states have opted to combine aspects of these approaches in rather similar ways.
Generally, consumers are granted the rights to access, correct, delete, and opt out of the sale of their data. Tech companies, for their part, are required to disclose the information they collect about users and to provide pathways for consumers to exercise their data protection rights.
Where state privacy laws differ, the disparities relate to scope and enforcement.
Virginia’s Consumer Data Protection Act (VCDPA)
Virginia’s Consumer Data Protection Act (VCDPA) is considered by many privacy advocates to be a relatively lax and limited approach to regulation. The legislation does provide consumers with the rights to access, correct, delete, and obtain a copy of their personal data. However, the definition of “personal data” under the VCDPA is dramatically narrowed by excluding “any information a business has reasonable grounds to believe falls within the public domain.” Given this definition, the VCDPA treats a person’s social media posts, for example, as fair game.
Additionally, the VCDPA does not grant individual users a private right of action, leaving enforcement solely to the attorney general. Virginia’s approach to tech privacy also includes exemptions for financial institutions, entities covered by HIPAA, nonprofits, and colleges and universities.
California Consumer Privacy Act (CCPA)
By contrast, the California Consumer Privacy Act (CCPA) is more robust. For example, the legislation offers greater specificity regarding the structure and contents of privacy policies, as well as the procedures through which users may receive notice of, access, delete, and opt out of the sale of their personal data. Further, the CCPA grants a private right of action to consumers who believe tech companies have broken the law in certain circumstances, making California the only state to offer such a provision.
CCPA has also set a precedent by seeking to fill some of the gaps created by the federal government’s approach to regulating privacy based on “covered entities.” Rather than granting exemptions for entire categories of organizations, California’s statute limits exemptions to particular data types. So, while a hospital system might not have to worry about CCPA requirements for its HIPAA-protected patient files, such a loophole would not apply to the other types of information it collects.
What Is on the Horizon for Tech Privacy Policy?
For now, California has the toughest approach to privacy policy in the US. Other states are in the process of proposing stricter privacy laws. The Maine legislature is currently considering two competing bills: the Maine Consumer Privacy Act (MCPA) and the Data Privacy and Protection Act (DPPA). While the former is closely aligned with states like Virginia, the latter would introduce compliance dimensions that are unprecedented in any US jurisdiction. Notably, the DPPA includes “data minimization” requirements, which present a challenge to the fundamental business models of many of the largest tech companies.
Policies like Maine’s DPPA would likely have been dismissed as completely untenable a few years ago. However, recent developments suggest the internet economy may be ripe for change. Even in the absence of regulatory pressure, businesses are feeling the negative effects of amassing troves of personal data: in 2022, over 80% of organizations suffered a data breach, and cybercrime is projected to cost the world $10.5 trillion annually by 2025. Users are also becoming less tolerant of companies that fail to take their privacy concerns seriously, with 65% of consumers indicating that “misuse of personal data” is the top reason they would lose trust in a brand.
Of course, the same factors encouraging states like Maine and California to push the envelope may also move the needle in Congress. In 2022, the American Data Privacy and Protection Act (ADPPA) passed the House Committee on Energy and Commerce with near unanimity. The ADPPA represented the most promising effort to date to regulate consumer data privacy at the federal level. Opinions vary on prospects for the legislation in 2024, with a main point of contention being whether the statute would preempt stricter state privacy laws.
Monitor Tech Privacy Policy With Plural
Plural is the legislative tracking tool of choice for policy teams seeking to monitor tech privacy policy. With Plural, you’ll:
- Access superior public policy data
- Be the first to know about new bills and changes in bill status
- Streamline your day with seamless organization features
- Harness the power of time-saving AI tools to gain insights into individual bills and the entire legislative landscape
- Keep everyone on the same page with internal collaboration and external reporting all in one place
Create a free account or book a demo today!