Data v. Privacy

Virginia’s CDPA Offers a GDPR-lite Look-and-Feel

March 18, 2021 by Eric Reagan

While Virginia’s new Consumer Data Protection Act (VCDPA) remains a far cry from the EU’s comprehensive GDPR, it offers a less “unique” approach to reinventing the privacy wheel than what California has given us with the CCPA/CPRA.

The VCDPA goes into effect on January 1, 2023.

Far from the 99 Articles of the GDPR, the VCDPA wraps up its legislation in 11 short sections, which I’m reading across roughly 13 cleanly formatted pages. Right off the bat, we find GDPR-familiar definitions and naming conventions for “controllers” and “processors,” and those familiar definitions give us a fair understanding of who we’re talking about.

Scope of Application

Akin to California, however, the VCDPA has scope limitations: it applies to companies that control or process personal data of at least 100,000 consumers — or at least 25,000 consumers if the company derives over 50% of its gross revenue from the sale of personal data. Further, there are carve-outs for HIPAA and other federal privacy laws.
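For the technically inclined, the two applicability prongs reduce to a simple predicate. Here is a minimal sketch in Python (the function and parameter names are mine, not the statute’s, and the HIPAA and other carve-outs are ignored entirely):

```python
def vcdpa_applies(consumers: int, revenue_share_from_sales: float) -> bool:
    """Toy sketch of the VCDPA applicability thresholds (illustrative only).

    consumers: Virginia consumers whose personal data the company
        controls or processes during a calendar year.
    revenue_share_from_sales: fraction of gross revenue derived from
        the sale of personal data (0.0 to 1.0).
    """
    # Prong 1: controls or processes personal data of 100,000+ consumers.
    if consumers >= 100_000:
        return True
    # Prong 2: 25,000+ consumers AND over 50% of gross revenue
    # from the sale of personal data.
    return consumers >= 25_000 and revenue_share_from_sales > 0.50
```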

Personal Data Rights of Consumers

The VCDPA § 59.1-573 sets out consumer personal data rights, providing for rights to know, access, correct, delete, obtain a copy, and opt out of processing.

Data Controller Responsibilities

Under the VCDPA, “‘Consent’ means a clear affirmative act signifying a consumer’s freely given, specific, informed, and unambiguous agreement to process personal data relating to the consumer. Consent may include a written statement, including a statement written by electronic means, or any other unambiguous affirmative action.” VCDPA § 59.1-571.

Controllers have purpose limitation responsibilities “to what is adequate, relevant, and reasonably necessary in relation to the purposes for which such data is processed.” VCDPA § 59.1-574(A)(1).

Likewise, controllers are forbidden from processing “personal data for purposes that are neither reasonably necessary to nor compatible with the disclosed purposes for which such personal data is processed, as disclosed to the consumer.” VCDPA § 59.1-574(A)(2). Additional processing beyond that scope requires the controller to obtain the consumer’s consent for those additional purposes.

Additionally, controllers are required to “[e]stablish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data.” VCDPA § 59.1-574(A)(3). Virginia missed a golden opportunity to codify the CIA Triad (i.e., choosing the term “accessibility” over “availability”).

VCDPA § 59.1-574(A)(4) requires consent for processing sensitive data and adopts COPPA as the standard for processing a child’s data.

No Waiver of Consumer Rights

VCDPA § 59.1-574(B) outlaws any attempt to waive a consumer’s rights – such provisions “shall be deemed contrary to public policy and shall be void and unenforceable.”

Controller Privacy Notice to Consumers

Controllers are required to “provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes:”

  1. The categories of personal data processed by the controller;
  2. The purpose for processing personal data;
  3. How consumers may exercise their consumer rights pursuant to § 59.1-573, including how a consumer may appeal a controller’s decision with regard to the consumer’s request;
  4. The categories of personal data that the controller shares with third parties, if any; and
  5. The categories of third parties, if any, with whom the controller shares personal data. VCDPA § 59.1-574(C).

Personal Data Processing Opt-Out

Controllers are required to “clearly and conspicuously disclose” any sale of personal data to third parties or any processing of personal data for targeted advertising. The disclosure must include information on the manner in which a consumer may opt out of either. VCDPA § 59.1-574(D).

Consumer Rights Portal Requirement

VCDPA § 59.1-574(E) requires controllers to establish a “secure and reliable means for consumers” to exercise their rights under the VCDPA. This method (or portal) must be described in the controller’s privacy notice.

Controllers can’t require consumers to create a new account with the controller to use its portal; however, they may require consumers with an existing account to use that account.

Controller vs. Processor Responsibilities

Processors take orders from the controller but must assist the controller in complying with the VCDPA, including using appropriate technical and organizational controls, as well as aiding in breach notification duties and “providing necessary information” for the controller’s data protection assessments.

Data processing agreements between a processor and controller must “include requirements that the processor shall:”

  1. Ensure that each person processing personal data is subject to a duty of confidentiality with respect to the data;
  2. At the controller’s direction, delete or return all personal data to the controller as requested at the end of the provision of services, unless retention of the personal data is required by law;
  3. Upon the reasonable request of the controller, make available to the controller all information in its possession necessary to demonstrate the processor’s compliance with the obligations in this chapter;
  4. Allow, and cooperate with, reasonable assessments by the controller or the controller’s designated assessor; alternatively, the processor may arrange for a qualified and independent assessor to conduct an assessment of the processor’s policies and technical and organizational measures in support of the obligations under this chapter using an appropriate and accepted control standard or framework and assessment procedure for such assessments. The processor shall provide a report of such assessment to the controller upon request; and
  5. Engage any subcontractor pursuant to a written contract in accordance with subsection C that requires the subcontractor to meet the obligations of the processor with respect to the personal data. VCDPA § 59.1-575(B).

The “subsection C” referenced above states:

Nothing in this section shall be construed to relieve a controller or a processor from the liabilities imposed on it by virtue of its role in the processing relationship as defined by this chapter.

VCDPA § 59.1-575(C).

Determining Controller vs. Processor Status

Determining whether a person is acting as a controller or processor with respect to a specific processing of data is a fact-based determination that depends upon the context in which personal data is to be processed. A processor that continues to adhere to a controller’s instructions with respect to a specific processing of personal data remains a processor.

VCDPA § 59.1-575(D).

Data Protection Assessments

Certain data processing activities require a controller to conduct a data protection assessment (DPA):

  1. Processing for targeted advertising
  2. Sale of personal data
  3. Profiling, when there is a risk of:
    1. unfair or deceptive treatment / unlawful disparate impact
    2. financial, physical, or reputational injury
    3. invasion of privacy
    4. other substantial injury
  4. Sensitive data processing
  5. Any processing with a heightened risk of harm. VCDPA § 59.1-576(A).

DPAs must weigh benefits and risks, as mitigated by the controller’s safeguards. The use of de-identified data, consumer expectations, and the context of the processing in the controller/consumer relationship must all be factored into the DPA. VCDPA § 59.1-576(B).

The Attorney General can request DPAs that are relevant to investigations; however, the DPAs remain confidential and exempt from Virginia FOIA requests, and disclosure to the AG doesn’t amount to a waiver of attorney-client or work-product privileges. VCDPA § 59.1-576(C).

De-Identified Data

The controller in possession of de-identified data shall:

  1. Take reasonable measures to ensure that the data cannot be associated with a natural person;
  2. Publicly commit to maintaining and using de-identified data without attempting to re-identify the data; and
  3. Contractually obligate any recipients of the de-identified data to comply with all provisions of this chapter.

VCDPA § 59.1-577(A).

Legal, Safety, and Research Limitations

There are several common carveouts found in VCDPA § 59.1-578 that exempt actions taken by controllers and processors in order to comply with laws and investigations, defend legal claims, protect life, respond to security incidents, and engage in research.

Additionally, controllers and processors remain free to process data for internal research, product recalls, identifying and repairing technical errors, and “internal operations that are reasonably aligned with the expectations of the consumer.”

Wrapping up the limitations, VCDPA § 59.1-578(F) notes that processing under this section must be:

  1. Reasonably necessary and proportionate to the purposes listed in this section; and
  2. Adequate, relevant, and limited to what is necessary in relation to the specific purposes listed in this section.

The controller bears the burden of demonstrating that an exemption under § 59.1-578 applies.

Enforcement and Penalties

There is no private right of action for violations of the VCDPA. Exclusive authority to enforce the VCDPA lies with the Virginia Attorney General.

The AG must give a controller or processor 30 days’ written notice of the specific provisions the AG alleges are being violated. The controller/processor then has 30 days to cure the violation and provide a written statement of the cure (and a promise not to violate again). If cured, the AG will take no further action.

If a violation continues beyond the cure period (or if a previously cured violation recurs), the AG may bring an action against the controller/processor. The AG can seek an injunction and/or civil penalties of up to $7,500 for each violation, along with reasonable expenses and attorney fees.

VCDPA § 59.1-580

Consumer Privacy Fund

Civil penalties, expenses, and attorney fees collected by the AG will go into a newly-established Consumer Privacy Fund, which will be exclusively used to support the AG’s enforcement of the VCDPA.

VCDPA Work Group

A work group will be formed to review the VCDPA and “issues related to its implementation.” This work group will be composed of:

  1. Secretary of Commerce and Trade
  2. Secretary of Administration
  3. Attorney General
  4. Chairman of the Senate Committee on Transportation
  5. Representatives of businesses who control or process personal data of at least 100,000 persons
  6. Consumer rights advocates

The work group’s “findings, best practices, and recommendations” regarding the implementation of the VCDPA are set to be delivered to Virginia Senate and House committees by November 1, 2021.

VCDPA § 59.1-581

Filed Under: US Privacy Law Tagged With: State Privacy Laws, US Privacy Law, VCDPA, Virginia

Patching People When We Default to Truth

March 18, 2021 by Eric Reagan

When I was in college, I worked in loss prevention at JC Penney. I took the job very seriously with my $5.45/hr wage as I walked around the store in plain clothes pretending to be a customer but watching everyone else.

One day, I watched a woman in the baby clothing section grab a stack of clothes from the shelf, stuff them under her shirt, and walk out the door. As I approached her car, she was emptying the clothes into the back seat. I walked her back into the store and we waited for the police to arrive in the security office.

As I began filling out some required paperwork, she pleaded that she had done nothing wrong. She told me she had brought the clothes with her to return and decided not to. I started to wonder, could she be telling the truth? Did I miss something? She sounded so sincere… Was I wrong?

I Can’t Believe My Eyes!

My inner self was trying to rationalize that I was mistaken about what I had just seen with my own eyes — that this woman was telling the truth and my eyes were the liars! I snapped out of second-guessing what I had just watched happen and handed her over to the police.

Over the course of a short career in law enforcement and throughout depositions and negotiations as an attorney, this would not be the last time that my inner self would default to truth in the face of contradicting objective facts.

The Weakest Link and Truth-Default

The tired axiom that people are the weakest link in securing our data conveniently paints users as gullible or ignorant. However, these human failures have much deeper roots in psychology. No matter how well-trained users are, we are hardwired to believe other people by default.

In Talking to Strangers, Malcolm Gladwell examined the case of the decorated Queen of Cuba, Ana Montes. She worked her way up to become a senior analyst at the US Defense Intelligence Agency and the resident expert on Cuba, all the while delivering classified information to the Cuban Intelligence Service.

Her deceit was spectacular. Yet when she was cornered during an interrogation, several warning signs that she was a spy went unpressed because the DIA counterintelligence officer rationalized her responses and defaulted to truth. As a result, her treason continued for years.

Here is a DIA counterintelligence officer (with years of specialized training and experience in catching spies and liars) who chose to reconcile incriminating evidence and behavior with a story that didn’t add up. He simply rationalized the lies because we all default to truth.

Why?

Why do we default to truth?

According to Dr. Timothy Levine, operating on a truth-default basis “enables efficient communication and cooperation, and the presumption of honesty typically leads to correct belief states because most communication is honest most of the time.” Believing people is efficient! The downside, of course, is that we humans are particularly vulnerable to the occasional deceit.

The simple truth, Levine argues, is that lie detection does not—cannot—work the way we expect it to work. In the movies, the brilliant detective confronts the subject and catches him, right then and there, in a lie. But in real life, accumulating the amount of evidence necessary to overwhelm our doubts takes time.

Malcolm Gladwell, Talking to Strangers

Social Engineering

As much as we train corporate IT systems users and the average consumer, people will invariably remain the weak link in the chain if we don’t provide compensating controls. Even the best and brightest are susceptible to deceit from the skilled social engineer.

In his book, Ghost in the Wires, notorious hacker and phone phreaker Kevin Mitnick walks us through his years of escapades as he wreaked havoc on Pacific Bell’s phone networks, among other companies and systems. He did so by understanding how the system worked and the technical and administrative controls put in place to specifically prevent him from gaining access.

In one exchange, Mitnick called a Pacific Bell switching center and encountered Bruce, a tech he had previously duped. Mitnick wanted to trace a phone number with Bruce’s help. While Bruce didn’t recognize Mitnick’s voice, he had been stung by social engineers before and requested a callback number. Unfortunately for Bruce, Mitnick was prepared with a Pacific Bell internal number that he had previously patched to his cell phone.

Now, how are we to protect access to data with effective cybersecurity measures at scale when a skilled social engineer engages an employee or officer? How do we guard against a Kevin Mitnick?

I don’t think we can beat all of them – certainly not the Mitnicks of the world.

Compensating Controls for Being Human

We’re all susceptible to being deceived – from second-guessing our own eyes to trying to help out a remote technician who needs to fix a problem. However, better training and awareness can help limit the losses caused by people — and stronger technical and administrative controls can further mitigate the failures of people who naturally default to truth.

While people may never develop a skillset to intuitively detect deception (we are nearly universally terrible at detecting deception), social engineering awareness training coupled with appropriate technical and administrative controls can help us recognize situations that are prompting unauthorized data access attempts. Moreover, those data security measures and systems need to compensate for humanity’s truth-default bias and our inability to effectively detect deception. We can’t continue to blame people for being human.

References:

  • Talking to Strangers by Malcolm Gladwell
  • Ghost in the Wires by Kevin Mitnick
  • Levine, T. (2014). Truth-Default Theory (TDT): A Theory of Human Deception and Deception Detection. Journal of Language and Social Psychology, 33, 378–392. https://doi.org/10.1177/0261927X14535916
  • DePaulo, B.M., & Pfeifer, R.L. (1986). On-the-Job Experience and Skill at Detecting Deception. Journal of Applied Social Psychology, 16, 249–267. https://doi.org/10.1111/j.1559-1816.1986.tb01138.x

Filed Under: Data Protection Tagged With: data protection, privacy, security, social engineering

EARN IT: The Perfect Trojan Attack on American Privacy

May 26, 2020 by Eric Reagan

We are in a decades-old battle between government surveillance and individual privacy. The Crypto Wars of the 90s are back with a vengeance as Congress wields the EARN IT Act in its attack on the oft-misunderstood Section 230.

47 U.S.C. § 230 provides a relatively simple and logical safe harbor; however, it is frequently twisted in statements by politicians and in news articles to be something that it is not — a shield for Facebook and other evil social media services to hide behind as they publish their illegal content, conspire to misinform, and defame innocent citizens.

Section 230(c)(1) states:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

This single sentence provides for so much unfettered innovation and social interaction on the internet in the past 20+ years, as well as promises for unforeseen innovation in the future. Without this safe harbor, it’s hard to see how the internet works going forward. As you can glean from the title of his book, Cybersecurity Law Professor Jeff Kosseff of the U.S. Naval Academy buttresses the importance of Section 230’s safe harbor in The Twenty-Six Words That Created the Internet.

Section 230 lets people talk freely on Facebook and Twitter, upload videos to YouTube and Vimeo, and post comments on personal and commercial blogs and websites. These are the ways we connect on new and innovative online services. It helped make the world smaller. Users of these services can post content to them and, if that content is somehow actionable in a civil claim, the service is not held liable solely because the content appeared on it.

The application of Section 230’s safe harbor is a logical application of traditional distributor and publisher roles — yet it allows for online services to engage in content moderation without assuming liability for users’ content. See 47 U.S.C. § 230(c)(2).

You create a service that you open for users to make accounts and post stories, pictures, or videos. You run the service and add features, run some ads, and make a profit. You moderate the user-generated content. You try to respect people’s differing views and keep a light hand on moderation (although not required) but you take down content that violates your own community guidelines. (Or maybe you don’t moderate at all… because you don’t have to.**) It’s a fairly logical conclusion to expect the service shouldn’t be responsible for everything that users say on your service. And that’s what the Section 230 safe harbor is supposed to do.

**There are narrow exceptions to Section 230’s protections (e.g., federal criminal law, intellectual property, etc.). See 47 U.S.C. § 230(e).

What Section 230 doesn’t do, however, is provide a safe harbor for a first party to post content without consequence. Facebook, Google, The New York Times, my blog, and your website are all directly responsible for the content that they create. However, this simple, yet crucial, distinction seems nearly impossible for many policymakers and news outlets to comprehend.

In an interview with The New York Times earlier this year, presidential candidate Joe Biden epitomized the misconceptions of Section 230, “[The Times] can’t write something you know to be false and be exempt from being sued. But [Mark Zuckerberg] can. The idea that it’s a tech company is that Section 230 should be revoked, immediately should be revoked, number one.”

The distinction that Facebook is not creating the content that appears on its site is somehow too complex for Section 230 critics to grasp.

This misstatement of fact and law is repeated over and over by politicians and news outlets, from both the right and the left. Depending on which action the social media service of choice has taken, both Democrats and Republicans overstate the function of Section 230, revealing a fundamental misunderstanding of how the internet and the Section 230 safe harbor work. Of course, watching members of Congress question Mark Zuckerberg about Facebook’s privacy practices makes it easy to see just how out of touch decision-makers can be when it comes to technology.

But what does Section 230 have to do with encryption?

Absolutely nothing.

While Section 230 is rooted in limiting liability for the acts of third parties in publicly published content, the backdoor to encryption the government seeks is all about accessing private communications. EARN IT is just the carrot to lead tech companies to the backdoor.

The pruning of Section 230 and erosion of encryption is a middle-of-the-road issue for politicians. The EARN IT Act is bipartisan legislation — just as restrictions on encryption from the 70s, 80s, and 90s were bipartisan. The EARN IT Act was written by Republican Senator Lindsey Graham and Democratic Senator Richard Blumenthal.

While their stated purpose behind the bill is “to encourage the tech industry to take online child sexual exploitation seriously,” one only has to look at the stakeholders’ records and the short history of the public availability of encryption technology in the US to see that this is another play for power to keep private individuals from having privacy.

A Brief History of the Crypto Wars

If Congress succeeds in passing the EARN IT Act, it may indirectly achieve the goal of keeping end-to-end encryption out of the hands of the average person — a goal it has sought since the 1970s as its regulation of encryption technology under the Arms Export Control Act began to slip from its grasp.

Prior to Whitfield Diffie’s revolutionary discovery of public-key cryptography in 1976, along with developments culminating in the Diffie-Hellman key exchange and the RSA algorithm, encryption was essentially a state secret. Yes, the very thing that we use to safeguard our private communications and valuable data today was too dangerous for Americans to have just a few short decades ago. The practice and academic subject matter were closely guarded and monitored by the NSA.
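To see what made that discovery so destabilizing, here is the core of the Diffie-Hellman exchange in a few lines of Python. The numbers are toy values for illustration only; real deployments use enormous, carefully chosen primes and secure random keys:

```python
# Toy Diffie-Hellman key exchange (illustrative numbers only).
p = 23   # public prime modulus
g = 5    # public generator

a = 6    # Alice's private key (never sent)
b = 15   # Bob's private key (never sent)

A = pow(g, a, p)   # Alice transmits g^a mod p  -> 8
B = pow(g, b, p)   # Bob transmits   g^b mod p  -> 19

# Each side combines its own secret with the other's public value.
alice_shared = pow(B, a, p)   # (g^b)^a mod p
bob_shared   = pow(A, b, p)   # (g^a)^b mod p

assert alice_shared == bob_shared == 2   # same secret, never transmitted
```

An eavesdropper sees p, g, A, and B, but recovering the shared secret from those values requires solving the discrete logarithm problem, which is computationally infeasible at real-world key sizes. Two strangers can now create a secret in full public view, with no government in the loop.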

Even after the Diffie-Hellman and RSA revelations were made known to the world, encryption wasn’t taken seriously by corporate America, let alone private citizens. Researchers and corporate frontrunners who dared to experiment with encryption were harassed under the guise of weapon export regulations in order to keep them quiet or to limit the strength of their encryption so that the NSA could still crack it and, therefore, monitor ostensibly private communications around the world.

The United States’ effort to limit encryption, particularly transborder encryption, was quite successful through the early 1990s. For decades it was illegal to export encryption algorithms beyond US borders; doing so could result in a federal criminal charge for exporting illegal arms. But then the internet’s promise began to emerge.

As the federal grasp on encryption began to crack in the 1990s, the government conspired with AT&T to push to market an encrypted telephone with a backdoor that was accessible only by the government. The NSA created a “Clipper Chip” that would attach to AT&T’s phones, provide encryption for the calls on the device, and provide the government with a backdoor to the encryption so it could listen in. To gain access, the government had to obtain a search warrant for a wiretap. Then, the government would use an “escrowed” encryption key to access the encrypted phone line. The Clipper Chip project was a commercial failure after much public policy and technical criticism.
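Conceptually, key escrow amounts to shipping every session key twice: once for the parties on the call and once, sealed under a government-held key, for any eavesdropper with a warrant. Here is a rough sketch of that idea in Python. This is not the actual Clipper design (the real system used the NSA’s Skipjack cipher in a tamper-resistant chip and a Law Enforcement Access Field); it just illustrates escrow using the `cryptography` package:

```python
# Simplified key-escrow sketch: the idea behind Clipper, not its design.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()    # held by the government "escrow agent"
session_key = Fernet.generate_key()   # per-call key protecting the conversation

# The call itself is encrypted under the session key.
ciphertext = Fernet(session_key).encrypt(b"hello, private world")

# The backdoor: the session key also travels alongside the call,
# encrypted under the escrow key (Clipper's access field played this role).
leaf = Fernet(escrow_key).encrypt(session_key)

# With a warrant, the escrow agent recovers the session key...
recovered = Fernet(escrow_key).decrypt(leaf)

# ...and the "encrypted" call is an open book.
assert Fernet(recovered).decrypt(ciphertext) == b"hello, private world"
```

Whoever holds the escrow key holds every conversation, which is exactly why the scheme drew so much technical and public-policy criticism.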

Coincidentally, US Representatives Chris Cox and Ron Wyden were advocating for what would eventually become the Section 230 safe harbor at the same time (in 1995) as the intelligence community was losing hold of its encryption monopoly.

The direct approach to limiting the encryption used by American citizens ultimately lost steam when courts held it unconstitutional in several different legal challenges. When the federal government was challenged over its attempts to limit the export of books containing encryption algorithms, US courts held that encryption code is a free speech matter and the government’s restriction was unlawful. Government attempts to restrict the teaching of encryption in college courses attended by non-American students failed in a similar fashion. There are situations in which encryption export is still limited today; however, those cases are narrowly limited to circumstances involving export to specific countries and specific military technology or programs. Largely, Americans are now free to use end-to-end encryption to communicate in private.

Today, thanks to Edward Snowden’s revelations of bulk data collection under PRISM, we know just how exposed our personal communications have become. Additionally, the DEA used administrative subpoenas for two decades to justify bulk data collection in the war on drugs. Our private data is further weakened by the plethora of data mining practices and the overwhelming number of data breaches that seem to push into the millions of records on a weekly basis. However, it finally seems like average Americans are starting to take note of government and corporate encroachment on privacy. Encryption is now, more than ever, the key to taking back our privacy.

But Everyone Hates Facebook

After having its hands smacked by US courts in the 90s and early 2000s, the federal government fully knows that it can’t directly regulate encryption. Public backlash and the First Amendment pushback would be too great to overcome.

If, however, the government can use Section 230 as a weapon to point at Facebook and other online services, it can all but mandate a backdoor into every communication service that Americans use. If they refuse, Facebook and its ilk will be sued into oblivion for actions from which Section 230 would otherwise provide a shield.

The FOSTA-SESTA debacle fully demonstrates that companies will roll over and comply with whatever restrictions are necessary to keep their Section 230 safe harbor intact. In fact, as FOSTA-SESTA demonstrated, online services will go the extra mile by restricting additional, lawful speech to avoid even the possibility of losing their safe harbors or (if the risk of harm from the long reach of FOSTA-SESTA appears too great) they will simply shut down altogether.

Whether Facebook is the flavor of the day or another company has angered a politician or group of citizens, it’s easy to find a tech company to poke. And because nobody is bothered if Zuck and company have to eat a pile of crap, it’s easy for politicians to shove EARN IT down their throats while the rest of America cheers them on.

Facebook and its ilk have made plenty of mistakes and engaged in plenty of egregious anti-privacy and anti-competitive behavior. However, when they build the privacy backdoor at the behest of the government, they get to keep their Section 230 immunity, and American citizens have one more surveillance tool added to the law enforcement and intelligence communities’ arsenal.

You don’t have to beat them if you just make them join you.

The EARN IT Act is the next era of the Crypto Wars. The generation before us fought for the right to encryption tech by reinventing it in public during the 70s and 80s and then fighting for the right to freely use it in the 90s.

They won. We won.

Encryption technology is ours to freely use.

Unless Americans respond to the EARN IT Act with the same outrage that we directed at the Clipper Chip, we’re handing our right to privacy over to a government that spent years post-9/11 collecting bulk data from American citizens under the pretense of “national security.”

Today, the threat to encryption is rationalized by a more sympathetic plea to “think of the children.”

As Andre McGregor, a former FBI cyber agent, points out, “The American public will never side with the terrorist or child molester [by] saying they have rights worthy of protection, and DOJ knows this.”

Attorney General Bill Barr is clear that encryption is a problem, and he would most certainly conclude that a best practice under the EARN IT Act would be to restrict the use of encryption among Americans. In a press release last week, Barr lashed out at Apple over its use of encryption without a backdoor on the iPhone:

The bottom line: our national security cannot remain in the hands of big corporations who put dollars over lawful access and public safety. The time has come for a legislative solution.

In a fiery rebuke of Apple and Facebook, EARN IT Act sponsor Lindsey Graham told the companies, “You’re going to find a way to do this [encryption backdoor] or we’re going to do this for you.”

Persistent privacy advocate and Section 230 architect Senator Ron Wyden called the bluff on EARN IT, saying, “This terrible legislation is a Trojan horse to give Attorney General Barr and Donald Trump the power to control online speech and require government access to every aspect of Americans’ lives.”

While the bill’s supporters are bipartisan, opponents of EARN IT are also forming unusual alliances to condemn the legislation, much as the Clipper Chip once united Rush Limbaugh and the ACLU.

Now, the ACLU and the libertarian/conservative Americans for Prosperity have released a joint statement in opposition to the EARN IT Act. Anytime a proposed law puts the ACLU and the AFP on the same page, Americans should pay attention to what their legislature is trying to do.

The EARN IT Act is Clipper Chip 2.0

The latest attempt to backdoor encryption is far more subtle than what the NSA attempted in the 1990s, when AT&T tried to market encryption to the masses while the intelligence and law enforcement communities pinky-promised to peek inside only if they had a warrant.

To be clear, the EARN IT Act doesn’t expressly call out encryption; however, given the players involved and their recent saber-rattling, the Act is squarely aimed at American citizens’ use of end-to-end encryption.

The Act would establish a National Commission on Online Child Sexual Exploitation Prevention. This sounds like a laudable goal — a whole “National Commission” that aims to prevent online child sexual exploitation. Who wouldn’t support that?

And now the hook is in…

Just think of the children. Please! To fight against EARN IT is to attack the children.

Anything “for the children” is worth the cost — whatever it may be. Right?

This National Commission is going to protect children more than every other law that already criminalizes the online sexual exploitation of children. At least that’s what Senators Graham and Blumenthal would have us believe.

This commission is to be made up of 19 members, which include the Attorney General, the Secretary of Homeland Security, and the Chairman of the FTC (or their appointed representatives). Additionally, each of the Senate and House majority and minority leaders gets to appoint four members. So, that’s 16 members appointed by legislative leaders to go along with the three named members.

Once assembled, “the Commission shall develop and submit to the Attorney General recommended best practices that providers of interactive computer services may choose to engage in to prevent, reduce, and respond to the online sexual exploitation of children….”

These best practices will then be introduced as bills in the House and Senate. Once the best practices become law, tech companies will have to comply with the best practices or lose Section 230 immunity.

The well-founded fear is that a critical “best practice” is a backdoor to Americans’ data and communications. End-to-end encryption can’t be a best practice because the government can’t see inside.

Again, this is a bipartisan affair. Those 16 members appointed by the legislature will certainly be screened for their opinions on privacy and a willingness to do whatever it takes for the children. Then, their recommendations become a rubber stamp in Congress because there is no way a majority of Congress will vote against the children.

The Crypto Wars are a War for Privacy

If we’ve learned anything over the past 40 years, it’s that the Crypto Wars have really been the Privacy Wars. We have a government with interests that are often at odds with the interests of Americans’ privacy.

If Americans have more privacy, the government has less power. The government justifies its interests and attempts to align them with American sentiment (“but look at the terrorists” or “think of the children”). At times, this works, and Americans meet the government in the middle to surrender a little more privacy.

Prohibitions on the export of encryption technology make sense (when officials claim secret privileges, “if you only saw the classified things I see”) until the people want to protect their personal emails and corporate networks from prying eyes. And when the people ask for more encryption technology, the government gives them technology with backdoors — the Clipper Chip.

Nothing could be more patriotic than a PATRIOT ACT! Sign us up, and go America! Mission Accomplished! But then it leads to a bipartisan slippery slope of mass surveillance on Americans instead of just the terrorists from whom we were supposed to be protected. The reach of the Patriot Act persists today. The Senate, in a bipartisan fashion with 10 Democrats joining the GOP, just rejected an amendment that would have limited the warrantless collection of browsing and search history under the Patriot Act.

And now, think of the children! Of course, we want to protect the children. How could anyone oppose this? This new commission assembled by politicians who supported all the prior attacks on Americans’ privacy will now define industry best practices for private technology and communication companies. There’s no way that could go awry.

But seriously, what about the children?

It’s a problem. A very bad problem. One that the EARN IT Act won’t solve.

Senator Ron Wyden (because, of course, it’s Wyden) and Representative Anna G. Eshoo recently introduced the Invest in Child Safety Act that “would direct $5 billion in mandatory funding to investigate and target the pedophiles and abusers who create and share child sexual abuse material online.”

Despite clear congressional mandates, the Justice Department not only never requested additional funding to address this growing scourge, the agency’s current budget actually cuts more than $60 million from programs to prevent child exploitation and support victims. Instead, it has demanded backdoors in encryption, which would weaken security for every American, and make it easier for pedophiles and other predators to find and exploit children and other vulnerable populations.

This is the kind of legislation Americans should demand. It directly addresses the problem by putting the resources closest to the need. There is no need for a hand-picked bureaucratic committee with travel allowances, expense accounts, and a dubious charge to create “best practices.”

The tech community identifies an order of magnitude more leads for child sexual abuse material offenders than law enforcement can investigate. We need more boots on the ground to fix the problem. The Wyden/Eshoo Invest in Child Safety Act does that. The EARN IT Act does not.

Filed Under: US Privacy Law Tagged With: Arms Export Control Act, Backdoor, Child Safety Act, Clipper Chip, Crypto Wars, EARN IT Act, Encryption, Lindsey Graham, Ron Wyden, Section 230
