
Data v. Privacy


HB 4938: The Great Firewall of Michigan

September 21, 2025 by Eric Reagan

Nearly thirty years ago, the US Congress tried to ban porn on the Internet via the Communications Decency Act of 1996 (CDA). The legislation was introduced with the goal of preventing harm to minors who could access objectionable material through home computers. Although Bill Clinton signed it into law, it failed miserably: a unanimous Supreme Court struck it down in Reno v. ACLU as a gross overreach that restricted protected speech in violation of the First Amendment.

Reno v. ACLU be damned, on September 11, 2025, Michigan Representative Josh Schriver introduced HB 4938 as the Anticorruption of Public Morals Act in the Michigan legislature. The bill essentially seeks to ban (1) all pornography and (2) any transgender-associated content.

Notably, it does so by forcing ISPs to filter content and (perhaps the worst idea in the bill) block VPNs.

Prohibited Content

All of the content restrictions are defined as “prohibited content” under the bill, which includes an itemized list along with a catchall of “any other pornographic material.” The item addressing transgender-associated content is defined as content that:

“Is a depiction, description, or simulation, whether real, animated, digitally generated, written, or auditory, that includes a disconnection between biology and gender by an individual of 1 biological sex imitating, depicting, or representing himself or herself to be of the other biological sex by means of a combination of attire, cosmetology, or prosthetics, or as having a reproductive nature contrary to the individual’s biological sex.”

We’ve seen a lot of shots at content restrictions through age verification laws in recent years and there appears to be a path to success in passing First Amendment muster under the current Supreme Court makeup. Notably, the Michigan Anticorruption of Public Morals Act is not an age verification law – it is an outright content ban for all Michigan residents.

In the face of age verification laws, internet users in 2025 have shrugged off such laws by turning on their VPNs. The new Michigan bill aims to close the VPN loophole by requiring ISPs to block them as “circumvention tools.”

VPNs, Proxy Servers, and Encrypted Tunnels, Oh My…

Under the proposed law, “circumvention tools” are defined as “any software, hardware, or service designed to bypass internet filtering mechanisms or content restrictions including virtual private networks, proxy servers, and encrypted tunneling methods to evade content restrictions.” As part of the content restrictions, Michigan ISPs are required both to filter prohibited material and to block circumvention tools.

“An internet service provider providing internet service in this state shall implement mandatory filtering technology to prevent residents of this state from accessing prohibited material. An internet service provider providing internet service in this state shall actively monitor and block known circumvention tools.”

ISPs’ duty to filter content and block known circumvention tools carries fines of up to $500,000 per violation. With penalties like that, do we really think ISPs are going to risk being conservative with their content filtering or VPN blocking strategies?

While the definition limits the restricted tools to those “designed . . . to evade content restrictions,” the technical filtering and blocking mandates on ISPs open a privacy and security minefield.
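To make the minefield concrete: on an HTTPS connection, about the only plaintext an ISP can filter on is the TLS SNI hostname. Here is a purely illustrative sketch of that kind of filter; the blocklisted domain and the matching logic are my assumptions, not anything in the bill:

```python
# Hypothetical sketch of crude ISP-side filtering under a mandate like
# HB 4938. The blocklist entry and matching rules are illustrative only.

BLOCKLIST = {"example-prohibited.test"}  # hypothetical blocklisted domain

def sni_blocked(server_name: str) -> bool:
    """Block a connection if the TLS SNI hostname -- roughly the only
    plaintext an ISP sees on an HTTPS connection -- matches a blocklisted
    domain or any subdomain of it."""
    return any(
        server_name == domain or server_name.endswith("." + domain)
        for domain in BLOCKLIST
    )

# The blocklisted domain and its subdomains are caught...
assert sni_blocked("example-prohibited.test") is True
assert sni_blocked("cdn.example-prohibited.test") is True
# ...while everything else passes.
assert sni_blocked("unrelated.example") is False
```

Even this much depends on the hostname being visible at all: with Encrypted Client Hello, the SNI itself is encrypted, leaving the ISP with little more than IP addresses to act on.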

The Great Firewall of Michigan

HB 4938 is channeling its inner China with its implementation mandates for platforms and ISPs. It requires private ISPs to do the government’s bidding by spying on the state’s residents. It will whittle away security and privacy in the name of public morality and cries of “oh, think of the children!” Quite literally:

These measures defend children, safeguard our communities, and put families first.

Who maintains the list of known circumvention tools? There’s no way that list won’t be overbroad, because surely there’s no legitimate use for a VPN other than violating public morality.

Technically speaking, how do you block VPNs? By common port numbers? Easily mitigated by moving to uncommon ports. By VPN server IP addresses? There’s no way that list stays up to date. There is no silver bullet for blocking VPNs at the ISP level.
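To see why port-based blocking in particular is a dead end, here is a minimal, purely illustrative sketch. The port numbers are the tools’ well-known defaults; the filter logic itself is my assumption, not anything from the bill:

```python
# Hypothetical sketch (not from HB 4938): a naive ISP filter that drops
# traffic to well-known VPN ports, illustrating why port blocking fails.

WELL_KNOWN_VPN_PORTS = {
    1194,   # OpenVPN default
    51820,  # WireGuard default
    500,    # IKEv2/IPsec key exchange
    4500,   # IPsec NAT traversal
}

def naive_port_filter(dst_port: int) -> bool:
    """Return True if a packet to this destination port would be blocked."""
    return dst_port in WELL_KNOWN_VPN_PORTS

# An OpenVPN server on its default port gets blocked...
assert naive_port_filter(1194) is True
# ...but the same server moved to TCP 443 looks like ordinary HTTPS
# and sails straight through, as does any peer on a random high port.
assert naive_port_filter(443) is False
assert naive_port_filter(38291) is False
```

The operator of the VPN server changes one line of config; the ISP is back to square one.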

Virtually every company in the US uses VPNs, proxies, or other technology that could be classified as “circumvention technology.” Cloudflare, Zscaler, ZeroTier, Tailscale, et al. will need to be whitelisted for Michigan companies to give users access to the Internet. And that’s just the big cloud providers. Will every small business and remote worker in Michigan need to whitelist their IPs for site-to-site and remote VPN access? Good luck to remote workers at home whose ISPs hand out IP addresses via DHCP.

Better yet, how about we just break the encryption so the state can see into all the traffic and catch the content criminals? Because there’s really no way to be sure without inspecting every user’s traffic.

That’s where this is headed. It’s what Congress tried in 2020 with the EARN IT Act, and what the Clipper Chip was about in the ’90s.

Bills like this are bad ideas from the start and make technology more dangerous for individual and enterprise security and privacy.

Easier traffic inspection = no privacy online.

VPNs banned = no security on lower trust networks and a nightmare of whitelisting the “good ones.”

Of course, there’s artificial intelligence in it…

No techno-mumbo-jumbo bill would be complete without at least some AI worked in there somewhere.

HB 4938 does not disappoint with its content moderation requirements for platforms to use “artificial intelligence driven filtering technology for preemptive removal of prohibited material.”

Awesome. We can just feed AI the definition of prohibited content and expect it to implement this for us.

This fails to pass First Amendment muster, right?

To anyone with an honest reading of First Amendment jurisprudence, this bill goes nowhere. And if it did pass, it would be struck down with no hope of becoming an enforceable law. Right?

If something like this sees the light of day, watch out for the definition of “prohibited content” to creep. Hate speech. Extreme viewpoints. Disorderly speech.

And, whoever holds power will define what is acceptable content for the masses.

Nothing was your own except the few cubic centimetres inside your skull.
-George Orwell, 1984

Filed Under: Cybersecurity, Surveillance Tagged With: age verification, Cybersecurity, Encryption, privacy, security, vpn

DSAR Metric Tracking for Privacy Programs

September 7, 2025 by Eric Reagan

If you’re involved in running a privacy program, you likely already monitor some data related to consumer requests. The recent enforcement kerfuffle between the Connecticut AG and TicketNetwork highlighted a few metrics that should be on your list if you aren’t already tracking them.

What happened?

On November 9, 2023, the Connecticut AG sent a cure notice to TicketNetwork, which is an online marketplace for buying and selling live event tickets, for violations of the Connecticut Data Privacy Act (CTDPA).

The cure notice flagged “deficiencies in the company’s privacy notice and [gave] the company the chance to come into compliance without penalty. In particular, the company’s privacy notice was largely unreadable, missing key data rights, and contained rights mechanisms that were misconfigured or inoperable. Under the CTDPA’s cure period, TicketNetwork had 60 days— until January 8, 2024— to resolve each deficiency.”

  • December 31, 2023: TicketNetwork responded to the Connecticut AG that it had cured its deficiencies under the CTDPA.
  • January 8, 2024: Cure period expired with uncured deficiencies from AG’s perspective.
  • February 2, 2024: AG sent another letter outlining the deficiencies with a deadline to respond by March 1, 2024.
  • March 1, 2024: Response deadline passed with no communication by TicketNetwork.
  • March 12, 2024: The AG sent a follow-up and received no response.
  • April 16, 2024: The AG contacted TicketNetwork yet again to ask when it would respond, and the company responded the same day with a link to its updated privacy notice, which it represented was now in compliance.
  • June 17, 2024: The AG sent another letter to TicketNetwork outlining the continued deficiencies and gave a July 2, 2024 deadline.
  • June 24, 2024: TicketNetwork responded that it was updating its privacy notice to address the deficiencies and asked for an extension to July 31, 2024. The AG declined the extension.
  • November-December 2024: TicketNetwork continued to address deficiencies regarding CTDPA compliance.
  • May 29, 2025: TicketNetwork, through its CFO, entered into an Assurance of Voluntary Compliance (AVC) with the Connecticut AG, outlining specific tasks it would perform to comply with the CTDPA.

The AVC offers some general compliance requirements regarding consumer-facing privacy notices and DSAR submissions, which track closely with the statutory language. Additionally, it outlined some problems to avoid that were tailored to TicketNetwork’s issues around formatting and technical implementation. Notably, “TicketNetwork shall not publish a privacy notice that:

  1. uses large blocks of text that consumers are unlikely to read;
  2. uses small font that is difficult to read;
  3. uses unnecessarily complicated language, including legal or technical jargon;
  4. uses mechanisms that make it difficult for a consumer to exercise their consumer rights, such as by requiring unnecessary steps or by using confusing interfaces or forms.”

Privacy Notice Review and DSAR Metrics

To ensure that TicketNetwork maintained ongoing compliance, the Connecticut AG included some privacy program governance and tracking requirements in the AVC.

Specifically, the AG required TicketNetwork to “regularly review and revise its public-facing privacy notice to reflect TicketNetwork’s data collection and processing activities. This review shall be conducted on at least an annual basis and upon any material change to its privacy practices.” (emphasis added.)

Additionally, the AG included reporting requirements with TicketNetwork’s first report due within 180 days. The report must document the consumer rights requests that TicketNetwork receives and then break down each category of request (e.g., right to access, right to delete, etc.) with the following metrics:

  1. the number of requests received;
  2. the mode by which they were received (e.g., by e-mail);
  3. the average length of time taken to complete the requests;
  4. whether the requests were fulfilled or rejected and, if rejected, the reason for the rejection;
  5. the number of appeal requests received;
  6. the average length of time taken to respond to the appeals; and
  7. whether the appeal requests were granted or denied and, if denied, the reason for the denial.

The AVC goes on to require TicketNetwork to maintain regular monitoring of these metrics, which are to be made available to the AG upon request in the future.

For folks working in privacy programs, if you’re not already tracking all of the above metrics, it sure seems like a good time to start. All of these actions support a minimally-functioning privacy program. Given that they matter to an AG’s office enforcing an active US comprehensive privacy law, this should be low-hanging fruit to adopt.

NIST Privacy Framework Mapping

As a bonus, if you follow the NIST Privacy Framework 1.0, those metrics from the Connecticut AG all map directly to the Govern function in the Monitoring and Review category, GV.MT-P7 (Policies, processes, and procedures for receiving, tracking, and responding to complaints, concerns, and questions from individuals about organizational privacy practices are established and in place.)

Likewise, the requirement to annually (or upon material change) review and revise the privacy notice maps to CM.PO-P1 (Transparency policies, processes, and procedures for communicating data processing purposes, practices, and associated privacy risks are established and in place.) and GV.MT-P2 (Privacy values, policies, and training are reviewed and any updates are communicated.)

You can read the full Assurance of Voluntary Compliance below.

TicketNetwork Assurance of ComplianceDownload

Filed Under: US Privacy Law Tagged With: Connecticut, CTDPA, State Privacy Laws, US Privacy Law

19 Laws and Counting . . . by way of The Brussels Effect

January 5, 2025 by Eric Reagan

2025 is kicking off with several new comprehensive state privacy laws taking effect – with Delaware, Iowa, Nebraska, New Hampshire, and New Jersey’s laws all taking effect in January and a total of 19 state laws either in effect or with a countdown ticking to their effective dates.

While many of these laws look and feel like copycats of other state laws that are already in effect, the real catalyst for each of these laws can be traced back to EU privacy regulation (with the 2018 effective date of GDPR being the biggest driver). This is the Brussels Effect at work and on full display in the US privacy patchwork. If you haven’t read Anu Bradford’s book The Brussels Effect and you work in privacy, it’s worth adding to your reading list this year. I read it a couple years ago and have looked at state privacy laws and industry lobbying differently since.

In the book, Bradford breaks down just how the EU drives privacy globally through both the de facto Brussels Effect (companies adopting practices that comply with EU privacy regulation on a global scale) and the de jure Brussels Effect (jurisdictions outside the EU embracing GDPR-like comprehensive privacy laws). While the book applies the Brussels Effect to other regulations as well, its five key elements make privacy regulation a perfect storm for “unilateral regulatory globalization”:

  1. Market Power
  2. Regulatory Capacity
  3. Stringent Standards
  4. Inelastic Targets
  5. Non-Divisibility

Market Power

The EU’s population of roughly 450 million people makes up a substantial market force as global companies seek to offer their goods and services to this massive market. As a result, the regulatory power of the EU reaches broadly around the globe to impact the compliance and governance of US and other third countries’ firms. 

Regulatory Capacity and Stringent Standards

The EU’s “[r]egulatory capacity refers to [its] ability to promulgate and enforce regulations,” and is “often closely associated with . . . the propensity to promulgate stringent rules.” The capacity to regulate privacy by a relatively small regulatory body like the EU is bolstered by the delegation of GDPR enforcement to EU member states. The EU’s preference for stringent standards stems from its desire to treat privacy as a human right and to offer strong protections for its citizens through regulation. See The Brussels Effect at 41 (quoting Commission President Jean-Claude Juncker, “I will not sacrifice Europe’s safety, health, social and data protection standards . . . on the altar of free trade.”)

Inelastic Targets

Bradford’s inelastic target, as it relates to privacy law regulation, is demonstrated through “[t]he inelastic nature of consumer markets[, which] does not leave producers with a choice regarding the jurisdiction; they cannot ‘shop’ for favorable regulations without losing access to the regulated market.” The Brussels Effect at 48.

Non-Divisibility

Finally, the non-divisibility of global privacy regulation is demonstrated by the inefficiency of embedding different data protection tools, technology, and processes for different jurisdictions on a global scale. Of course, there are situations where tech companies use local cloud environments for EU data subjects; however, it’s harder to scale product features and teams across the entire tech stack than it is to embrace a data protection scheme that works across the globe. Additional fallout from that non-divisibility encourages global firms to lobby for consistent regulations that will impact local competition in the US and other third countries.

19 Laws and Counting

While California was a bit of a wildcard from a lobbying and drafting perspective, industry has continued to lean in, influencing everything from the failed Washington Privacy Act that was reborn in Virginia to every other state along the way, yielding a thematically consistent GDPR-lite framework with a regulatory lineage tracing back to Brussels.

The European Union recently updated its privacy law through the passage and implementation of the general data protection regulation, affording its residents the strongest privacy protections in the world. Washington residents deserve to enjoy the same level of robust privacy safeguards.

– from Legislative Findings in the failed Washington Privacy Act (S. 5376, 66th Leg., 2019 Reg. Sess. (Wa. 2019)).

The Brussels Effect is full steam ahead in the US privacy landscape and we’re sure to see more state privacy laws passed in 2025. And buckle up for the upcoming deluge of AI regulation to accompany the privacy patchwork – brought to you by The Brussels Effect!

Filed Under: US Privacy Law Tagged With: Brussels Effect, privacy, State Privacy Laws, US Privacy Law

Tennessee Information Protection Act Passes House on a 90-0 Vote

April 10, 2023 by Eric Reagan

The Tennessee Information Protection Act (TIPA) passed the House (HB1181) today on a 90-0 vote. The TIPA version that passed (virtually the same as the previously-discussed amended Senate bill) looks to be the most business-friendly state privacy law to date.

SB0073 is scheduled for a vote in the Senate later this week, which is also expected to easily pass. As a result, it’s likely that Tennessee will be the next state with a privacy law within the next week or so.

A quick summary of TIPA:

  • Effective July 1, 2025
  • Applies to businesses that have $25M+ in annual revenue AND process the personal info of at least (1) 175,000 consumers; or (2) 25,000 consumers if they derive 50% of their revenue from the sale of personal data.
  • Consumer does not include a person acting in commercial/employment context 
  • Sale of personal info requires “monetary” consideration
  • Personal information is “information that is linked or reasonably linkable to an identified or identifiable natural person” and excludes publicly available or de-identified consumer data
  • Consumer rights include:
    • Right to know
    • Right to access
    • Right to correct
    • Right to delete
    • Right to portability
    • Right to opt-out of sale, profiling, and targeted ads
  • Data controller responsibilities include:
    • Transparency requirement
    • Purpose limitation requirement
    • Secondary use prohibition
    • Data security requirement
    • Nondiscrimination policy
    • Sensitive data additional consent
  • Privacy Notice must include:
    • The categories of personal information processed by the controller
    • The purpose for processing personal information
    • How consumers may exercise their consumer rights
    • The categories of personal information that the controller sells to third parties
    • The categories of third parties, if any, to whom the controller sells personal information
    • The right to opt out of the sale of personal information to third parties and the ability to request deletion or correction of certain personal information.
  • Time to Respond to DSAR = 45 days + 45 additional days “when reasonably necessary”
  • Consumer has right to appeal with 60-day response required
  • No private right of action
  • TN Attorney General has sole enforcement authority
  • Up to $7500 fine per violation
  • Businesses have 60-day cure period
  • Data Processing Agreements must include:
    • Processing instructions
    • Nature and purpose of processing
    • Type of data subject to processing
    • Duration of processing
    • Rights and obligations of both parties
    • Duty of confidentiality with respect to the data
    • Processor duty to delete or return data at controller’s request
    • Assist with compliance obligations of controller for data in processor’s possession
    • Flowdown of processor requirements to its subcontractors
  • Data Protection Assessments required when:
    • Processing for targeted advertising
    • Sale of personal information
    • Profiling, when there is a risk of:
      • unfair or deceptive treatment / unlawful disparate impact
      • financial, physical, or reputational injury
      • invasion of privacy
      • other substantial injury
    • Sensitive data processing
    • Any processing with a heightened risk of harm
  • Affirmative defense for businesses with privacy program that reasonably conforms to NIST Privacy Framework or other documented policies, standards, and procedures designed to safeguard privacy
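The applicability thresholds in the summary above reduce to a simple boolean test. A hypothetical sketch for illustration (the function and parameter names are mine, not the statute’s, and this is a rough reading of the summary, not legal advice):

```python
def tipa_applies(annual_revenue: float,
                 consumers_processed: int,
                 pct_revenue_from_data_sales: float) -> bool:
    """Rough TIPA applicability test per the summary above:
    $25M+ annual revenue AND either (1) 175,000+ consumers' personal
    info processed, or (2) 25,000+ consumers with 50%+ of revenue
    derived from the sale of personal data."""
    if annual_revenue < 25_000_000:
        return False  # revenue floor is a hard gate
    if consumers_processed >= 175_000:
        return True
    return (consumers_processed >= 25_000
            and pct_revenue_from_data_sales >= 0.5)

# A $30M retailer processing 200k consumers' data is covered...
assert tipa_applies(30e6, 200_000, 0.0) is True
# ...as is a $30M data broker with only 30k consumers but 80% of
# revenue from data sales...
assert tipa_applies(30e6, 30_000, 0.8) is True
# ...while a $10M business is out of scope regardless of volume.
assert tipa_applies(10e6, 500_000, 1.0) is False
```

Note how the revenue floor plus higher consumer thresholds make TIPA noticeably narrower than most other state laws, which is much of what earns it the “most business-friendly” label.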

Filed Under: US Privacy Law Tagged With: NIST, privacy, Privacy Framework, State Privacy Laws, Tennessee, Tennessee Information Protection Act, TIPA, US Privacy Law

Tennessee Information Protection Act (TIPA) Amended Bill Headed to Senate Floor

March 26, 2023 by Eric Reagan

Last week, the Tennessee Senate Commerce and Labor Committee recommended the Tennessee Information Protection Act for passage with amendments as it sent the bill (SB0073) to the floor.

There have been substantial amendments to the bill since its introduction in January and entire sections have been rewritten. Many of the changes were directed at cleaning up confusing or ambiguous phrasings and the overall changes have been more business-friendly. The effective date was also pushed from July 1, 2024 to July 1, 2025.

Below, I hit the highlights, but see my prior post covering the entire bill, along with my comments and criticisms of particular sections.


Filed Under: US Privacy Law Tagged With: NIST, privacy, Privacy Framework, State Privacy Laws, Tennessee, Tennessee Information Protection Act, TIPA, US Privacy Law


Copyright © 2025 · DatavPrivacy.com