New US House privacy bills raise hard questions about enterprise data collection

US House Republicans have introduced two major privacy proposals that would reshape how US companies collect, process, and retain consumer data: the SECURE Data Act for general consumer privacy and the GUARD Financial Data Act for financial institutions.

The bills would create national standards for privacy and security practices while broadly preempting many state privacy laws, including the stronger protections already in place in states like California and Maryland. They also would eliminate the possibility of private lawsuits under the federal framework, leaving enforcement primarily to the Federal Trade Commission and state attorneys general.

That combination of federal preemption, weaker enforcement, and broad compliance changes has made the bills politically toxic for Democrats and privacy advocates alike. The Electronic Privacy Information Center (EPIC) called the SECURE Data Act “a huge gift to Big Tech” and warned that “a weak federal standard is worse than no standard at all.”

Congress has spent more than a decade failing to produce a comprehensive federal privacy law, often under less polarized conditions than today. “Support from one party in a single house of Congress is not going to do it,” said Alan Butler, executive director and president of EPIC. “It takes a bipartisan process, actually, to pass substantive legislation like this.”

The bills matter because they expose the privacy issues enterprises are already being forced to confront under existing state laws and federal guidance, including data minimization, automated profiling, data broker accountability, and increasingly complex rules around sensitive data.

Why data minimization is becoming a business issue

The SECURE Data Act includes familiar privacy rights: access, correction, deletion, portability, opt-outs for targeted advertising and data sales, and restrictions on certain forms of automated profiling. It also creates a federal data broker registry and formal controller-processor obligations for companies and vendors.

The most consequential operational issue for enterprises, however, is data minimization, the increasingly accepted principle that companies should collect only what they need, retain it only as long as necessary, and be able to justify both decisions.

The National Institute of Standards and Technology (NIST) already treats minimization as a core privacy and security principle. In its “Collection and Data Minimization” guidance, the agency says organizations should collect only the personal information necessary for a stated purpose, because excess retention creates avoidable privacy and security risks.

That principle is increasingly central to state privacy laws as well. California and Maryland both impose stronger restrictions on unnecessary collection and retention than many earlier state frameworks.

For CIOs, CISOs, and CFOs, this is not simply a privacy-notice issue. Dormant customer records, excessive telemetry, forgotten SaaS archives, oversized AI training datasets, and legacy marketing databases all increase breach exposure. The more unnecessary data a company stores, the larger its attack surface becomes.
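
One way to make that expectation concrete is a retention audit that ties every stored record to a stated collection purpose and a maximum retention window, then flags anything that can no longer be justified. The sketch below is a minimal illustration of that approach; the purposes, windows, and field names are hypothetical and are not drawn from either bill.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical retention windows keyed by the stated purpose the data was
# collected for. The purposes and durations are illustrative only.
RETENTION_WINDOWS = {
    "order_fulfillment": timedelta(days=365),
    "marketing": timedelta(days=180),
    "support_history": timedelta(days=730),
}

@dataclass
class StoredRecord:
    record_id: str
    purpose: str         # stated purpose the data was collected for
    last_used: datetime  # last time the record actually served that purpose

def records_past_retention(records: list[StoredRecord]) -> list[StoredRecord]:
    """Flag records that can no longer be justified under their purpose's window."""
    now = datetime.now(timezone.utc)
    flagged = []
    for rec in records:
        window = RETENTION_WINDOWS.get(rec.purpose)
        if window is None or now - rec.last_used > window:
            # Unknown purpose or expired window: candidate for deletion review.
            flagged.append(rec)
    return flagged
```

The specific windows matter less than the underlying discipline: every record should map to a purpose the organization can still defend.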

Butler argues the SECURE bill’s own minimization language is weaker than what leading states already require.

“The answer is that the law doesn’t really do anything,” he said, because the provision largely ties collection limits to what companies disclose in their privacy policy rather than imposing a stronger necessity standard.

That creates an unusual enterprise dynamic: the bill could weaken privacy protections overall while still reinforcing the long-term expectation that companies must be able to justify why they keep the data they have.

Where privacy law overlaps with AI governance

The SECURE Data Act does not contain broad, standalone AI governance rules, but it still touches AI in meaningful ways.

The bill includes opt-outs for fully automated profiling used for decisions with legal or similarly significant effects. That language can clearly implicate some uses of AI, particularly in hiring, lending, insurance, and other high-impact decisions.

Butler said that the profiling provision is worth watching, because several state laws already include similar requirements, and the concept is expanding. That means privacy law may become the first practical form of AI regulation for many enterprises.

Training datasets, customer prompts, telemetry collection, and retention periods all become harder to defend when regulators ask whether the data is truly necessary. Legal teams, privacy officers, and CISOs may find themselves shaping AI strategy well before Congress passes a standalone AI law.
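
In engineering terms, a profiling opt-out typically surfaces as a gate in the decision pipeline: before a fully automated output is applied to a significant decision, the system checks the user's stored opt-out signal and routes opted-out users to human review. Below is a minimal, hypothetical sketch of that gate; the flag name and the scoring example are assumptions, not language from the bill.

```python
from dataclasses import dataclass

@dataclass
class UserPrivacyPrefs:
    user_id: str
    opted_out_of_profiling: bool  # stored opt-out signal, however it was collected

def automated_score(prefs: UserPrivacyPrefs, features: dict) -> float | None:
    """Return a model score, or None when the decision must go to human review.

    If the user has opted out of automated profiling, no fully automated
    decision is made and the caller is expected to queue a manual review.
    """
    if prefs.opted_out_of_profiling:
        return None
    # Placeholder scoring logic; a real system would call its model here.
    return 0.5

# Usage: route the application based on whether an automated score was produced.
prefs = UserPrivacyPrefs(user_id="u-123", opted_out_of_profiling=True)
score = automated_score(prefs, {"income": 52000})
if score is None:
    print("route to human review")
else:
    print(f"automated decision with score {score}")
```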

The teen-data provision that could break everything

One of the least-discussed but most disruptive provisions in the SECURE Data Act involves teens.

It states that a controller (any entity that processes personal data) may not process the sensitive data of a teen without obtaining verifiable parental consent. The problem is that the bill defines sensitive data to include any personal data collected from a teen, meaning almost any interaction involving a known user between 13 and 15 years old could trigger the requirement.

“If you operate a website, an app, a service, and there are users you know who are between 13 and 15, it’s going to break everything,” Butler said. “You’re going to have to get verifiable parental consent every time you touch the data—collect it, transfer it, store it, process it, anything.”

To comply, companies would need not only age awareness, which many already have through account creation or app stores, but also a system for verifying parent-child relationships. That would likely require collecting additional sensitive identity documents and personal records, the exact kind of information most organizations should try to avoid storing.
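
Taken literally, the provision turns nearly every data operation into a consent check for known 13-to-15-year-old users. A hypothetical sketch of what that gate might look like follows; the field names and consent states are illustrative, not taken from the bill.

```python
from dataclasses import dataclass
from enum import Enum

class ConsentStatus(Enum):
    NOT_REQUIRED = "not_required"
    PENDING = "pending"
    VERIFIED = "verified"

@dataclass
class UserProfile:
    user_id: str
    known_age: int | None            # age if the service knows it, else None
    parental_consent: ConsentStatus  # verification state for known teens

def may_process(user: UserProfile) -> bool:
    """Return True only if a data-processing step is permitted for this user.

    Under a literal reading of the provision, any known 13-to-15-year-old
    needs verified parental consent before any collection, transfer,
    storage, or other processing of their data.
    """
    if user.known_age is None or not 13 <= user.known_age <= 15:
        return True
    return user.parental_consent is ConsentStatus.VERIFIED
```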

“It doesn’t work. It doesn’t make sense,” Butler said. “If I were a CIO or CISO, I would be very concerned, because it is completely unworkable.”

Why vendors and data brokers matter more

The SECURE bill requires formal controller-processor contracts covering confidentiality, deletion, retention limits, subcontractor obligations, and other safeguards. That pushes privacy compliance directly into procurement and third-party risk management.

For companies with inherited vendors from acquisitions and unclear data ownership, privacy compliance becomes an exercise in figuring out who has what data, where it sits, and whether anyone can actually force its deletion.
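
That exercise typically starts with a data inventory that maps each vendor to the categories of personal data it holds, where that data lives, and whether deletion can actually be enforced. A minimal, hypothetical sketch of the fields such an inventory might track:

```python
from dataclasses import dataclass, field

@dataclass
class VendorDataHolding:
    vendor: str                 # processor or subprocessor name
    data_categories: list[str]  # e.g. contact details, payment data
    location: str               # system or region where the data sits
    deletion_supported: bool    # is there an enforceable contractual deletion path?

@dataclass
class DataInventory:
    holdings: list[VendorDataHolding] = field(default_factory=list)

    def undeletable(self) -> list[VendorDataHolding]:
        """Vendors holding personal data with no enforceable way to delete it."""
        return [h for h in self.holdings if not h.deletion_supported]
```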

Butler points to the federal data broker registry as one of the few provisions in the bill that clearly reflects where privacy law is already moving. More states are adopting registry requirements, and businesses increasingly have to evaluate what data they buy, where it came from, and whether they qualify as brokers themselves.

Why financial firms should watch GUARD more closely

While the SECURE Data Act affects far more enterprises, the companion GUARD Financial Data Act may matter more for banks, insurers, and fintechs.

Rather than creating an entirely new framework, GUARD would significantly modernize Title V of the Gramm-Leach-Bliley Act (GLBA). It would preserve consumer opt-out rights while expanding access rights, adding former-customer deletion rights, requiring affirmative opt-in consent before disclosure of sensitive personal information, and imposing stronger obligations around financial data aggregators and access credentials.

It also includes provisions around data processing involving “covered nations,” tying financial privacy more directly to national security and supply chain concerns.

For institutions that still treat GLBA privacy notices as an annual compliance exercise, GUARD would turn privacy into a daily operational issue, touching open banking, credential handling, vendor relationships, and retention practices for former customers.

A federal law may not be best for business

Business groups have long wanted a single national standard to replace the state-by-state patchwork of privacy laws. One federal framework is easier to govern than 20 competing ones, and many companies would welcome the predictability.

Brendan Thomas, executive director of the Internet for Growth coalition, praises the SECURE Data Act for offering a federal framework to replace the patchwork of state laws, which the coalition says is driving up prices for small businesses. “The introduction of the SECURE Data Act (H.R. 8413) in the House is an important step in the ongoing effort to establish a national privacy framework,” Thomas said in a statement.

But EPIC’s Butler argues that wiping away stronger state laws may not actually be good for business.

He says that companies have already invested heavily in compliance programs around those frameworks, and replacing them with weaker, vaguer federal rules could create new uncertainty rather than reduce it.

“It breeds distrust among your customers,” Butler said. “It’s not good for business for people to mistrust what’s happening with these apps. All of a sudden, [consumers] don’t feel like their privacy is protected anymore.”
