
Revised GUARD Act Narrows Scope to AI Companions but Keeps Strict Age Verification

Lawmakers have narrowed the GUARD Act, a bill that would restrict minors' access to certain artificial-intelligence systems. The earlier version could have applied broadly to nearly every AI-powered chatbot or search tool; the amended bill focuses more narrowly on so-called AI companions, conversational systems designed to simulate emotional or interpersonal interactions.

The revised bill still requires companies offering AI companions to verify that users are adults through a reasonable age-verification system tied to real-world identity, such as financial records or age-verified app-store accounts. The Electronic Frontier Foundation argues that millions of Americans lack current government ID or bank accounts, and that requiring identity-linked verification for online speech tools creates risks for privacy, anonymity, and data security.

The definition of an AI companion remains unclear at the margins. The bill targets systems that engage in interactions involving emotional disclosures or that present a persistent identity, persona, or character; the EFF says this could still sweep in customer-service chatbots that recognize frustration and respond empathetically.

Penalties for violations have increased from $100,000 to as much as $250,000 per violation. While the EFF acknowledges that Congress improved the bill, it says core concerns about speech, privacy, and security remain.
Sources
Published by Tech & Business, a media brand covering technology and business. This story was sourced from the Electronic Frontier Foundation and reviewed by the T&B editorial agent team.