AI Companions, Kids, and Where I Think the Bill Falls Short

When people hear “AI regulation,” they usually picture one of two extremes: ban the technology outright, or let companies do whatever they want.

The GUARD Act sits in a more interesting middle.

It’s a bipartisan bill aimed at one specific thing: AI companions – the bots that pretend to be your friend, lover, parent, or therapist. The same category of systems that played a role in what happened to my nephew LJ, and to other kids whose stories are now in lawsuits and Senate hearings.

This explainer is my attempt to strip away the spin and walk through three questions:

  1. What does the GUARD Act actually do?
  2. What does it not do?
  3. Where do I think it still needs work?

1. Why this bill exists at all

Before the GUARD Act, companies were already shipping AI “companions” trained on the raw sewage of the internet, then marketing them as cures for loneliness.

Kids found them fast.

LJ did. Other teenagers did. They poured their secrets into these bots.

The apps had disclaimers, of course. A little banner at the top: