May 6, 2026
Florida is now enforcing its sweeping social media law, the Online Protections for Minors Act (House Bill 3), after the 11th Circuit Court of Appeals stayed a lower-court injunction in November.
The law bars children under 14 from holding accounts on some social media platforms and requires parental consent for 14- and 15-year-olds. In response, some sites blocked Florida users entirely rather than comply with a related age-verification requirement.
As Congress debates two federal children's online safety bills, Floridians are getting a preview of what happens when lawmakers expand the legal standard for when a platform is responsible for knowing a user is a child.
The federal children's privacy law in place since 1998, the Children's Online Privacy Protection Act (COPPA), uses an "actual knowledge" standard, meaning general-audience platforms are responsible only when they have been directly informed that a user is under 13. In its 2011 COPPA rule review, the FTC explained that Congress deliberately rejected a broader standard that would have made platforms liable for what they should have known from circumstantial evidence. The FTC again declined to adopt constructive knowledge in its most recent rulemaking and noted that only Congress could make that change.
In February 2026, the FTC announced it will not bring certain COPPA enforcement actions against general-audience and mixed-audience operators that collect personal information solely to determine a user's age, so long as they use it only for that purpose, delete it promptly, and do not repurpose it. The practical effect is to make age data collection less legally risky while putting more weight on whether a company's methods are accurate enough to survive scrutiny if enforcement comes later.
The Senate versions of COPPA 2.0 and the Kids Online Safety Act (KOSA) would replace that standard. Both bills say an operator should know a user is a minor when age is "fairly implied" based on what a "reasonable and prudent person" would understand. That is an open-ended standard, applied in hindsight, with no clear threshold for what counts. Each bill also states that the law should not be read as a requirement to collect new data or build age-verification systems.
Taken together, these two provisions leave companies without a clear answer on how to comply. If age-gating is not required, but self-attestation (simply asking users to enter their age or confirm that they are above a certain threshold) is not enough, the obvious question is how much age determination a platform must implement to avoid liability. KOSA points platforms toward signals they already have without explaining what would be sufficient, and it tells the Federal Trade Commission to issue guidance that the bill itself describes as nonbinding.
Florida is already further down this road. HB 3 imposes civil penalties of up to $50,000 per violation for "knowing or reckless" failures, and it allows minors to recover up to $10,000 per claim through a private right of action. The implementing regulation defines "willful disregard" of a user's age as the moment a platform "should reasonably have been aroused to question whether the person was a child and thereafter failed to perform reasonable age verification." That is constructive knowledge in everything but name. The Florida Digital Bill of Rights uses the same framing.
Major platforms have already begun testing age-inference and age-estimation tools. Meta uses artificial intelligence to place suspected teens into "teen account" settings even when those users claim to be adults. YouTube uses an age-estimation model and escalates uncertain cases to government ID, credit card, or selfie verification. Not every platform will make the same choices, but age inference and escalation to stronger verification have become part of the ordinary compliance landscape.
When penalties are high and the standard is vague, companies focus on protecting themselves rather than on preserving privacy. Pornhub did so by leaving Florida. Other platforms will respond by collecting more data on every user, not just suspected minors, because that is the easiest path to defend if regulators push later.
Florida lawmakers are not done expanding this approach. In the 2026 session, Sen. Tom Leek's Artificial Intelligence Bill of Rights (SB 482) and a parallel House bill (HB 659) would have extended age verification and parental consent rules to AI companion chatbots. Both died in the House. Gov. Ron DeSantis has signaled he wants them revived in a special session.
Before Florida or Congress goes further, the FTC should evaluate the age-determination tools American companies are already deploying: how accurate they are, what data they collect, how they handle errors, and whether they create new privacy risks. Lawmakers need that record before they decide whether broadening the actual knowledge standard is necessary. As it stands, abandoning "actual knowledge" in favor of vague notions of what age is "fairly implied" only pushes companies toward invading every user's privacy to reduce their own liability.
Nicole Shekhovtsova is a policy analyst, and Adrian Moore is vice president at Reason Foundation.