2025 was the year age verification went from a fringe policy experiment to a sweeping reality across the United States. Half of the U.S. now mandates age verification for accessing adult content or social media platforms. Nine states saw their laws take effect this year alone, with more coming in 2026.
The good news is that courts have blocked many of the laws seeking to impose age-verification gates on social media, largely for the same reasons that EFF opposes these efforts. Age-verification measures censor the internet and burden access to online speech. Though age-verification mandates are often touted as "online safety" measures for young people, the laws actually do more harm than good. They undermine the fundamental speech rights of adults and young people alike, create new barriers to internet access, and put at risk all internet users' privacy, anonymity, and security.
If you're feeling overwhelmed by this onslaught of laws and the invasive technologies behind them, you're not alone. That's why we've launched EFF's Age Verification Resource Hub at EFF.org/Age—a one-stop shop to understand what these laws actually do, what's at stake, why EFF opposes all forms of age verification, how to protect yourself, and how to join the fight for a free, open, private, and safe internet. Moreover, there is hope. Although the Supreme Court ruled that age-verification gates for adult content do not violate the First Amendment on their face, the legal fight over whether these laws are constitutional continues.
As we built the hub throughout 2025, we also fought state mandates in legislatures, courts, and regulatory hearings. Here's a summary of what happened this year.
The Laws That Took Effect (And Immediately Backfired)
Nine states’ age verification laws for accessing adult content went into effect in 2025:
- South Carolina (January 1)
- Florida (January 1)
- Tennessee (January 13)
- Georgia (July 1)
- Wyoming (July 1)
- North Dakota (August 1)
- Arizona (September 26)
- Ohio (September 30)
- Missouri (November 30)
Predictably, users didn’t stop accessing adult content after the laws went into effect; they just changed how they got to it. As we’ve said elsewhere: the internet always routes around censorship.
In fact, research from the New York Center for Social Media and Politics and the public policy nonprofit the Phoenix Center confirms what we’ve warned about from the beginning: age verification laws don’t work. Their research found:
- Searches for platforms that have blocked access to residents in states with these laws dropped significantly, while searches for offshore sites surged.
- Researchers saw a predictable surge in VPN usage following the enactment of age verification laws; Florida, for example, saw a 1,150% increase in VPN demand after its law took effect.
As foretold, when platforms block access or require invasive verification, it drives people to sites that operate outside the law—platforms that often pose greater safety risks. Instead of protecting young people, these laws push them toward less secure, less regulated spaces.
Legislation Watch: Expanding Beyond “Adult Content”
Lawmakers Take Aim at Social Media Platforms
Earlier this year, we raised the alarm that state legislatures wouldn’t stop at adult content. Sure enough, throughout 2025, lawmakers set their sights on young people’s social media usage, passing laws that require platforms to verify users’ ages and obtain parental consent for accounts belonging to anyone under 18. Four states had already passed similar laws in previous years. These laws were swiftly blocked in court because they violate the First Amendment and subject every user to surveillance as a condition of participation in online speech.
Warning Labels and Time Limits
And it doesn’t stop with age verification. California and Minnesota passed new laws this year requiring social media platforms to display warning labels to users. Virginia’s SB 854, which also passed this year, took a different approach. It requires social media platforms to use “commercially reasonable efforts” to determine a user's age and, if that user is under 16, limits them to one hour per day per application by default unless a parent changes the time allowance.
EFF opposes these laws because they raise serious First Amendment concerns. And courts have agreed: in November 2025, the U.S. District Court for the District of Colorado temporarily halted Colorado's warning label law, which would have required platforms to display warnings to users under 18 about the negative impacts of social media. We expect courts to similarly halt California and Minnesota’s laws.
App Store and Device-Level Age Verification
2025 also saw the rise of device-level and app-store age verification laws, which shift the obligation to verify users onto app stores and operating system providers. These laws seriously impede users (adults and young people alike) from accessing information, because they gate not only adult or sexual content but every bit of content provided by every application. In October, California Governor Gavin Newsom signed the Digital Age Assurance Act (AB 1043), which takes a slightly different approach to age verification in that it requires “operating system providers”—not just app stores—to offer an interface at device or account setup that prompts the account holder to indicate the user’s birth date or age. Developers must request an age signal when applications are downloaded and launched. These laws go beyond the earlier legislation in other states that mandates individual websites implement verification, and instead place the responsibility on app stores, operating systems, or device makers at a more fundamental level.
Again, these laws have drawn legal challenges. In October, the Computer & Communications Industry Association (CCIA) filed a lawsuit arguing that Texas’s SB 2420 is unconstitutional. A separate suit, Students Engaged in Advancing Texas (SEAT) v. Paxton, challenges the same law on First Amendment grounds, arguing it violates the free speech rights of young people and adults alike. Both lawsuits argue that the burdens placed on platforms, developers, and users outweigh any proposed benefits.
From Legislation to Regulation: Rulemaking Processes Begin
States with existing laws have also begun the process of rulemaking—translating broad statutory language into specific regulatory requirements. These rulemaking processes matter, because the specific technical requirements, data-handling procedures, and enforcement mechanisms will determine just how invasive these laws become in practice.
California’s Attorney General held a hearing in November to solicit public comment on methods and standards for age assurance under SB 976, the “Protecting Our Kids from Social Media Addiction Act,” which will require age verification by the end of 2026. EFF has supported the legal challenge to SB 976 since its passage, and federal courts have blocked portions of the law from taking effect. Now, in the rulemaking process, EFF has submitted comments raising concerns about the discriminatory impacts of any proposed regulations.
New York's Attorney General also released proposed rules for the state’s Stop Addictive Feeds Exploitation (SAFE) for Kids Act, describing which companies must comply and the standards for determining users’ age and obtaining parental consent. EFF submitted comments opposing the age verification requirements in September of 2024, and again in December 2025.
Our comments in both states warn that these rules risk entrenching invasive age verification systems and normalizing surveillance as a prerequisite for online participation.
The Boundaries Keep Shifting
As we’ve said, age verification will not stop at adult content and social media. Lawmakers are already proposing bills to require ID checks for everything from skincare products in California to diet supplements in Washington. Lawmakers in Wisconsin and Michigan have set their sights on virtual private networks, or VPNs, proposing legislation that would ban the use of VPNs to bypass age verification laws. AI chatbots are next on the list, with several states considering legislation that would require age verification for all users. Behind the reasonable-sounding talking points lies a sprawling surveillance regime that would reshape how people of all ages use the internet. EFF remains ready to push back against these efforts in legislatures, regulatory hearings, and courtrooms.
2025 showed us that age verification mandates are spreading rapidly, despite clear evidence that they don't work and actively harm the people they claim to protect. 2026 will be the year we push back harder—like the future of a free, open, private, and safe internet depends on it.
This is why we must fight back to protect the internet that we know and love. If you want to learn more about these bills, visit EFF.org/Age.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.







